Crawlers start with a URL or group of URLs, download each page, and examine its content, which they may or may not decide to add to their index. They then visit every URL linked from those pages and repeat the process. By recursively following the links found on each new page, the number of pages to visit can grow exponentially.
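The recursive visiting described above can be sketched as a simple breadth-first traversal. This is a minimal, self-contained illustration, not a real crawler: the link graph below is a hypothetical stand-in for actual pages, whereas a real crawler would fetch each URL over the network and parse links out of its HTML.

```python
from collections import deque

# Toy link graph standing in for real pages (hypothetical URLs);
# a real crawler would download each URL and extract links from its HTML.
LINKS = {
    "https://example.com/":  ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b", "https://example.com/c"],
    "https://example.com/b": ["https://example.com/c"],
    "https://example.com/c": [],
}

def crawl(seed_urls):
    """Visit the seed URLs, then every URL they link to, and so on."""
    frontier = deque(seed_urls)   # URLs waiting to be visited
    visited = set()               # URLs already seen, so we never revisit
    order = []                    # visit order, in place of a real index
    while frontier:
        url = frontier.popleft()
        if url in visited:
            continue              # skip pages already processed
        visited.add(url)
        order.append(url)
        # Queue every outgoing link for a later visit
        frontier.extend(LINKS.get(url, []))
    return order

print(crawl(["https://example.com/"]))
```

Tracking the `visited` set is what keeps the traversal finite: without it, any cycle of links (page A linking to B and B back to A) would make the crawler loop forever.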
The word "crawl" comes from the metaphor of spiders (computer programs) crawling (browsing) the World Wide Web.
Other types of spiders may crawl your site for different reasons, such as validating site code or structure, scraping content (downloading your data or content to analyze, mine, or repurpose on another site), or even hacking and other nefarious purposes.
Dragon Metrics uses a web crawler (named Dragonbot) to gather data about sites for the Site Audit and Site Explorer features.