How often Google's spiders crawl a site depends on many different factors. For example, a large site with regularly updated content tends to receive spider visits more often than a small static site. Some estimates put the general deep crawl at about once a month, but blogs and other dynamic sites can be crawled many times a day, sometimes within minutes of new content going live.
The Google spider, also known as Googlebot, follows a series of algorithms that tell it how often to crawl a site. Google keeps these algorithms secret, although the company does suggest what kind of site might appeal to Googlebot. According to Brad Hill at Dummies.com, Google spiders crawl the web on two levels: deep and fresh. A deep crawl may happen only once every 30 days or so. As a result, new static pages, especially those with few incoming hyperlinks, may not show up in the Google index for weeks. However, spiders also skim the web in a "fresh crawl," noticing new pages via links and pings.
You can't tell Googlebot when to come back and crawl your site. However, you can do things to encourage Google's spiders to return more often. For example, adding new content at least three times a week helps increase Google's crawl rate, according to Search Engine Journal. This has the added advantage of potentially helping your site rank well in Google's search results for more search terms. Likewise, eliminating duplicate content and keeping your site on a reliable web server can help encourage frequent crawling.
The robots.txt is a simple text file you can place on your server that tells Google which pages can and cannot be crawled. While this can't directly increase the frequency of spider visits, it can make crawls more efficient. For instance, you can tell Googlebot to ignore a certain page, such as one with very similar content to another. This can help avoid crawling problems that could affect how regularly your site is visited.
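As a minimal sketch, a robots.txt file like the one below would tell Googlebot to skip a near-duplicate section of the site while leaving everything else open. The `/printable/` path is purely a hypothetical example of duplicate content, such as printer-friendly copies of existing pages.

```
# robots.txt -- placed at the root of the site (e.g. https://example.com/robots.txt)

# Hypothetical example: keep Googlebot out of printer-friendly
# duplicates of pages it already crawls elsewhere.
User-agent: Googlebot
Disallow: /printable/

# All other crawlers may fetch everything.
User-agent: *
Disallow:
```

An empty `Disallow:` line means nothing is blocked for that user agent, so the second rule leaves the rest of the site fully crawlable.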
Despite a common misunderstanding, the crawl rate setting controls the speed of a Googlebot crawl, not its frequency. If Google's spiders crawl your site too fast, they can drain some of your bandwidth. Google uses a computer algorithm to decide how many pages to fetch, and at what speed, each time it hits your site. You can change the settings in Google Webmaster Tools to increase or decrease Googlebot's crawl speed.