
How do I control how often search engines visit my site?

@Jessie594

Posted in: #Google #SearchEngines #Seo

I've been using the following line in the <head> of my sites for years:

<meta name="revisit-after" content="3 days" />

I recently discovered that it's not one of the meta tags that Google understands, which I take to mean that there's no point in including it, and that it's been doing no good at all for years.

How often do search engines crawl a website by default, and what reliable ways are there to increase or decrease that frequency?


2 Comments


 

@Pope3001725

You can control how fast Google crawls your site in Google Webmaster Tools. From Google's help documentation:


Crawl rate for your site determines the time used by Googlebot to crawl your site on each visit. Our goal is to thoroughly crawl your site (so your pages can be indexed and returned in search results!) without creating a noticeable impact on your server's bandwidth. While most webmasters are fine using the default crawl setting (i.e. no changes needed, more on that below), some webmasters may have more specific needs.

Googlebot employs sophisticated algorithms that determine how much to crawl each site it visits. For the vast majority of sites, it's probably best to choose the "Let Google determine my crawl rate" option, which is the default. However, if you're an advanced user or if you're facing bandwidth issues with your server, you can customize your crawl rate to the speed most optimal for your web server(s). The custom crawl rate option allows you to give Googlebot insight into the maximum number of requests per second and the number of seconds between requests that are best for your environment.

Googlebot determines the range of crawl rate values available to you in Webmaster Tools, based on our understanding of your server's capabilities. This range may vary from one site to another and across time, depending on several factors. Setting the crawl rate to a lower-than-default value may affect the coverage and freshness of your site in Google's search results, but setting it to a higher value than the default won't improve your coverage or ranking. If you do set a custom crawl rate, the new rate will be in effect for 90 days, after which it resets to Google's recommended value.

You may use this setting only for root-level sites and for sites not hosted on a large domain like blogspot.com (we have special settings assigned for those). To check the crawl rate setting, sign in to Webmaster Tools and visit the Settings tab. If you have additional questions, visit the Webmaster Help Center to learn more about how Google crawls your site, or post your questions in the Webmaster Help Forum.


Other than that, you would probably need to create your own filtering system that sniffs out crawler user agents and allows or denies search engine bots based on their user-agent (a rough sketch of the idea follows below). But that would only help with decreasing their frequency, not increasing it.
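For illustration only, here is a minimal sketch of that kind of filter as a Python WSGI middleware. Everything in it (the bot list, the three-second interval, the throttle_bots name) is a hypothetical choice rather than part of any particular framework; answering with 503 and a Retry-After header is one conventional way to ask a well-behaved crawler to slow down.

import time

# Hypothetical values: substrings to match in the User-Agent header,
# and the minimum number of seconds each matched bot must wait.
BOT_SUBSTRINGS = ("Googlebot", "bingbot", "Slurp")
MIN_INTERVAL = 3.0
_last_seen = {}  # bot substring -> timestamp of its last allowed request

def throttle_bots(app):
    # Wrap a WSGI app so matching crawlers get at most one request
    # per MIN_INTERVAL; all other clients pass straight through.
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        for bot in BOT_SUBSTRINGS:
            if bot in ua:
                now = time.time()
                if now - _last_seen.get(bot, 0.0) < MIN_INTERVAL:
                    # Ask the crawler to back off and retry later.
                    start_response("503 Service Unavailable",
                                   [("Content-Type", "text/plain"),
                                    ("Retry-After", "3")])
                    return [b"Crawl rate exceeded"]
                _last_seen[bot] = now
                break
        return app(environ, start_response)
    return middleware

You would wrap your existing WSGI application as throttle_bots(app) before serving it.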



 

@Goswami781

To answer the second part of your question: you can tell Google how fast to crawl using Webmaster Tools.


To change the crawl rate:

1. On the Webmaster Tools Home page, click the site you want.
2. Under Site configuration, click Settings.
3. In the Crawl rate section, select the option you want.


www.google.com/support/webmasters/bin/answer.py?answer=48620&hl=en_GB
The Bing FAQ refers to a post recommending that you set Crawl-delay: X in your robots.txt, where X is the number of seconds to wait between each request.
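For example, a robots.txt entry along these lines (the bot name and the ten-second delay are placeholders) asks a compliant crawler to pause between requests; note that Googlebot ignores Crawl-delay, so for Google you are back to the Webmaster Tools setting described above.

User-agent: bingbot
Crawl-delay: 10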

Of course, a sitemap can also indicate how often each page changes, via its changefreq element; crawlers treat this as a hint rather than a command. A minimal example follows.
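Here is a minimal sketch of such a sitemap (the example.com URL and the daily value are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>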
