Calculating requests per second within Bing

@Annie201

Posted in: #Bing #Google #Request #WebCrawlers

In Google Search Console, I can set the rate at which Google crawls my site, and it shows me details such as requests per second and seconds between requests. In Bing Webmaster Tools, I can only choose how high the crawl-rate graph goes; it never tells me what that corresponds to in actual requests.

Does anyone know how to translate the crawl-rate chart into requests per second?

I ask because I want to limit connections to my site from external sources to a low but reasonable number, so everyone can enjoy the site instead of bots hogging it (for example via a Slowloris DoS attack). At the same time, I don't want search crawlers failing to index the site because they open too many connections.

Any idea how I can tell Bing that I want no more than x connections from them to my server at any one time?
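As a server-side complement to whatever the crawlers honor (not from the original post): a minimal sketch, assuming nginx is the web server, that caps concurrent connections per client IP and times out slow clients, which is the usual first defense against Slowloris-style connection hogging:

```nginx
# Hypothetical nginx fragment: cap concurrent connections per client IP.
# Zone names and limits here are illustrative, not recommendations.
limit_conn_zone $binary_remote_addr zone=per_ip:10m;

server {
    # Allow at most 10 simultaneous connections from any one IP.
    limit_conn per_ip 10;

    # Slowloris works by sending headers/body very slowly;
    # short timeouts free those connections quickly.
    client_header_timeout 10s;
    client_body_timeout 10s;
}
```

Note that a per-IP cap also applies to search-engine crawlers, which is why pairing it with a crawl-rate setting (so the crawler never tries to exceed your limit) matters.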


1 Comment


 

@Pope3001725

Aside from adjusting the "Crawl Control" settings in Bing Webmaster Tools, you can add a Crawl-delay directive to your robots.txt file. Bing's documentation describes the values as follows:


Bing supports whole number values ranging from 1 to 20. Each number
maps to the length in seconds of the time slices into which we divide
a 24-hour crawl cycle. In this context, the value 1 means you allow us
a maximum of one request for each 1-second time slice, which is slow
but still adequate for smaller sites. 20 is extremely slow and means
we are allowed only one request per 20-second time slice across the
24-hour crawl cycle.


Example robots.txt entry to limit the maximum to only one request per second:

User-agent: bingbot
Crawl-delay: 1
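To connect this back to the question about requests per second: a small Python sketch (mine, not from the original answer) converting a Crawl-delay value into the maximum request budget Bing describes for a 24-hour crawl cycle:

```python
# Convert a Bing Crawl-delay value (whole numbers 1-20, per Bing's docs)
# into the maximum number of requests allowed in a 24-hour crawl cycle:
# one request per time slice of `crawl_delay` seconds.
def max_requests_per_day(crawl_delay: int) -> int:
    if not 1 <= crawl_delay <= 20:
        raise ValueError("Bing accepts whole numbers from 1 to 20")
    seconds_per_day = 24 * 60 * 60  # 86400
    return seconds_per_day // crawl_delay

print(max_requests_per_day(1))   # 86400 requests/day (~1 request/second)
print(max_requests_per_day(20))  # 4320 requests/day
```

So a Crawl-delay of 1 is the ceiling the asker wants: Bing should never exceed one request per second.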


