Do or can robots cause considerable performance issues?

@Michele947

Posted in: #Googlebot #Performance #RobotsTxt #Seo #WebCrawlers

At work we are in a discussion with team members who think bots will cause performance problems on our services website.

Our setup:
Let's say I have the site mysite.co.uk. This is a shop window for our online services, which sit on mysiteonline.co.uk.
When people search Google for mysite, they see mysiteonline.co.uk as well as mysite.co.uk.

Cases against stopping bots from crawling:

We don't store gigabytes of publicly available data on the web.
Most friendly bots, if they were going to cause issues, would have done so already.
In our instance the bots can't crawl the site anyway, because it requires a username and password.
Stopping bots with robots.txt causes an issue with SEO (ref. 1; see the sketch after that note).
If it was a malicious bot, it would ignore robots.txt or meta tags anyway.


Ref. 1:
If we were to block robots from crawling mysiteonline.co.uk, it would affect our SEO rankings and make it harder for users who actively search for mysite to find mysiteonline, which we can show is how a good portion of our users reach us.
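
For reference, the kind of blanket block being argued against is a one-rule robots.txt at the site root; the meta-tag alternative mentioned in the list works per page. A minimal sketch of both (the comments and scope are only illustrative):

    # robots.txt at mysiteonline.co.uk/robots.txt
    # blocks all well-behaved crawlers from the entire site
    User-agent: *
    Disallow: /

    <!-- per-page alternative: let bots crawl, but ask them not to index -->
    <meta name="robots" content="noindex, nofollow">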


1 Comment


@Sarah324

If the majority of your content is behind a login, bots won't have access to it, which will limit the amount of time they spend on your site.
Good bots, like those from major search engines, automatically set their crawl rate to one appropriate for the site. There are some stories of bots hammering sites, but they tend to be older, and you don't hear much about that happening anymore.
For Google at least (and maybe Bing, but I never use their webmaster tools) you can set the crawl rate for your site, so if you find you are getting hammered you can tell them to slow down.
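Some other crawlers honor a Crawl-delay directive in robots.txt instead (Bing and Yandex have supported it; Googlebot ignores it). A minimal sketch, assuming the bingbot user-agent:

    # ask Bing's crawler to wait 10 seconds between requests
    User-agent: bingbot
    Crawl-delay: 10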
Blocking your content using robots.txt is SEO suicide. If they can't crawl it, they can't index it. If they can't index it, they can't show it in the results. And if they can't show your pages in their results, your users can't find you.
Bad bots are a problem every webmaster faces sooner or later, and they will ignore a robots.txt file. They need to be dealt with individually, and this website contains plenty of solutions for doing so.
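
One common way to deal with them individually is to refuse requests by user-agent at the server level. A minimal sketch for Apache, assuming mod_rewrite is enabled; "BadBot" is a placeholder for whatever agent string shows up in your logs:

    # .htaccess: return 403 Forbidden to any User-Agent containing "BadBot"
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
    RewriteRule .* - [F,L]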
