Should I block bots from my site and why?

@Kevin317

Posted in: #Seo #UserAgent #WebCrawlers

My logs are full of bot visitors, often from Eastern Europe and China.
The bots are identified as Ahrefs, Seznam, LSSRocketCrawler, Yandex, Sogou and so on.
Should I block these bots from my site and why?

Which ones have a legitimate purpose that could increase traffic to my site?
Many of them are SEO crawlers.

I have to say that, if anything, I've seen less traffic since the bots arrived in large numbers.

It would not be too hard to block these, since they all identify themselves as bots in their User-Agent string.
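
For example, a few robots.txt rules along these lines should cover the well-behaved ones (the user-agent tokens below are the crawlers' commonly documented names, so double-check them against each vendor's docs):

    User-agent: AhrefsBot
    Disallow: /

    User-agent: SeznamBot
    Disallow: /

    User-agent: Yandex
    Disallow: /

    User-agent: Sogou web spider
    Disallow: /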





2 Comments


@Turnbaugh106

While blocking bots can help free up resources and clean up your logs, it's important to note that robots.txt, and even a noindex meta tag (<meta name="robots" content="noindex">) on your pages, does not actually stop bots from visiting your site. They can still crawl your site occasionally to check whether the robots.txt denial has been removed. Plenty of bots don't even send an identifying user agent and instead use a standard browser user agent. The bots I'm referring to are typically SEO harvesting bots that scan for backlinks, not the general ones you get from search engines.

Rather than blocking the bots, you should just factor them in when counting up your visitors. After a while of actively monitoring your site, you'll establish a rough figure for how many of your hits are bots. Most people care about unique visits, and that largely rules out the bots, since they are constantly returning. In this day and age there are plenty of servers and shared hosting plans that can handle these bots, so other than for pages that you don't want indexed, I see no reason to block these types of bots. Of course there are harmful bots as well, but those certainly won't announce themselves in the user agent ;).
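
As a rough sketch of what separating bots from real visitors might look like, here is a small Python script that counts unique non-bot IPs; the access.log path, the combined log format, and the bot-marker list are all just illustrative assumptions:

    import re

    # Purely illustrative substrings; real bot user agents vary.
    BOT_MARKERS = ("ahrefsbot", "seznambot", "yandex", "sogou", "bot", "crawler", "spider")

    unique_ips = set()
    with open("access.log") as log:        # path is an assumption
        for line in log:
            ip = line.split(" ", 1)[0]     # client IP is the first field
            # The user agent is the last quoted field in the combined log format
            quoted = re.findall(r'"([^"]*)"', line)
            agent = quoted[-1].lower() if quoted else ""
            if any(marker in agent for marker in BOT_MARKERS):
                continue                   # treat this hit as a bot, not a visitor
            unique_ips.add(ip)

    print("rough count of unique human visitors:", len(unique_ips))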

Personally, I believe blocking robots is a waste of time, as they don't use that many resources at all. SEO robots can even help, since they list your site on PR0 pages, which of course increases your PageRank, and they're automated, so you won't get punished because of them.

Logs Issue

You should use a proper log viewer that lets you filter out certain requests; this makes reviewing your logs much easier. Good viewers can filter out lots of things, such as normal visits, 404s and so forth.
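
If your viewer can't do that, a few lines of script will. Here is a minimal Python sketch that hides 404s and self-identified bots, again assuming the combined log format and an access.log file:

    import re

    # Combined log format: IP ident user [time] "request" status size "referer" "agent"
    LINE = re.compile(r'(\S+) \S+ \S+ \[[^\]]+\] "([^"]*)" (\d{3}) \S+ "[^"]*" "([^"]*)"')

    with open("access.log") as log:        # path is an assumption
        for line in log:
            m = LINE.match(line)
            if not m:
                continue                   # skip lines that don't parse
            ip, _request, status, agent = m.groups()
            if status == "404":
                continue                   # hide 404s
            if "bot" in agent.lower() or "crawler" in agent.lower():
                continue                   # hide self-identified bots
            print(line, end="")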



 

@YK1175434

You can block bots, but it depends on what you want for your website.

You can block search engine bots if you don't want your website indexed in a particular search engine.
Example: Yandex is a Russian search engine; you can block its bot if your business is not targeting Russia.

You can block SEO bots if you don't use their analytics solution.
Example: Ahrefs is an SEO/backlink analytics solution; you can block its bot if you don't use that solution.
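
Both examples look much the same in robots.txt (the user-agent tokens below follow the vendors' commonly documented names; verify them before relying on this):

    User-agent: Yandex
    Disallow: /

    User-agent: AhrefsBot
    Disallow: /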

Reasons to block bots:


fewer robots hit your website, so more bandwidth is left for real visitors
safety against malware bots
smaller log files


Reasons not to block bots:


bots like search engine bots can increase your traffic by indexing your website.


You can learn more about bots by reading the FAQ at robotstxt.org.

If you want to pick out several robots to block, you can take inspiration from this website's robots.txt.

Be careful: some bots can ignore robots.txt (more information here); those have to be blocked at the web server or firewall level instead.

Conclusion: you can search the internet for each robot's function to determine whether blocking it would be useful for you.


