If by bots you mean crawlers, then a simple robots.txt file should correct the problem for bots that actually adhere to it. There is little you can do to prevent those that do not, however, short of blocking them at the network level.
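As a rough sketch of what network-level blocking can look like, assuming an nginx front end (the user-agent pattern and the IP range below are placeholders, not real bot signatures):

    # Refuse requests whose User-Agent matches a misbehaving crawler
    # ("badbot" is a placeholder pattern).
    if ($http_user_agent ~* "badbot") {
        return 403;
    }

    # Or drop an entire address range the bot crawls from
    # (203.0.113.0/24 is a documentation-only placeholder range).
    location / {
        deny 203.0.113.0/24;
        allow all;
    }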
The robots.txt file can whitelist or blacklist either whole crawlers or specific parts of the website. If you have some static material (or static versions of database-driven copy) that can be served instead, you can point crawlers there without sacrificing too much freshness.
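For illustration, a robots.txt served from the site root might look like the following; the crawler name and paths are placeholders you would swap for your own:

    # Steer a well-behaved crawler toward the static copies
    # ("/static/" and "/app/" are placeholder paths).
    User-agent: Googlebot
    Disallow: /app/
    Allow: /static/

    # Blacklist everything for all other crawlers.
    User-agent: *
    Disallow: /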