If I block a page in robots.txt, will I still be able to see the page in my browser?
What is the purpose of robots.txt? Is it to prevent bots from crawling pages? And if so, does that mean I can still access the page through my browser?
Yes, you can still access the page through your browser. robots.txt is a request to search engines, telling them which pages or directories you would like them not to crawl. Most major search engines, such as Google, honor these requests, but nothing technically forces them to comply, and there is no guarantee that they or any other crawler will.
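For reference, here is a minimal robots.txt sketch (the /private/ path is just an example):

```
# Ask all crawlers not to crawl the /private/ directory
User-agent: *
Disallow: /private/
```

Browsers never read this file. A well-behaved crawler fetches /robots.txt from the site root before crawling and skips the listed paths, but the pages themselves remain reachable at their URLs by anyone.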