Mobile app version of vmapp.org
Can a website require search engine robots to support cookies?

@Debbie626

Posted in: #Cookie #RobotsTxt #SearchEngines #WebCrawlers

Sometimes I pretend to be a search engine robot to get access to "subscriber only" web pages, or just to get a plain and simple view of a page that isn't cluttered with moving animations and other irrelevant crap.

Lately I have noticed that some web sites require cookies to be turned on, even if you are a bot. What is the purpose of this? Why would a search engine robot need to accept cookies? Or is this just because the guy who designed the web site screwed up?
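One common way a site can enforce this is a cookie probe: set a test cookie, redirect, and only serve the page if the cookie comes back on the next request. A client that ignores `Set-Cookie`, bot or not, never gets past the check. A minimal sketch (the handler, paths, and cookie name here are hypothetical; real implementations vary):

```python
# Toy model of a cookie-support check a site might run.
# On the first request the server plants a test cookie and redirects;
# on the redirected request the cookie either comes back (client
# supports cookies) or is missing (client rejected).

def handle_request(path, cookies):
    """Return (status, headers, body) for a toy request."""
    if path == "/article":
        # First visit: set a probe cookie and bounce through /check.
        return (302, {"Set-Cookie": "probe=1", "Location": "/check"}, "")
    if path == "/check":
        if cookies.get("probe") == "1":
            return (200, {}, "article content")   # cookie came back
        return (403, {}, "cookies required")      # cookie-less client
    return (404, {}, "")

# A cookie-aware client passes the check:
status, _, body = handle_request("/check", {"probe": "1"})
print(status, body)  # 200 article content

# A client that ignored Set-Cookie is rejected:
status, _, body = handle_request("/check", {})
print(status, body)  # 403 cookies required
```

So a crawler that refuses cookies fails the probe exactly the way a cookie-blocking browser does, whether that is the site's intent or just a side effect of its session handling.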


2 Comments


 

@Eichhorn148

There are two other possibilities that spring to mind:

- Websites can tell that you are pretending to be Googlebot. Google publishes a procedure for verifying Googlebot (a reverse DNS lookup on the crawler's IP, confirmed by a forward lookup), and these sites may be using it. They may be requiring cookies only because you are not really Googlebot.
- The sites don't care about having their pages included in the search indexes. When you don't care about being indexed, it doesn't really matter if you treat the search bots badly.
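Google's documented verification is a two-step DNS check: reverse-resolve the claimed crawler's IP, confirm the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch (the resolver functions are passed as parameters only so the logic can be tested without live DNS):

```python
import socket

def is_real_googlebot(ip, reverse=socket.gethostbyaddr,
                      forward=socket.gethostbyname_ex):
    """Google's published two-step check for a claimed Googlebot IP."""
    try:
        host = reverse(ip)[0]            # reverse DNS: IP -> hostname
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False                     # PTR record isn't Google's
    try:
        return ip in forward(host)[2]    # forward DNS must map back
    except OSError:
        return False

# e.g. is_real_googlebot("66.249.66.1") performs live DNS lookups
```

Spoofing the User-Agent string gets you past nothing here, because the check never looks at the User-Agent: it only trusts DNS records that Google itself controls.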



 

@Sarah324

Tracking comes to mind. Cookies would let the site identify a bot as it crawls and separate its requests from human traffic when analyzing browsing habits.
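For example (hypothetical log records and field names), once any request carrying a given visitor-ID cookie is identified as a bot, every request with that cookie can be filtered out of the human-traffic numbers:

```python
# Hypothetical log records: each request carries the visitor-ID cookie
# the site assigned. If one request from an ID looks like a bot (here,
# crudely, by its User-Agent), all requests with that cookie are
# excluded from human-traffic analysis.
log = [
    {"cookie_id": "a1", "user_agent": "Mozilla/5.0 ...", "path": "/"},
    {"cookie_id": "b2", "user_agent": "Googlebot/2.1", "path": "/"},
    {"cookie_id": "b2", "user_agent": "Googlebot/2.1", "path": "/page2"},
]

bot_ids = {r["cookie_id"] for r in log if "bot" in r["user_agent"].lower()}
human_traffic = [r for r in log if r["cookie_id"] not in bot_ids]
print(len(human_traffic))  # 1
```

This only works if the bot actually returns the cookie, which would explain a site insisting on cookie support even from crawlers.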


