In addition to Google's First Click Free, should you whitelist search engine bots past a paywall?
Our site has subscription-only pages; non-subscribed visitors see a snippet preview. Per Google's FCF requirements, your first 5 hits to subscriber-only pages with .google. as the referrer show the full page (roughly the logic sketched below).
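Concretely, the gating works along these lines. This is a heavily simplified sketch assuming a Flask app and a cookie-based click counter; the route, cookie names, and template helpers are illustrative placeholders, not our real code:

```python
from urllib.parse import urlparse

from flask import Flask, make_response, request

app = Flask(__name__)
FREE_CLICKS = 5  # first N Google-referred hits see the full article


def came_from_google(req):
    """True if the Referer header points at a .google. host."""
    host = urlparse(req.headers.get("Referer", "")).netloc
    return ".google." in host or host.startswith("google.")


def render_full_article(slug):   # placeholder for the real article template
    return f"full article: {slug}"


def render_snippet(slug):        # placeholder for the snippet preview
    return f"snippet preview: {slug}"


@app.route("/articles/<slug>")
def article(slug):
    if request.cookies.get("subscriber") == "1":   # placeholder auth check
        return render_full_article(slug)

    clicks = int(request.cookies.get("fcf_clicks", "0"))
    if came_from_google(request) and clicks < FREE_CLICKS:
        # FCF: search visitors get the full page for their first few clicks
        resp = make_response(render_full_article(slug))
        resp.set_cookie("fcf_clicks", str(clicks + 1))
        return resp

    return render_snippet(slug)  # everyone else sees the preview
```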
In addition to this, should we whitelist search engine bots so that they can index the full content? I assume this is not required for Google, which can use FCF to index our content, but what about other search engines? Is this considered cloaking?
My gut says that whitelisting bots past the paywall is bad practice, but I wanted to confirm; any evidence or references would be amazing.
You should only whitelist bots if you are prepared to offer visitors who come from the search engine the ability to see the content without paying. This is the first-click-free deal: you allow visitors to read some content for free, Google gets to index your content, and you get traffic from Google.
Search engines will get mad if you let them crawl and index content that is behind a paywall. If visitors can't read it for free, the search engines don't want to crawl it and have it indexed. That is a form of cloaking.
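To make the distinction concrete, here is a rough sketch of the consistent rule (the user-agent tokens, referrer checks, and helper name are illustrative): the crawler and the visitor who clicks through from that search engine are treated the same. Whitelisting only the bot while every human hits the paywall is exactly the mismatch that counts as cloaking.

```python
BOT_TOKENS = ("Googlebot", "bingbot")          # illustrative user-agent tokens
SEARCH_REFERRERS = (".google.", "bing.com")    # illustrative referrer checks


def may_see_full_content(user_agent: str, referrer: str, free_clicks_left: int) -> bool:
    """Allow full content to crawlers *and* to search-referred readers.

    Serving the full page only when the bot condition holds, while human
    visitors always hit the paywall, is the cloaking scenario above.
    """
    is_known_bot = any(token in user_agent for token in BOT_TOKENS)
    from_search = any(ref in referrer for ref in SEARCH_REFERRERS)
    return is_known_bot or (from_search and free_clicks_left > 0)


# A Google-referred reader with free clicks left sees the same content
# the crawler was allowed to index.
print(may_see_full_content("Mozilla/5.0", "https://www.google.com/", 3))  # True
print(may_see_full_content("Googlebot/2.1", "", 0))                       # True
```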