Is it cheating to serve different versions of the same content to users and crawlers?
I have a "smart" template for my website. For human users, the server renders the page content on the initial hit, but subsequent pages are loaded like an SPA. For a crawler, every page is rendered at the server as in a "normal" site. For both, the final result looks the same. Since this might have some SEO consequence, is it considered cheating?
Edit:
What I'm doing is almost like this: github.com/spikebrehm/isomorphic-tutorial
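To make it concrete, a stripped-down sketch of my server side looks roughly like this (Express is used here just for illustration, and renderPage is a stand-in for my actual template code):

var express = require('express');
var app = express();

// Every URL can be rendered fully on the server, so the initial hit
// (and every crawler request) gets complete HTML. The client bundle
// then takes over later navigation and loads pages SPA-style.
app.get('*', function (req, res) {
  res.send(renderPage(req.path)); // renderPage = placeholder for my templates
});

app.listen(3000);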
It is against Google's guidelines to serve crawlers different pages than the ones users see. Former head of webspam Matt Cutts went on record about this in an old webmaster video as well.
Treating Google and other bots as users
You should always treat Google exactly the same as your users. I can understand the appeal of a seamless experience where visitors never leave the current page while you still get the SEO benefit of having multiple pages, but there are many ways to make a website operate this way without detecting crawlers and treating them differently.
The biggest problem is backlinks
The biggest problem you face is backlink SEO: when users are only ever served one page, they may not be able to copy a link to the page they are currently on, which hurts both user experience and SEO.
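For example, a history.pushState approach keeps the address bar in sync with the loaded content, so every state the user sees has a real, copyable URL. A rough sketch (the ?partial=1 endpoint is hypothetical, standing in for however your server returns page fragments):

// Load a page fragment over AJAX (hypothetical partial endpoint).
function fetchPartial(path, cb) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', path + '?partial=1');
  xhr.onload = function () { cb(xhr.responseText); };
  xhr.send();
}

// Intercept internal link clicks, swap in the new content, and update
// the address bar so the URL always matches what the user is viewing.
document.addEventListener('click', function (e) {
  var link = e.target.closest('a');
  if (!link || link.host !== location.host) return; // external link: let it go
  e.preventDefault();
  fetchPartial(link.pathname, function (html) {
    document.getElementById('content').innerHTML = html;
    history.pushState({}, '', link.pathname); // real, linkable URL
  });
});

// Handle back/forward so browser history still works.
window.addEventListener('popstate', function () {
  fetchPartial(location.pathname, function (html) {
    document.getElementById('content').innerHTML = html;
  });
});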
Various methods
There are various methods that are compatible with Google. One is page-loading JavaScript that fetches the page the user is navigating to and then displays it behind a loader, which can give the impression of being seamless. The other method accepted by Google is hashbang URLs using Google's AJAX crawling scheme.

Note that with the AJAX crawling scheme, users should still be able to link to what they are looking at: you serve a pretty hashbang URL to users and an "ugly" URL to Google, but importantly, every page must be accessible to Google. If you want the URLs without the hashbang to be the ones indexed, use a canonical tag on your hashbang URLs, which would look something like this: <link rel="canonical" href="http://example.com/not-hash-bang-url" />
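On the server, the scheme roughly works like this: when Google encounters http://example.com/#!/about, it instead requests http://example.com/?_escaped_fragment_=/about, and you respond with a full HTML snapshot. A minimal sketch, assuming Express and hypothetical renderSnapshot/renderShell helpers:

var express = require('express');
var app = express();

app.get('/', function (req, res) {
  var fragment = req.query._escaped_fragment_;
  if (fragment !== undefined) {
    // The "ugly" URL Google fetches: answer with a full HTML snapshot.
    return res.send(renderSnapshot(fragment));
  }
  // Normal visitors get the shell page that runs the hashbang client app.
  res.send(renderShell());
});

app.listen(3000);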