Is this form of cloaking likely to be penalised?

@Murphy175

Posted in: #AspNet #Javascript #Seo

I'm looking to create a website which is considerably JavaScript-heavy, built with Backbone.js, with most content being passed as JSON and loaded via Backbone. I just need some advice or opinions on the likelihood of my website being penalised for using the method of serving plain HTML (text, images, everything) to search engine bots and a JS front-end version to normal users.

This is my basic plan for my site:

I plan on having the first request to any page be HTML, which will only give about 1/4 of the page, and thereafter load the last 3/4 with Backbone.js. Therefore non-JavaScript users get a 'bit' of the experience.

Once a new user has visited and been detected as having JS, a cookie will be saved on their machine and requests from then on will be AJAX only. Example:

if (AJAX || HasJSCookie) {
    // Pass JSON
}
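
In ASP.NET MVC terms, a rough sketch of that check might look like the following (the "HasJS" cookie name, the action, and the LoadPage helper are all placeholders for illustration, not a definitive implementation):

using System.Web.Mvc;

public class PagesController : Controller
{
    public ActionResult Show(string slug)
    {
        // Hypothetical cookie set by the front end once JS support is detected
        bool hasJsCookie = Request.Cookies["HasJS"] != null;

        if (Request.IsAjaxRequest() || hasJsCookie)
        {
            // Return the page content as JSON for the Backbone front end
            return Json(LoadPage(slug), JsonRequestBehavior.AllowGet);
        }

        // First visit or no JS: render the server-side HTML version
        return View(LoadPage(slug));
    }

    private object LoadPage(string slug)
    {
        // Placeholder for whatever actually loads the page content
        return new { Slug = slug };
    }
}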


Serving content to search engines:

That entire experience of loading via AJAX will be stripped if, for example, Googlebot is detected; the same content will be served, but all as plain HTML.

I thought about just allowing search engines to index the first 1/4 of the content, but as I'm concerned about internal links and picking up every bit of content, I thought it would be better to give search engines the entire content.

I plan to do this by checking the user agent against a list of known bots to determine whether the request is from a bot or not.

if (Bot) {
    // Serve plain HTML
}
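
A minimal sketch of that user-agent check, assuming a hand-maintained token list (the list below is illustrative, not exhaustive):

using System.Linq;

public static class BotDetector
{
    // Illustrative list of common crawler user-agent tokens; a real list would be longer
    private static readonly string[] BotTokens =
    {
        "googlebot", "bingbot", "slurp", "duckduckbot", "baiduspider", "yandexbot"
    };

    public static bool IsBot(string userAgent)
    {
        if (string.IsNullOrEmpty(userAgent))
            return false;

        string ua = userAgent.ToLowerInvariant();
        return BotTokens.Any(token => ua.Contains(token));
    }
}

In the controller this could then be wired up as something like if (BotDetector.IsBot(Request.UserAgent)) { return View("StaticVersion", model); }.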


In addition, I plan to have clean URLs for the entire website despite it being full AJAX, so providing AJAX content at example.com/#/page and normal HTML at example.com/page is kind of out of the question. I would rather avoid the practice of using # when technology such as the HTML5 History API (pushState) is around.

So my question is really just asking the opinion of the masses: is it likely that my website will be penalised?

And can you suggest an alternative that avoids the 'noscript' method?


3 Comments


 

@Becky754

Google actually offers a mechanism for this. Check it out here: developers.google.com/webmasters/ajax-crawling/docs/getting-started
The idea is that you can continue to offer up AJAX "injected" content on your site, but you are essentially providing a means for Google to also get at it and an API of sorts to let them know how to get it.
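
That scheme works roughly like this: your page opts in (for pushState-style URLs, by adding <meta name="fragment" content="!"> to the head), and the crawler then re-requests the same URL with an _escaped_fragment_ query parameter, to which you respond with a pre-rendered HTML snapshot. A rough ASP.NET sketch of honouring that parameter (the view names and LoadPage helper are placeholders):

public ActionResult Show(string slug)
{
    if (Request.QueryString["_escaped_fragment_"] != null)
    {
        // The crawler is asking for the HTML snapshot of this URL
        return View("HtmlSnapshot", LoadPage(slug));
    }

    // Normal visitors get the Backbone-driven shell
    return View("AppShell", LoadPage(slug));
}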

John Conde is very right, but I think even Google realises that JavaScript is becoming essential on the web, and they are introducing a means of indexing injected content.



 

@Cody1181609

It's much better if you can include a link (even a small one), or preferably one in your sitemap, to the HTML version instead of using a redirect meant only for the bot.

Personally, I think the only reason to deal with the bot is a mobile version of your website.



 

@Kevin317

As long as the content is the same, this is perfectly fine, as cloaking is only an issue when it's used to serve the search engines different content for the purpose of manipulating your rankings.

However, you're doing it wrong. You shouldn't be looking for search engine bots to serve up static content, you should be serving up static content to everybody who doesn't use JavaScript. It's called progressive enhancement. You should be assuming that every visitor who visits your website doesn't support JavaScript. Then, if they do, enhance their experience by using JavaScript to dynamically deliver content.


