How bad is cloaking for my site's search engine rankings?

@Phylliss660

Posted in: #Cloaking #SearchEngines #Seo

My website is very JavaScript-heavy: all of the site's content is loaded asynchronously using JavaScript.
So my developers implemented a workaround: the server sends a static version of the webpage when it is requested by Googlebot. I recently found out that this is called cloaking and is considered a black-hat SEO technique.
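
To make that concrete, the check works roughly like this (a simplified Express/TypeScript sketch, not our exact code; the file paths and route are made up for illustration):

import express from "express";
import path from "path";

const app = express();

// Illustrative paths only; the real site's layout differs.
const STATIC_SNAPSHOT = path.join(__dirname, "snapshots", "index.html"); // pre-rendered HTML
const APP_SHELL = path.join(__dirname, "public", "index.html"); // JavaScript-driven page

app.get("/", (req, res) => {
  const ua = req.get("user-agent") ?? "";
  if (/Googlebot/i.test(ua)) {
    res.sendFile(STATIC_SNAPSHOT); // the bot gets the static snapshot
  } else {
    res.sendFile(APP_SHELL); // everyone else gets the JS app
  }
});

app.listen(3000);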

I'm wondering: what are the consequences?
Is there any way to measure the effect of using this technique on the site's search rankings?


3 Comments


@Phylliss660

You need to fix that as soon as possible. Google is very strict about these kinds of things. It is considered a deceptive tactic by Google, and they will penalize you for it; if anything, you are currently giving Googlebot higher priority than your actual users.


@Angela700

It's understandable if you're deliberately serving special static pages (some kind of error page) to bad bots, or even to visitors who use content-scraping programs to copy a website, but for everyone else, just serve the same content to search engines as you would serve to normal users.

If JavaScript must generate something, don't make it generate the important textual content or the HTML structure, especially since it's the textual content that search engines use to help rank your pages higher. Instead, make it change the little details such as text underlining, etc., as in the sketch below.
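
As a sketch of what "little details" means, here is the kind of thing JavaScript can safely do after the server has already delivered the full content (TypeScript for the browser; the "external-link" class name is just an example):

// The full text is already in the server-rendered HTML; this script only
// adds a presentational touch, so search engines see the same content
// whether or not it runs.
document.addEventListener("DOMContentLoaded", () => {
  document
    .querySelectorAll<HTMLAnchorElement>("a.external-link")
    .forEach((link) => {
      link.style.textDecoration = "underline";
    });
});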

Put users first, especially those with JavaScript disabled, and treat search engine bots as normal users.


@Chiappetta492

You should not serve the static version only when Googlebot (or any other bot, for that matter) comes by; serve it to anyone who doesn't have JavaScript turned on.

The most obvious check Google can do is to send a different set of headers and see whether you give a specific result for Googlebot only: if( bot-headers-result != normal-headers-result ){ punish(); }
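
You can run roughly the same comparison against your own site with a sketch like this (TypeScript on Node 18+, which has a global fetch; the URL is a placeholder):

// Request the same URL with a Googlebot User-Agent and with a normal
// browser one, then compare what comes back.
async function checkForCloaking(url: string): Promise<void> {
  const botResponse = await fetch(url, {
    headers: { "User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)" },
  });
  const normalResponse = await fetch(url, {
    headers: { "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" },
  });

  const botBody = await botResponse.text();
  const normalBody = await normalResponse.text();

  if (botBody !== normalBody) {
    console.log("Different content is served to the bot user-agent."); // punish();
  } else {
    console.log("Same content is served to both.");
  }
}

// Placeholder URL; point it at your own pages.
checkForCloaking("https://example.com/").catch(console.error);

A byte-for-byte comparison is stricter than necessary, since dynamic pages can vary between requests (timestamps, tokens), but it shows the idea.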

The other reason is that the few people who legitimately do not have JavaScript (rare, but they exist) will now have (some) functionality.
