
How do I prevent Google from indexing text that is generated by JavaScript?

@Megan663

Posted in: #Google #Googlebot #Indexing #Javascript

Google is executing the JavaScript on my password creator website and indexing the random passwords that get generated by the JavaScript.



The random text that is generated could look spammy to Google, and I'm afraid it will hurt rankings. Each generated password is seen only by Googlebot (every user sees their own freshly generated text), so in a sense it is cloaking. I use some common words as password suggestions, and there is a chance they could come out in the order of a popular search phrase. Also, every time Googlebot renders the page it will get different text, so my page will end up looking like it changes far more frequently than it actually does.

The passwords are also showing up in the search results in the text snippet under the link for normal searches. They look ugly there, so I'd also like to prevent Google from using the generated text for the search snippets.

How do I prevent Googlebot from indexing the password content on my site? I'd like to have the rest of the content on the page indexed.


3 Comments


 

@Annie201

Is there a way you can take advantage of iframes?

Google can fetch and index the source page of an iframe as its own document, but as far as I know it doesn't index the iframe's content as part of the parent page.

So the iframe's source page should be marked noindex.
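
A minimal sketch of that setup, assuming a hypothetical /generator.html page and /generate-passwords.js script (both names are my own placeholders):

<!-- Parent page: stays indexable, embeds the generator in an iframe -->
<iframe src="/generator.html" title="Password generator"></iframe>

<!-- /generator.html: the framed page opts itself out of indexing -->
<html>
<head>
<meta name="robots" content="noindex">
</head>
<body>
<div id="passwords"></div>
<script src="/generate-passwords.js"></script>
</body>
</html>


The rest of the parent page remains indexable, while the noindex directive travels with the generated content.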



 

@Correia994

robots.txt can block JavaScript files from Googlebot. www.robotstxt.org/ has more information about how to construct a robots.txt file.

You could put your JavaScript that shows the password into an external JavaScript file (called showlists.js):

$(document).ready(function(){
    showLists(); // showLists(), defined elsewhere in this file, generates the passwords and writes them into the page
});


Call that JavaScript file in the page head:

<script src="/showlists.js"></script>


Then disallow it in robots.txt:

User-agent: *
Disallow: /showlists.js


Googlebot would still be able to crawl the page, but it would not fetch the blocked script, so it would never see the passwords rendered on the page.

This is a Google-approved way of hiding page elements from Googlebot without resorting to cloaking. The disadvantage is that it requires an external JavaScript file, which can make the page slower to load.

You could test your robots.txt file with online testers such as tools.seobook.com/robots-txt/analyzer/



 

@Heady270

I created a cloaking function in JavaScript:

function isBot(){
    // Match common crawler tokens in the user agent string
    return /bot|crawl|slurp|spider/i.test(navigator.userAgent);
}


Then I use that function on page load either to show the passwords or to show a message explaining why no passwords were generated:

$(document).ready(function(){
    if (isBot()){
        $('#isbot').show(); // reveal the explanation meant for bots
    } else {
        showLists(); // generate and display passwords for human visitors
    }
});
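
For completeness, here is a sketch of the markup those selectors assume; the #passwords container and the message wording are my own hypothetical additions:

<div id="passwords"></div>
<!-- Hidden by default; revealed only to bots via $('#isbot').show() -->
<div id="isbot" style="display: none;">
Passwords are only generated for human visitors, not for search engine crawlers.
</div>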


Now when I use the "Fetch and Render" feature of "Fetch as Google" from Webmaster Tools, the "This is how Googlebot saw the page" rendering shows the bot message, while the "This is how a visitor to your website would have seen the page" rendering shows the generated passwords.

This solution does involve showing different text to visitors than to Google, and it could be considered a violation of Google's guidelines because it is technically cloaking.
