Are Googlebot and other search engine crawlers able to detect if an element is hidden after page load by JavaScript?

@Murphy175

Posted in: #Googlebot #HiddenText #Javascript #WebCrawlers

So, we all know Google hates hidden elements, and rightly so. But I am curious: what if the element is normally present in the DOM in its static form, but is hidden by JavaScript after the page loads?

AFAIK Googlebot (and other bots for that matter) cannot interpret JavaScript, so they should not detect this tampering, right?

And consider the case where it is not obvious keyword stuffing or spam, but rather text that is hidden initially and then inserted into another part of the page on some relevant, user-initiated event.
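For example, roughly something like this (the element and IDs are just placeholders):

    <!-- The text is present in the static HTML that a non-rendering crawler would see -->
    <div id="extra-info">Some text that is relevant, but not needed right away.</div>

    <script>
      // After the page loads, JavaScript hides the element;
      // it is only revealed later on some user event.
      window.addEventListener('load', function () {
        document.getElementById('extra-info').style.display = 'none';
      });
    </script>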


@BetL925

Googlebot now renders pages and views the page as a user sees it when it loads, including applying CSS and running JavaScript. Google will detect text that is hidden using either CSS or JavaScript.

Google will penalize a site for hiding keyword rich text that cannot be viewed by users. They call the practice cloaking: showing content to search engine bots, but not to users.

Note that hiding text that can be shown via user interaction will not be penalized. There are many legitimate reasons to hide text initially but show it when the user needs it. Popular interactive elements such as carousels, lightboxes, and fly-outs need to hide some content initially.
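As a rough sketch, a legitimate show/hide pattern looks something like this (element names are illustrative, not from any particular library):

    <button id="toggle-details">Show details</button>
    <div id="details" hidden>
      Extra information that users only need on demand.
    </div>

    <script>
      // The content is hidden on load, but any user can reveal it with one click,
      // so users and bots can both reach the same text. This is not cloaking.
      document.getElementById('toggle-details').addEventListener('click', function () {
        var details = document.getElementById('details');
        details.hidden = !details.hidden;
        this.textContent = details.hidden ? 'Show details' : 'Hide details';
      });
    </script>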

When text is initially hidden, but can be viewed by users through interaction, Google may choose not to index the words in that text. If it does index them, it may give them less relevance weight than words shown to users on load.

In the last few years, as Googlebot has gained the ability to render pages, Google has changed its guidelines regarding AJAX. It used to be that if you wanted a crawlable AJAX website, you had to implement "hash bang" URLs with HTML snapshots. Now Google can crawl AJAX websites that implement "push state" to change the URL as the content on the page changes. Push state is much easier to implement than HTML snapshots because it doesn't require server-side HTML production that mimics the client-side JavaScript.
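A minimal sketch of the push state approach, assuming a hypothetical /articles/ endpoint and a #content container:

    // Fetch a content fragment and inject it into the page.
    function renderArticle(slug) {
      return fetch('/articles/' + slug + '.html')
        .then(function (response) { return response.text(); })
        .then(function (html) {
          document.getElementById('content').innerHTML = html;
        });
    }

    // On a normal navigation, also push a new URL so each piece of content
    // gets its own crawlable, linkable address.
    function loadArticle(slug) {
      renderArticle(slug).then(function () {
        history.pushState({ slug: slug }, '', '/articles/' + slug);
      });
    }

    // On back/forward, re-render the matching content without pushing a new entry.
    window.addEventListener('popstate', function (event) {
      if (event.state && event.state.slug) {
        renderArticle(event.state.slug);
      }
    });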

Even though Google can now crawl AJAX pretty well, if you want an AJAX site well indexed, you have to provide a different URL for every piece of content. Single page applications where the URL never changes are still not SEO friendly because Google has no way of deep linking into the content that matches searches.
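To make those per-content URLs deep-linkable, the page also has to restore the right content when it is loaded directly at one of them. A sketch, reusing the illustrative renderArticle() and URL pattern from above and assuming the server returns the same application shell for those paths:

    // On initial load, read the URL and render the content it refers to,
    // so a link from a search result lands on the matching view.
    window.addEventListener('DOMContentLoaded', function () {
      var match = window.location.pathname.match(/^\/articles\/([\w-]+)$/);
      if (match) {
        renderArticle(match[1]);
      }
    });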
