
SEO and JavaScript since Google admits JS parsing

@Speyer207

Posted in: #SearchEngines #Seo

We're planning on building an HTML snapshot creation service to provide the Google crawlers with static HTML of our JS-driven single-page application.

Is this still necessary and/or encouraged since Google openly admits it is parsing JS now?

How should I tackle this evaluation?


Are there tools that provide data on when snapshots are needed and when Google's parsing is sufficient?
Or are snapshots still worthwhile simply because serving static HTML is much faster than waiting for incremental JS rendering?


2 Comments


 

@Kevin317

Is this still necessary and/or encouraged since Google openly admits it is parsing JS now?


Yes, it's still necessary. Google is not the only web crawler, and universal JavaScript support for indexing across crawlers isn't going to happen anytime soon.


Are there tools to provide data on when it's needed to provide snapshots and when google has sufficient parsing?


There are several tools out there for baking dynamic websites into static HTML. Some are designed to run in NodeJS, some in the web browser, and others are offered as paid web services.

You'll need to find and use the tool that works best for your website.

Two popular ones are phantomjs.org/ and zombie.labnotes.org/
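
Since PhantomJS is mentioned, here is a minimal sketch of what one snapshot step could look like with it. The URL, the fixed two-second wait, and the output file name are placeholder assumptions; a real service would need a more reliable signal that the single-page app has finished rendering.

// Minimal PhantomJS sketch: load a JS-driven page, wait for it to render,
// then save the resulting DOM as a static HTML snapshot.
// The URL, wait time, and output file name are placeholders.
var page = require('webpage').create();
var fs = require('fs');

page.open('http://example.com/#!/products', function (status) {
  if (status !== 'success') {
    console.log('Failed to load page');
    phantom.exit(1);
  }
  // Give the single-page app time to fetch its data and render.
  setTimeout(function () {
    // page.content holds the rendered HTML, ready to serve to crawlers.
    fs.write('snapshot.html', page.content, 'w');
    phantom.exit();
  }, 2000);
});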



 

@Bryan171

Google admits to parsing JS, but it's not nearly as advanced as one might think. It handles basic JavaScript and maybe a little more, but there's no way it can follow an event that sets a timer, which fires a JSON call, which then updates the page.

If SEO is really important, stick to the current standard: no JS for SEO-critical content.
Make sure 90% of your content is reachable via the 'normal' procedures.



A small example of how to combine the two: I'm working on a webshop. I wanted it to feel faster, so switching between overview pages happens via AJAX. This is something a crawler can't do (the clickable options aren't anchors).

To fix this, I use history.pushState() to change the URL, so it looks as if the page changed the 'old' way. Google might detect this, but to be safe I also made the changed URL work on its own: if I pushState from /a to /b, then /b is accessible if a user types it in directly.
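
As a rough sketch of that pattern (not the poster's actual code), the overview links could be wired up like the snippet below. The data-page attribute, the '#content' container, and the use of fetch() are assumptions for illustration; the important part is that the same /b URL is also rendered by the server when requested directly.

// Rough sketch of AJAX navigation combined with history.pushState().
// '[data-page]', '#content', and the URLs are hypothetical; the server
// must also serve /b directly so typed-in URLs (and crawlers) still work.
document.querySelectorAll('[data-page]').forEach(function (el) {
  el.addEventListener('click', function (event) {
    event.preventDefault();
    var url = el.getAttribute('data-page'); // e.g. '/b'
    fetch(url)
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.querySelector('#content').innerHTML = html;
        history.pushState({ url: url }, '', url); // address bar now shows /b
      });
  });
});

// Handle back/forward so navigation keeps behaving the 'old' way.
window.addEventListener('popstate', function (event) {
  var url = (event.state && event.state.url) || location.pathname;
  fetch(url)
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.querySelector('#content').innerHTML = html;
    });
});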
