Google-SEO-friendly AJAX site since Oct 2015

@Jamie184

Posted in: #Ajax #CrawlableAjax #GoogleAnalytics #GoogleSearch #GoogleSearchConsole

Since Oct 2015, Google has been able to crawl AJAX sites automatically (so the recommendations made in older articles like this are officially deprecated).

But I'm still having trouble getting Google to recognize new pages. My homepage, example.com, links to pages like example.com/?pageID=1234.
The links to those pages are JavaScript handlers bound with .on('mousedown', function() {}) to certain elements, which at a high level do the following:

// 1) load some further HTML with jQuery:
$('#div').load("aPage.html");

// 2) update the URL history and document title
window.history.pushState("string", "Title", "?pageID=" + pageIDVariable);
document.title = titleOfPageVariable;

// 3) push a new page into Google Analytics...
ga('set', {
  page: '/?pageID=' + pageIDVariable,
  title: titleOfPageVariable
});
// ...and send the pageview (assumed to be present somewhere, since the
// pages do show up in the GA reports; ga('set') alone records nothing)
ga('send', 'pageview');


But after being live for a little while, googling example.com returns nothing other than my homepage, and the "Search Traffic -> Search Analytics" section of Google Search Console shows 0 impressions, 0 clicks, etc. So Google simply hasn't indexed those pages yet (which the Index Status report in Search Console also confirms).

Google Analytics, on the other hand, shows the pages I'm expecting (under Behavior -> Site Content -> All Pages).

Is there something fundamentally wrong with my setup? Or is it just a matter of time before Google starts indexing the individual pages?

I haven't set up any "URL Parameters" in Search Console, since I didn't think it was needed in this case.


1 Comment


@Si4351233

The first thing to check is that Google can actually access your CSS and JavaScript files. Googlebot needs access so that it can see what the JavaScript and CSS actually do to the page as links are clicked. Think of it this way: whatever the end user's browser needs to access, Google needs to access too. Many sites still block static content in their robots.txt file so that the script files and stylesheets are not indexed; an example of the difference is sketched below.
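For illustration, a hedged sketch of the two robots.txt variants (the /js/ and /css/ directories are hypothetical example paths; use whatever paths your site actually serves assets from):

# BAD (don't do this): hides the scripts and stylesheets
# that Googlebot needs in order to render the page
User-agent: *
Disallow: /js/
Disallow: /css/

# BETTER: explicitly allow the static assets
User-agent: *
Allow: /js/
Allow: /css/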

The second thing to check is whether any errors at all are being thrown in your JavaScript console. The script may work in your browser, but errors being thrown could still cause problems for the crawlers.
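One way to catch errors that only occur outside your own browser is a global error handler. This is a minimal sketch; the /log-client-error endpoint is a hypothetical one you would have to add on your own server:

// Report any uncaught error to the server, so that problems occurring in
// other environments (including crawler rendering) still show up in logs.
window.onerror = function (message, source, lineno, colno) {
  // '/log-client-error' is a hypothetical logging endpoint
  $.post('/log-client-error', {
    message: message,
    source: source,
    line: lineno,
    col: colno
  });
  return false; // let the browser's default error handling continue
};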

You say "after being live a little while" but don't define how long a little while is. Google can take anywhere from a few hours to 2+ weeks to recrawl a site. If you have a sitemap, you can update and resubmit it to queue up a re-crawl, but when it actually gets done is decided by Google's all-important queuing algorithms.

Another option is to do a Fetch as Google to see how Google sees the pages, and to try fetching a few of the JavaScript-triggered pages. I can see from your code that you push URL parameters into the address when running the JavaScript, so conversely I am assuming that you can also detect those URLs in your JavaScript in case someone goes to a specific page directly rather than reaching it through a link. Try accessing the pages that way as well, and check that each one looks the way it should, with the correct content shown and no errors thrown. A sketch of that detection follows.
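A minimal sketch of the detection, assuming a hypothetical loadPage() function standing in for whatever routine your mousedown handlers run (URLSearchParams is used for brevity and may need a fallback in older browsers):

// On initial load, check whether the URL already carries a pageID,
// e.g. a visitor or crawler landing on example.com/?pageID=1234 directly.
$(function () {
  var pageID = new URLSearchParams(window.location.search).get('pageID');
  if (pageID) {
    loadPage(pageID); // hypothetical: same routine the click handlers use
  }
});

// Handle back/forward navigation after pushState as well
window.addEventListener('popstate', function () {
  var pageID = new URLSearchParams(window.location.search).get('pageID');
  loadPage(pageID || 'home'); // 'home' as a hypothetical default page
});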


