Serve different content for web crawlers to fix SEO for SPA?

@Sue5673885

Posted in: #Seo #WebCrawlers

I have my site built as an SPA using Knockout. The main page populates a list of items dynamically as the user scrolls, and each item has a details page which is also loaded dynamically. I have implemented it so that each details page has an explicit URL, which is handled by the same main page if navigated to directly.

Only now have I realized all the problems dynamically generated sites have with SEO. Since all items are generated client-side, the web crawler sees basically nothing. I made an attempt at rendering links to all the details pages server-side, but since those are then also generated dynamically with Knockout, a web crawler still sees nothing.

Question: Can I serve a simpler, server-rendered page made specifically for web crawlers? This simple page could contain all items without any dynamic loading, with real links to details pages that also serve server-rendered content: a different, more basic layout with no JavaScript or Knockout. Would that be accepted by Google, Yahoo, etc., or could it be viewed as an attempt to mislead normal users? Is this a commonly used method? Is there any "standardized" way of implementing it, for instance a subdomain like seo.mysite.com?


@BetL925

Serving different content to search engine spiders is known as "cloaking" and it is specifically against the webmaster guidelines published by the search engines:


Some examples of cloaking include:

Serving a page of HTML text to search engines, while showing a page of images or Flash to users




Google does recommend that if your site is heavily JavaScript-dependent, you can use the <noscript> tag:


Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what’s contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser.



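For illustration, here is a minimal sketch of what that could look like for a Knockout-rendered list (the bindings, URLs and item names are made up, not taken from your site): the same links are repeated inside <noscript> so that a visitor or crawler without JavaScript still sees them.

    <!-- Knockout renders the item list client-side for normal visitors -->
    <ul data-bind="foreach: items">
      <li><a data-bind="attr: { href: detailsUrl }, text: name"></a></li>
    </ul>

    <!-- Server-rendered fallback with exactly the same content -->
    <noscript>
      <ul>
        <li><a href="/items/1">Item one</a></li>
        <li><a href="/items/2">Item two</a></li>
      </ul>
    </noscript>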

Google also publishes a guide to making AJAX-powered sites crawlable. To make this work, you need to:


Indicate to crawlers that your site supports crawlable AJAX by using #! in your URLs where you would normally use just a #.
Use <meta name="fragment" content="!"> for pages (such as your home page) that don't have a # in the URL
Set up your server to handle requests for URLs that contain _escaped_fragment_ (see the sketch after this list)



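To give a rough idea of that last step, here is a minimal sketch of a Node/Express handler (the route, port and renderSnapshotHtml helper are hypothetical, not part of your site):

    var express = require('express');
    var app = express();

    // Crawlers following the AJAX crawling scheme rewrite
    // example.com/#!items/42 into example.com/?_escaped_fragment_=items/42,
    // so the server can return a prerendered HTML snapshot instead of the SPA shell.
    app.get('/', function (req, res) {
      var fragment = req.query._escaped_fragment_;
      if (fragment !== undefined) {
        // Hypothetical helper that renders static HTML for the requested fragment.
        res.send(renderSnapshotHtml(fragment));
      } else {
        // Normal visitors get the Knockout-powered single page app.
        res.sendFile(__dirname + '/index.html');
      }
    });

    app.listen(3000);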

Another way to implement this would be to use JavaScript's "pushState". pushState is an alternative to the "hash bang" AJAX crawling that I outlined above. Google's Matt Cutts recently said:


A correctly implemented site that uses pushstate typically doesn't need any extra support for us to be able to crawl it.


Here is a guide from moz.com on using pushState to make a crawlable, client-side powered website that is still SEO friendly.


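As a rough sketch (the view model and URL scheme here are assumptions, not taken from your code), navigating within a Knockout SPA could put the real details-page URLs in the address bar like this:

    // Called when the user opens an item from the list.
    function openDetails(item) {
      // Put the item's real URL in the address bar without reloading the page.
      history.pushState({ id: item.id }, '', '/items/' + item.id);
      viewModel.selectedItem(item); // hypothetical Knockout observable
    }

    // Handle the browser's back/forward buttons.
    window.addEventListener('popstate', function (event) {
      if (event.state && event.state.id) {
        viewModel.loadItem(event.state.id); // hypothetical loader
      } else {
        viewModel.showList();
      }
    });

This only helps crawlers because each of those URLs can also be requested directly from the server, which you say you already support.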

Since you already have URLs for each of your product pages, one quick thing that you should do is create an XML sitemap of all those product URLs. If you do that, Google will be able to find all those URLs and index them.

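As a rough sketch (the URLs are placeholders; in practice you would pull them from the same data source that feeds the Knockout list), a small Node script could generate the sitemap:

    var fs = require('fs');

    // Placeholder list of product/details page URLs.
    var urls = [
      'http://www.mysite.com/items/1',
      'http://www.mysite.com/items/2'
    ];

    var xml = '<?xml version="1.0" encoding="UTF-8"?>\n' +
      '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
      urls.map(function (url) {
        return '  <url><loc>' + url + '</loc></url>\n';
      }).join('') +
      '</urlset>\n';

    fs.writeFileSync('sitemap.xml', xml);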
The only caveat is that just having those pages in a sitemap will not help them rank well. You would still need to implement a way for Googlebot to be able to see links to them from other places on your site if they are to rank better than your competitors' pages in the search results.
