
Angular App and SEO Crawlability - Without Hashbangs

@Margaret670

Posted in: #Ajax #Seo

I am reading all the articles on making a site crawlable by replacing Angular hashbangs in the URL with an escaped fragment. An example of such a tutorial:
lawsonry.com/2014/05/diy-angularjs-seo-with-phantomjs-the-easy-way/
However, the Angular app I'm maintaining is not truly a SPA (it uses no dynamic routing and no hashbangs in the URL); it has a conventional URL structure. Even so, each page's individual content is highly dynamic and is built up with client-side JavaScript and AJAX calls to a web service. So I still need to figure out a crawlability solution, but without the ability to use escaped fragments in the URL. Any advice?
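To give a concrete (simplified) picture, each page boots something like the hypothetical controller below; the module name, endpoint, and scope property are illustrative, not my actual code:

angular.module('app', []).controller('PageCtrl', function ($scope, $http) {
  // The page's content arrives via an AJAX call after the initial HTML loads,
  // which is exactly what a crawler that doesn't execute JavaScript will miss.
  $http.get('/api/page-content?page=about').then(function (response) {
    $scope.content = response.data;
  });
});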

What if I create a process that generates a sitemap.xml file nightly, and even loops through all the site URLs and prerenders them into a folder full of static HTML files? The only missing piece is how to serve up the static HTML to crawlers while leaving the existing browser experience intact for human users. Can I simply add sitewide code that detects crawlers by checking the user-agent, and have it serve up the prerendered content in those cases? Or will this raise red flags in Google's engine, making it think I'm trying to "fool" it?
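For illustration, the prerendering half of that idea could look like the following PhantomJS sketch (run as phantomjs prerender.js; urls.json and the snapshots/ folder are made-up names, not anything I have in place):

var fs = require('fs');
var webpage = require('webpage');

// Hypothetical input: a JSON array of all site URLs, e.g. built from sitemap.xml.
var urls = JSON.parse(fs.read('urls.json'));

function renderNext() {
  if (urls.length === 0) {
    phantom.exit();
    return;
  }
  var url = urls.shift();
  var page = webpage.create();
  page.open(url, function (status) {
    if (status !== 'success') {
      page.close();
      renderNext();
      return;
    }
    // Wait a few seconds so Angular's AJAX calls can finish before snapshotting.
    setTimeout(function () {
      var fileName = 'snapshots/' + url.replace(/[^a-zA-Z0-9]/g, '_') + '.html';
      fs.write(fileName, page.content, 'w');
      page.close();
      renderNext();
    }, 3000);
  });
}

renderNext();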





1 Comment


@Heady270

See Point #4 under "Transformation of the URL" in Google's crawlable AJAX documentation:


If a page has no hash fragments, but contains <meta name="fragment" content="!"> in the <head> of the HTML, the crawler will transform the URL of this page from domain[:port]/path to domain[:port]/path?_escaped_fragment_= (or domain[:port]/path?queryparams to domain[:port]/path?queryparams&_escaped_fragment_=) and will then access the transformed URL. For example, if example.com contains <meta name="fragment" content="!"> in the head, the crawler will transform this URL into example.com?_escaped_fragment_= and fetch example.com?_escaped_fragment_= from the web server.


So the magic is to use the following meta tag:

<meta name="fragment" content="!">


Regardless of whether or not the URL has a hashbang, this tag will cause Google to crawl the content it fetches via an _escaped_fragment_ request.
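Server-side, the transformed URL then has to be answered with prerendered HTML. Here is a minimal sketch of that, assuming a Node/Express server and a snapshots/ folder of prerendered pages (both assumptions for illustration, not part of Google's documentation):

var express = require('express');
var path = require('path');
var app = express();

app.use(function (req, res, next) {
  // Google rewrites /some/path to /some/path?_escaped_fragment_= when it sees
  // <meta name="fragment" content="!"> in the page's head.
  if (req.query._escaped_fragment_ !== undefined) {
    var fileName = req.path.replace(/[^a-zA-Z0-9]/g, '_') + '.html';
    return res.sendFile(path.join(__dirname, 'snapshots', fileName));
  }
  next();
});

app.use(express.static('public')); // the normal Angular app for human visitors
app.listen(8080);

The snapshot file naming here is arbitrary; it just has to match whatever scheme the prerendering step used.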

Do not attempt to use user-agent sniffing to serve different content to bots. Google considers that to be "cloaking" and will penalize sites for it.


