How can I make an AngularJS ecommerce website locally crawlable?

@LarsenBagley505

Posted in: #WebCrawlers

I'm trying to crawl our ecommerce website that runs on AngularJS in an effort to create a content inventory. I'm not really very good at coding.

So far, I've tried DeepCrawl and Screaming Frog, but I'm unable to extract the data I need. It goes as far as the Home Page and then the crawl stops.

Is there anything I would need to set up on the backend or add to the markup itself so it's crawlable?


4 Comments


@Pierce454

We can crawl AngularJS sites that have the _escaped_fragment_ solution implemented, but not without it, I'm afraid.
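To illustrate the scheme being referred to: under Google's (now deprecated) AJAX crawling scheme, a hashbang URL like `#!/products/42` maps to a query-string form that the server answers with a static snapshot. A minimal sketch of that mapping, with a hypothetical shop URL as the example:

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map a hashbang URL to the _escaped_fragment_ form that
    AJAX-crawling-aware bots request instead (Google's old scheme)."""
    if "#!" not in url:
        return url  # not a hashbang URL; nothing to rewrite
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="")

# e.g. a hashbang product page becomes a crawlable query URL
print(escaped_fragment_url("https://shop.example.com/#!/products/42"))
# -> https://shop.example.com/?_escaped_fragment_=%2Fproducts%2F42
```

The server is then expected to return prerendered HTML for any request carrying `_escaped_fragment_`. Note that Google deprecated this scheme in 2015, so it is mainly of interest for tools that still rely on it.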

There's a cracking article by a client of ours here where he explains how he indexed a site built with Angular - you might find this interesting!

Crawling JS is on the product roadmap; watch this space...


@Cofer257

Screaming Frog now has a 'rendering' configuration for crawling JavaScript frameworks such as AngularJS - www.screamingfrog.co.uk/seo-spider-6-0/.


@Miguel251

You're attempting to SEO a JavaScript-rendered site. The issue is that most crawlers don't render the full site; they skip the majority of the JavaScript. In your case, only the static text on the home page gets indexed, which is why the crawl stops there.

You can attempt to add pages individually to a sitemap and request a re-index in Search Console. Google has gotten better at this recently, but honestly it's still not quite there: rendering JavaScript sites is very resource-intensive, so crawlers still don't handle them well. As mentioned in another answer, you can use software to make the site crawlable; prerender.io is another example. What you need is to present the site to the crawler as a static site.
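The prerendering approach above usually works by inspecting each request and deciding whether to serve the normal JavaScript app or a prerendered static snapshot. A rough sketch of that routing decision (the user-agent list and extensions are illustrative assumptions, not the actual prerender.io middleware):

```python
# Hypothetical bot list; real middleware ships a much longer one.
BOT_UAS = ("googlebot", "bingbot", "yandex", "baiduspider",
           "facebookexternalhit")

def should_prerender(user_agent, path, escaped_fragment=False):
    """Decide whether a request should get prerendered static HTML.

    Static assets are always served directly; known bots (or requests
    carrying _escaped_fragment_) get the prerendered snapshot instead
    of the JavaScript app.
    """
    static_ext = (".js", ".css", ".png", ".jpg", ".svg", ".woff2")
    if path.lower().endswith(static_ext):
        return False  # never prerender assets
    if escaped_fragment:
        return True   # bot explicitly asked via the AJAX scheme
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in BOT_UAS)

print(should_prerender("Mozilla/5.0 (compatible; Googlebot/2.1)",
                       "/products/42"))  # -> True
```

Regular visitors fall through to the normal AngularJS app, so only crawlers ever see the static version.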


@Harper822

Not that this is supposed to be the place to recommend software, but I will advise using Xenu Link Sleuth, found at home.snafu.de/tilman/xenulink.html
It will crawl every single page on your site reachable from the URL you specify, with no restriction on how large the list can grow, and you can save the list as a tab-separated file.

The only way markup can control crawlability is via the robots meta tag, and that only controls search engines' ability to crawl and index your website.
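For completeness, here's a minimal sketch (using only the standard library) of how you could check what a page's robots meta tag is asking crawlers to do; the sample page is made up for illustration:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect directives from <meta name="robots" content="..."> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in a.get("content", "").split(",")]

def robots_directives(html):
    """Return the robots directives declared in an HTML document."""
    p = RobotsMetaParser()
    p.feed(html)
    return p.directives

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(robots_directives(page))  # -> ['noindex', 'nofollow']
```

If a page carries `noindex` or `nofollow`, that (not the AngularJS rendering) could also explain why a search-engine-style crawl stops early.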
