How can I make an AngularJS ecommerce website locally crawlable?
I'm trying to crawl our ecommerce website, which runs on AngularJS, to create a content inventory. I'm not very good at coding.
So far I've tried DeepCrawl and Screaming Frog, but I'm unable to extract the data I need. Both get as far as the home page and then the crawl stops.
Is there anything I would need to set up on the backend or add to the markup itself so it's crawlable?
4 Comments
We can crawl Angular sites that have the _escaped_fragment_ solution implemented, but not without it, I'm afraid (there's a sketch of that scheme after this answer).
There's a cracking article by a client of ours here where he explains how he indexed a site built with Angular - you might find this interesting!
Crawling JS is on the product roadmap; watch this space...
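To make the _escaped_fragment_ scheme concrete, here is a minimal backend sketch, not a drop-in implementation: the Express server, the ./snapshots directory of prerendered HTML, and the path-to-file naming are all assumptions for illustration.

    // Minimal sketch of the _escaped_fragment_ scheme (assumed Express setup).
    import express from "express";
    import path from "path";

    const app = express();

    // The AngularJS page opts in with <meta name="fragment" content="!">.
    // A crawler that supports the scheme then re-requests each URL with
    // ?_escaped_fragment_= appended, and the server answers with static HTML.
    app.use((req, res, next) => {
      if (req.query._escaped_fragment_ !== undefined) {
        // Map the path to a prerendered snapshot, e.g. /products -> _products.html
        // (hypothetical naming convention for this sketch).
        const name = req.path === "/" ? "index" : req.path.replace(/\//g, "_");
        return res.sendFile(path.join(__dirname, "snapshots", `${name}.html`));
      }
      next(); // regular visitors still get the client-rendered Angular app
    });

    app.use(express.static("dist")); // the built AngularJS assets
    app.listen(3000);

With something like this in place, a crawler sees plain HTML for every page while human visitors are unaffected.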
Screaming Frog now has a 'rendering' configuration for crawling JavaScript frameworks such as AngularJS - www.screamingfrog.co.uk/seo-spider-6-0/.
You're attempting to SEO a JavaScript-rendered site. The issue is that crawlers tend not to render the full site; they skip the majority of the JavaScript. In this case, only the static text on the front page has been indexed.
You can add pages individually to a sitemap and request re-indexing in Search Console. Google has gotten better at this recently, but honestly it's still not quite there; rendering JavaScript sites is very resource-intensive, and they still don't handle them well. As mentioned in another answer, you can use software to make the site crawlable; another example is prerender.io. What you need is to re-present the site to the crawler as a static site.
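If you go the prerender.io route, the integration is typically a single piece of middleware. A hedged sketch, assuming the prerender-node Express middleware (check prerender.io's docs for the current package name and options; the token is a placeholder):

    // Sketch of the prerender.io approach, assuming prerender-node.
    import express from "express";
    const prerender = require("prerender-node");

    const app = express();

    // For requests from known crawler user-agents, the middleware fetches a
    // fully rendered snapshot from the prerender service and returns that
    // static HTML; human visitors fall through to the normal Angular app.
    app.use(prerender.set("prerenderToken", "YOUR_TOKEN"));

    app.use(express.static("dist")); // the built AngularJS app
    app.listen(3000);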
Not that this is supposed to be the place to recommend software, but I'd advise using Xenu Link Sleuth, found at home.snafu.de/tilman/xenulink.html
It will crawl every page on your site reachable from the URL you specify, there's no limit on how large the list can grow, and you can save the results as a tab-separated file.
The only way markup can control crawlability is via the robots meta tag, and that only controls a search engine's ability to crawl and index your pages.
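For reference, the robots meta tag goes in the page's head; the two lines below just illustrate the standard directives:

    <!-- Blocks indexing of this page and following of its links -->
    <meta name="robots" content="noindex, nofollow">

    <!-- The permissive default; same as omitting the tag entirely -->
    <meta name="robots" content="index, follow">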