If my URLs are static, but then parsed by JavaScript, does that make them crawlable?
Let's say I have a link:
<a href="/about/">About Us</a>
But JavaScript (or jQuery) catches the click and then sets the hash based on the href attribute:
$('a').click(function(e) {
  e.preventDefault();
  // Extremely oversimplified...
  window.location.hash = $(this).attr('href');
});
Then a hashchange event handler does the general 'magic' of the Ajax requests. This way the real href is visible to crawlers, while clients with JavaScript enabled get an Ajax-based site.
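For reference, the hashchange side could be sketched roughly like this. Note that `pathFromHash` and `loadContent` are hypothetical names of my own, not anything the snippet above defines; `loadContent` stands in for whatever Ajax routine swaps the fragment into the page:

```javascript
// Turn "#/about/" back into "/about/" so the same path the server
// understands can be requested via Ajax. An empty hash falls back
// to the home page.
function pathFromHash(hash) {
  return hash && hash !== '#' ? hash.replace(/^#/, '') : '/';
}

// Browser-only wire-up: re-run the loader whenever the hash changes.
// loadContent() is a hypothetical helper that fetches the page
// fragment (e.g. via $.ajax) and inserts it into the document.
if (typeof window !== 'undefined' && typeof $ !== 'undefined') {
  $(window).on('hashchange', function () {
    loadContent(pathFromHash(window.location.hash));
  });
}
```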
Does this help with the general SEO issues that come along with hash URLs? I know hashbangs are 'OK', but as far as I know they aren't reliable?
Yes, this is search-engine friendly and a good example of progressive enhancement. Because the links are still crawlable and load the same content without JavaScript, Google, and any user with JavaScript disabled, can still find the content just fine. Users with JavaScript get the added benefit of a faster page load, since they don't have to wait for a full page reload when they click a link.