
Does e.preventDefault() on a link prevent the page from being crawled?

@Kaufman445

Posted in: #Javascript #Seo #WebCrawlers

I have a couple of links in the navigation on my homepage that lead to new pages. When a link is clicked, I want to prevent navigation to that page (e.preventDefault()) and instead load the page in a popup via AJAX. Users with JavaScript disabled would simply follow the link.
Can this hurt SEO in any way? Would those pages get crawled?
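
Roughly what I mean (the popup-link class and the showPopup() helper below are just placeholders for my own code):

```
// Intercept clicks on the navigation links and load the target page via AJAX.
// With JavaScript disabled, the plain <a href="..."> still navigates normally.
document.querySelectorAll('nav a.popup-link').forEach(function (link) {
  link.addEventListener('click', function (e) {
    e.preventDefault(); // stop the normal navigation

    fetch(link.href)
      .then(function (response) { return response.text(); })
      .then(function (html) {
        showPopup(html); // placeholder: render the fetched HTML in a popup
      });
  });
});
```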





1 Comment


@Courtney195

The pages will be found and will most likely be crawled. You're not far from the truth if you assume bots don't execute JavaScript.

A crawler works roughly like this:

1. Go to a webpage and get its contents.
2. Extract information from the page, such as keywords and other SEO signals.
3. Collect all internal and external links from the source.
4. For each link, go back to step 1.

The crawler reads the raw source, not the page as a user sees it rendered. Some bots do execute some JavaScript, but that won't stop them from finding and crawling the URLs.
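
To make this concrete, here is a toy sketch of such a crawl loop (not the code of any real crawler): it only fetches the raw HTML and never runs the page's scripts, so an e.preventDefault() handler has no effect on link discovery:

```
// Toy crawler: fetch raw HTML, pull out href attributes, repeat.
// It never executes any JavaScript found on the page.
async function crawl(startUrl, maxPages = 10) {
  const queue = [startUrl];
  const seen = new Set();

  while (queue.length > 0 && seen.size < maxPages) {
    const url = queue.shift();
    if (seen.has(url)) continue;
    seen.add(url);

    const html = await (await fetch(url)).text();        // step 1: get the source
    // step 2 (keywords and other SEO checks) is omitted here
    const links = [...html.matchAll(/href="([^"]+)"/g)]  // step 3: collect links
      .map(m => new URL(m[1], url).href);

    queue.push(...links);                                 // step 4: back to step 1
  }
  return seen;
}
```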

If you don't want bots to index a page or follow its URLs, there are the robots meta directives NOINDEX (do not index this page) and NOFOLLOW (index this page, but don't follow its URLs). See the documentation for the robots meta tag for more information.
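
For example, placed in the page's <head>:

```
<!-- Do not index this page and do not follow its links -->
<meta name="robots" content="noindex, nofollow">
```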

Another option is robots.txt, a small text file where you can list URLs or paths you don't want crawlers to visit. This relies on crawlers actually respecting it, but well-behaved bots generally do.
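
A minimal example (the /private/ path is only a placeholder):

```
# robots.txt at the site root
User-agent: *
Disallow: /private/
```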


