Fetch as Google does not result in child pages getting indexed

@Kristi941

Posted in: #Googlebot #Seo #WebCrawlers

I work on an exotic pet website which currently has several types/species of reptiles. It has done well in SERP for the first couple of types of reptiles, but I am continuing to add new species and for each of these comes the task of getting ranked and I need to figure out the best process.

We just released our 4th species, "reticulated pythons", about 2 weeks ago, and I made these pages public and in Webmaster tools did a "Fetch as Google" and index page and child pages for this page: www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index
While Google immediately indexed the index page, it did not index the couple of dozen pages linked from it, despite my checking the option to crawl child pages. I know this in two ways: first, in Google Webmaster Tools, if I look at Search Analytics > Pages filtered by "retic", only 2 pages are listed, which at least tells me Google isn't showing these pages to users. More directly, if I search Google for "site:morphmarket.com/c/reptiles/pythons/reticulated-pythons", only 7 pages are indexed.

Any advice on what could be going wrong here? I really want Google to index the top few links on this page (home, index, stores, calculator) as well as the couple dozen gene/tag links below.
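One standard way to help Google discover child pages like these, independent of "Fetch as Google", is to list them in an XML sitemap and submit it in Webmaster Tools. Below is a minimal sketch of generating such a sitemap; the paths are illustrative placeholders, not the site's real URL structure.

```python
# Sketch: build a minimal XML sitemap listing child pages of a category,
# following the sitemaps.org protocol. Paths below are hypothetical examples.
from xml.sax.saxutils import escape

BASE = "https://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons"

def build_sitemap(paths):
    """Return sitemap XML with one <url> entry per path under BASE."""
    entries = "\n".join(
        f"  <url><loc>{escape(BASE + p)}</loc></url>" for p in paths
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap(["/index", "/stores", "/calculator"]))
```

The generated file would be saved at the site root (e.g. /sitemap.xml) and submitted under Crawl > Sitemaps in Webmaster Tools, so discovery doesn't depend on Google following the on-page links.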

Update about 1 Month Later:

I'm now seeing pages getting indexed like I had hoped. It's hard to tell if submitting the URLs even helped, since Google at this point could have been picking it up on its own.


1 Comment


@Odierno851

Have you checked the "Crawl this URL and its direct links" option when submitting?



You can do that only 10 times per month. In addition, you can submit up to 500 single URLs per month through the fetch and render tool.

What I want to say is: if there are many links on that page, Google will not crawl and index all of them on the same day. It schedules its work, and your links are queued in the crawler pipeline; you just have to wait. Thousands of people submit URLs to Google, and the crawler only processes a certain number of pages at a time.
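The queueing behavior described above can be illustrated with a toy model: submitted URLs sit in a queue, and only a fixed number are crawled per cycle, so a couple dozen child pages get indexed over several cycles rather than all at once. This is purely an illustration of the scheduling idea, not Google's actual crawler; the batch size of 7 per cycle is an arbitrary assumption.

```python
# Toy model of a crawl queue: URLs are processed in fixed-size batches,
# so later URLs wait for later cycles instead of being crawled immediately.
from collections import deque

def crawl_in_cycles(urls, per_cycle):
    """Return a list of cycles, each holding the URLs crawled that cycle."""
    queue = deque(urls)
    cycles = []
    while queue:
        batch = [queue.popleft() for _ in range(min(per_cycle, len(queue)))]
        cycles.append(batch)
    return cycles

# 25 child pages at 7 crawls per cycle are spread over 4 cycles.
pages = [f"/gene-{i}" for i in range(25)]
schedule = crawl_in_cycles(pages, per_cycle=7)
print(len(schedule))  # 4
```

Under this model, the questioner's observation (only 7 of the child pages indexed after two weeks) is consistent with the rest simply still being in the queue.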


