
Having issues getting new pages indexed in Google

@Fox8124981

Posted in: #Google #GoogleIndex #Indexing #WebCrawlers

We are having issues getting our new pages crawled by Google and cannot figure out why. We've done lots of research and are still having trouble finding answers. Here's some information regarding our current situation:

- 10 million pages indexed, with tens of thousands of new pages added daily
- We recently updated the website
- Virtually no crawl of the new pages; Google is only re-crawling (or checking) old pages - about 5 million pages checked
- We doubt the Google crawler has recognized our changes, and we are worried the crawler might think it has to re-crawl all of the older pages as well
- We rolled the website back last week
- Still no crawl of the new pages


Any help on what we should do for our new pages to be crawled?




3 Comments


 

@Becky754

If your site already had 10 million pages indexed, I would say double-check that your site is crawlable and a sitemap is submitted through Google Webmaster Tools. Google does prefer sitemaps for larger sites such as yours.
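At 10 million pages you are well past the sitemap protocol's limit of 50,000 URLs per file, so you would need many sitemap files plus a sitemap index that points to them. A minimal sketch of that split (the function name, URL scheme, and file-naming pattern are illustrative, not anything Google mandates):

```python
# Sketch: split a large URL list into sitemap files of at most 50,000
# URLs each (the sitemap protocol's per-file limit) and build a sitemap
# index referencing them. make_sitemaps/base_url are illustrative names.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # per-file limit from the sitemap protocol

def make_sitemaps(urls, base_url):
    """Return (index_xml, [sitemap_xml, ...]) for the given URLs."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    sitemaps = []
    for chunk in chunks:
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in chunk
        )
        sitemaps.append(
            f'<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>'
        )
    index_entries = "\n".join(
        f"  <sitemap><loc>{escape(f'{base_url}/sitemap-{n}.xml')}</loc></sitemap>"
        for n in range(len(sitemaps))
    )
    index = (
        f'<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<sitemapindex xmlns="{SITEMAP_NS}">\n{index_entries}\n</sitemapindex>'
    )
    return index, sitemaps
```

A separate sitemap (or sitemap index entry) containing only the newly added URLs, with fresh lastmod dates, can also make it easier to see in Webmaster Tools whether Google is picking the new pages up.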

It sometimes takes time for Google to notice changes, especially when a major change affects a huge number of pages. It can take 30 days or more before Google figures out that a major change has happened. It will sample your site and then index aggressively, depending on what it sees.

I would stay the course with your updated site, making sure Google can crawl it. The more gyrations Google sees, the slower it sometimes goes, so rolling back the site may actually slow things down.
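One cheap crawlability check is confirming that robots.txt does not accidentally block the new URL paths for Googlebot, which a site update can easily introduce. A small sketch using Python's standard-library parser (the helper name is mine; it parses from a string here, whereas in practice you would point it at the live /robots.txt):

```python
# Sketch: find which of a sample of URLs are disallowed for a given
# crawler by a robots.txt, using the stdlib urllib.robotparser.
import urllib.robotparser

def blocked_urls(robots_txt, urls, agent="Googlebot"):
    """Return the subset of urls that robots.txt disallows for agent."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch(agent, u)]
```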



 

@Bryan171

You could try promoting the site a bit more to get more inbound links, and/or ensure that your server can cope with a bigger crawl. See the excerpt below (the whole linked interview is worth a read):


There is also not a hard limit on our crawl. The best way to think
about it is that the number of pages that we crawl is roughly
proportional to your PageRank. So if you have a lot of incoming links
on your root page, we'll definitely crawl that. Then your root page
may link to other pages, and those will get PageRank and we'll crawl
those as well. As you get deeper and deeper in your site, however,
PageRank tends to decline.

Another way to think about it is that the low PageRank pages on your
site are competing against a much larger pool of pages with the same
or higher PageRank. There are a large number of pages on the web that
have very little or close to zero PageRank. The pages that get linked
to a lot tend to get discovered and crawled quite quickly. The lower
PageRank pages are likely to be crawled not quite as often.

www.stonetemple.com/articles/interview-matt-cutts-012510.shtml
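On the server-capacity point, your access logs will tell you directly whether Googlebot is fetching the new pages at all, or only revisiting old ones. A rough sketch for combined-format log lines (the "/new/" path prefix is an assumption about the site's URL layout, not something from the question):

```python
# Sketch: tally Googlebot requests per day from combined-format access
# log lines, split into "new" vs "old" paths, to see whether the new
# pages are actually being fetched.
import re
from collections import Counter

LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "(?:GET|HEAD) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines, new_prefix="/new/"):
    """Count Googlebot fetches per (day, old/new-path) bucket."""
    per_day = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # skip non-matching lines and other user agents
        kind = "new" if m.group("path").startswith(new_prefix) else "old"
        per_day[(m.group("day"), kind)] += 1
    return per_day
```

If the "new" bucket stays at zero while the "old" bucket is large, that confirms the pattern described in the question rather than a measurement artifact. (To trust the user-agent string, you would also want to reverse-DNS-verify that the hits really come from Google.)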



 

@Alves908

Set up Google Webmaster Tools for your website; Google recommends it.
It will help you find crawl errors.


