
Should you let Googlebot crawl and index your entire site before updating it again?

@Annie201

Posted in: #CrawlRate #Googlebot #Seo

I have heard from various people that "Google has scanned x% of the site". Some have told me at times not to upload anything or make any changes to the site "since Google hasn't finished scanning the site". I asked one of them what he meant by that, and he told me that if you go to Webmaster Tools, the crawl statistics include a graph showing pages crawled per day. Also, by the third or fourth time a page is scanned, Google's index is updated. He even told me that it would be better to create a backlink from a site (that will interview me) in about a month, not now, since Google hasn't finished scanning the site after some big changes to the URL structure, so the backlink would go unnoticed.

Is there any logic in the above? In what ways can you use this graph, and what conclusions can you draw from it? Should someone ever stop working "because Google is scanning"?
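
For reference, the "pages crawled per day" figure does not have to come from that graph alone; you can count Googlebot requests per day in your own server logs. A minimal Python sketch, assuming a standard combined-format access log (the log path and the simple user-agent filter are illustrative assumptions):

import re
from collections import Counter
from datetime import datetime

LOG_PATH = "access.log"  # hypothetical path to your web server log

# Matches the date part of a combined-log timestamp, e.g. [10/Oct/2023:13:55:36 +0000]
DATE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):')

def googlebot_hits_per_day(path):
    """Return a Counter mapping date string -> Googlebot requests that day."""
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            # Crude filter; strict verification would also reverse-DNS the IP,
            # since the user-agent string can be spoofed.
            if "Googlebot" not in line:
                continue
            match = DATE_RE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

hits = googlebot_hits_per_day(LOG_PATH)
for day in sorted(hits, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    print(day, hits[day])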




2 Comments


@Berumen354

Google expects refresh activity on a website; it indicates the site is live, and Google has taken into account the fact that pages get updated. You might want to set up permanent (301) redirects for important pages that have moved.
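
As a quick sanity check, those permanent redirects can be verified in a few lines. A minimal Python sketch, assuming the third-party requests library and placeholder URLs:

import requests

# Old URL -> expected new URL; both are placeholders, not real pages.
MOVED = {
    "https://example.com/old-page": "https://example.com/new-page",
}

for old, expected in MOVED.items():
    # Do not follow the redirect; inspect the first response directly.
    resp = requests.get(old, allow_redirects=False, timeout=10)
    permanent = resp.status_code in (301, 308)  # permanent redirect codes
    target = resp.headers.get("Location")
    print(old, resp.status_code, target, "OK" if permanent and target == expected else "CHECK")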

If we believed such nonsense, our company website would never be updated. Google is continuously scanning from one end to the other.

Get cracking and keep it updated with new content.

You will find that most "SEO experts" come up with this stuff through guesswork and magical thinking, not long-term A/B testing.


@Jamie184

There is some logic in what I think you are saying, though mostly it really does not matter. I will give you some examples from my experience.

Yes, I have been guilty of allowing Google to index the bulk of a large update to a site before moving things around again. But I do not stop working. I create backlinks, write new content, update existing content, and so on. When I can, I like to make larger updates, meaning I like to give the search engines something to chew on. Why do I do this? The answer is simple.

Search engines like freshness. It is fine to update single pages daily if that suits how you work, but that is not how I work. I make larger, sweeping changes to particular topics or across thousands of mechanized pages. That is how I update my site: it gives search engines something to chew on, and the freshness becomes clearer. But smaller daily updates can do the same thing.

I also like to measure the work I just completed. Let's say I update a topic. I like to let Google index it and allow the changes to live in the SERPs for a period, so I know how they perform. Then I make more changes if necessary. I like to let things settle sometimes in order to measure performance.
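
One simple way to do that measurement is to compare daily figures from before and after the change went live. A rough Python sketch, assuming a daily CSV export (for example, from Search Console's performance report) with Date, Impressions and Position columns; the file name, column names and change date are all assumptions:

import csv
from datetime import date

CHANGE_DATE = date(2023, 10, 1)  # hypothetical date the update went live

def summarize(path):
    before, after = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            day = date.fromisoformat(row["Date"])  # assumes YYYY-MM-DD dates
            bucket = before if day < CHANGE_DATE else after
            bucket.append((float(row["Impressions"]), float(row["Position"])))
    for label, rows in (("before", before), ("after", after)):
        if rows:
            avg_imp = sum(r[0] for r in rows) / len(rows)
            avg_pos = sum(r[1] for r in rows) / len(rows)
            print(f"{label}: avg impressions {avg_imp:.0f}, avg position {avg_pos:.2f}")

summarize("performance.csv")  # hypothetical export file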

But sometimes none of that matters. There are times when I do large updates followed by more large updates just a week later. The reason is that I want to push the work out even if Google is not ready; it all comes out in the wash eventually. So if I am not gauging performance and I feel the updates are more important for the user experience, I push them out.

You can create backlinks anytime. If the page you are linking to exists, create the backlink; do not wait. Speed is your friend in this matter. However, there are times when a page is due to be updated and the version that exists is lousy. In that case, I might wait, purely for the user experience and for no other reason.
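
Checking that the target page exists takes only a moment. A tiny Python sketch with a placeholder URL; some servers reject HEAD requests, hence the GET fallback:

import requests

def page_exists(url):
    """True if the URL ultimately resolves with HTTP 200."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code in (405, 501):  # server does not support HEAD
            resp = requests.get(url, allow_redirects=True, timeout=10, stream=True)
        return resp.status_code == 200
    except requests.RequestException:
        return False

print(page_exists("https://example.com/page-you-will-link-to"))  # hypothetical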

So to sum up how I see making changes: I make changes anytime, as fast as I can, for the user experience, unless I feel I need to measure the effectiveness of my work. In that case, I push updates and changes in a way that lets me measure performance; it becomes a bit more strategic at that point. But once I have enough metrics to feel comfortable that I know how my work is performing, I immediately scramble to make changes again where it applies.

Search engines like changes. There is a velocity to making changes that works in your favor. Do not make scrambled eggs of your work all the time, but do make sure there are steady changes and additions to your site that search engines can see. It is okay to make daily updates or changes to one or a few pages; this works just fine. But larger changes definitely get the search engines' attention. It is always a rolling process. No search engine will spider your site all at once except from time to time, and even then it will likely never really complete the task. So waiting for search engines to finish their work does not make sense. But waiting for search engines to notice larger updates can.

The real upshot of the whole thing is to keep working on your site and improving it as fast as you can. It will pay off in the end. But sometimes patience helps you or the user, in which case you may have to exercise some patience.
