
Forcing Google to rescan sitemaps instantly

@Nickens628

Posted in: #410Gone #Error #Google #Googlebot #Sitemap

When URLs no longer exist on my site, I serve an error page and issue an HTTP 410 status (Gone).

I created a script that lets my co-administrator add and remove pictures from the site. When a picture is removed, its page instead serves an error page with an HTTP 410 status code, and within approximately five seconds the sitemaps are updated to reflect the change.
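For context, the removal flow is roughly like the simplified sketch below. This is an illustration rather than my exact script; the Flask framework, the example.com domain, and names like live_photos and rebuild_sitemap are placeholders:

from flask import Flask, abort

app = Flask(__name__)

# Placeholder store of the picture IDs that still exist on the site.
live_photos = {"p1": "First photo", "p2": "Second photo"}

@app.route("/photo/<photo_id>")
def photo_page(photo_id):
    if photo_id not in live_photos:
        # The picture was deleted: answer 410 Gone (not 404) so
        # crawlers learn the page is permanently removed.
        abort(410)
    return f"<h1>{live_photos[photo_id]}</h1>"

def remove_photo(photo_id):
    """Called when the co-administrator deletes a picture."""
    live_photos.pop(photo_id, None)
    rebuild_sitemap()

def rebuild_sitemap():
    # Regenerate the sitemap so it lists only the surviving pages.
    urls = "".join(
        f"<url><loc>https://example.com/photo/{pid}</loc></url>"
        for pid in live_photos
    )
    with open("sitemap.xml", "w") as f:
        f.write(
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            + urls + "</urlset>"
        )

Removing the entry and rewriting the sitemap in the same step is what keeps the two in sync within a few seconds.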

I also have the robots noarchive meta tag set on every page, like so:

<meta name="GOOGLEBOT" content="NOARCHIVE">
<meta name="ROBOTS" content="NOARCHIVE">


Every day when I visit Webmaster Tools, I get a varying number of error URLs, all pointing to the photo pages.

Google correctly reports that these URLs return a 410 status code, yet they still appear in Webmaster Tools. This is despite the fact that the co-administrator removed the photo pages through the script, which also removed every on-site link to the affected URLs and every reference to them in the sitemaps.

I have also noticed that the more errors Google sees on my site (regardless of whether the status codes are 404 or 410), the less I earn in AdSense.

I have also configured Googlebot to make a maximum of 10 requests per second (the crawl-rate slider all the way to the right).

What can I do to lower the odds of Google seeing the newly generated error URLs that result from removing bad pictures, so that I stop seeing 410 status codes in Webmaster Tools?


1 Comment


@Margaret670

You can't force Google to crawl/rescan your sitemap instantly.

Once Google has indexed any page from your website, it keeps crawling it again and again from its own indexed database. So even if you remove those pages from your sitemap, and even remove every link to them from your website, Googlebot will still be able to crawl those URLs.

Googlebot normally recrawls 404 pages often, because it assumes the webmaster will notice the error in the Webmaster Tools dashboard and may fix it one day. A 410 error (permanently gone) is the better response in a specific case like yours.

I think the unavailable_after meta tag is the right solution for your website.
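For example, your script could stamp the tag onto a photo page ahead of its planned removal, roughly like this sketch (the helper name and the seven-day window are my own assumptions; the date format follows the RFC 850 style shown in Google's documentation, e.g. "25-Aug-2007 15:00:00 EST"):

from datetime import datetime, timedelta, timezone

def unavailable_after_tag(days_from_now):
    """Build a Googlebot unavailable_after meta tag (hypothetical helper)."""
    expires = datetime.now(timezone.utc) + timedelta(days=days_from_now)
    stamp = expires.strftime("%d-%b-%Y %H:%M:%S GMT")
    return f'<meta name="GOOGLEBOT" content="unavailable_after: {stamp}">'

# Emit a tag telling Google the page goes away in a week, e.g.:
# <meta name="GOOGLEBOT" content="unavailable_after: 01-Sep-2007 15:00:00 GMT">
print(unavailable_after_tag(7))

Note the tag has to be in place while the page still returns 200, since Googlebot can only read it from a live page.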
