
What methods are effective/necessary to quickly remove URLs from the Google index?

@YK1175434

Posted in: #Google #GoogleSearchConsole #RobotsTxt #Search #Seo

Possible Duplicate:
How to Remove URLs from Google Search Engine




In short, I have ~1000 URLs that need removing from the Google index ASAP. This has been the case for a week or so, and over that time the following has been done:



The website has always had a robots.txt file (live for several years); I recently updated it to disallow the necessary URLs. However, Google Webmaster Tools (Health / Blocked URLs) still reports the robots.txt as "Downloaded: Never".
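For illustration only (the real paths aren't shown here; these are placeholders), the disallow rules take roughly this form:

    User-agent: *
    Disallow: /old-section/
    Disallow: /retired-page.html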

I have tried resubmitting via 'Fetch as Google', which is supposedly a fix for this problem, and also a couple of removal requests for robots.txt from the cache, which have had no effect (submitted over 48 hours ago).

Due to a poorly written RewriteRule, Google occasionally hit a 302 from robots.txt redirecting to itself, followed by a 200. This has now been corrected; would it have been significant?
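For context, a typical fix of this kind (sketched here with placeholder paths and a placeholder catch-all, assuming an Apache .htaccess setup, not necessarily the actual rule used) is to exempt robots.txt from the rewrite so it is always served directly:

    RewriteEngine On
    # Serve robots.txt as a plain file rather than passing it through the catch-all
    RewriteCond %{REQUEST_URI} !^/robots\.txt$
    RewriteRule ^(.*)$ /index.php [L]

After a change like this, curl -I http://example.com/robots.txt should return a single 200 with no intermediate redirect.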



As per the Google Custom Search on-demand removal documentation, I have also tried submitting a sitemap with backdated expiration dates for the affected URLs.

Will this have any effect, positive or negative?



Lastly, I have gone through the crawl stats, identified the URLs with the highest number of impressions, and manually requested those for removal using the individual URL removal tool in Google Webmaster Tools. Obviously this is quite time-consuming, doesn't cover all the affected URLs, and I believe (from the documentation) the number of submissions is limited.



Are there any obvious techniques I'm missing here or probable solutions to my robots.txt problem?


@Murray432

You need to use your Webmaster Tools account to submit a removal request. Annoyingly, you'll have to do this one URL or directory at a time; the typical response time is 48 hours in my experience.

Using robots.txt will stop Google from indexing new URLs, but it won't remove old ones from the index. There also appear to be significant problems getting Google to recognise new robots.txt files and sitemaps; it's not done quickly or easily unless you're a high-traffic site.

If you own the domain there should not be a problem, but if the links are on another domain you have to justify the removal and use the public URL removal tool.
