Can I drop multiple URLs from Google's cache using a wildcard in a query string?
I made a mistake with one of my sites, and a whole bunch of data got picked up by Google before I caught it. I need to fix this relatively quickly by asking Google to drop the cached copies of the pages and then re-spider them as they wish. The URLs look like:
domain.com/memberprofile.php?member=XXXXXX
Does anyone know if I can submit:
domain.com/memberprofile.php?member=
as a single request through Webmaster Tools? If not, what are my alternatives?
I read a Google article saying that if you block robots from reading the pages, Google stops crawling them but never learns that they are gone, which prolongs how long the pages linger in your Webmaster Tools report.
I don't think you can delist with a wildcard, and a manual removal only lasts 90 days anyway. You would be better off creating a correct robots.txt and letting Google re-index. It should drop the pages that are no longer meant to be crawled, but I don't know how quickly that will happen.
See: www.google.com/support/webmasters/bin/answer.py?answer=156449&hl=en
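As a sketch, a robots.txt along these lines would block crawling of the profile script (the path is taken from the URL pattern in the question; adjust it to your actual site):

```
# Block all crawlers from the member profile pages.
# Disallow matches by URL-path prefix, so this also covers
# any query string, e.g. /memberprofile.php?member=12345
User-agent: *
Disallow: /memberprofile.php
```

The file must live at the root of the domain (domain.com/robots.txt) to take effect.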
You should also add the "noindex" meta tag to your memberprofile.php pages.
See: www.google.com/support/webmasters/bin/answer.py?answer=93710
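Concretely, that means emitting something like this in the `<head>` of every page that memberprofile.php generates:

```html
<!-- Tells compliant search engines not to index this page -->
<meta name="robots" content="noindex">
```

One caveat: Googlebot has to be able to crawl the page to see this tag, so the tag only works if the URL is not also blocked in robots.txt. Pick one approach or stage them (noindex first, robots.txt block after the pages have dropped out).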
EDIT: Did a little more digging...
You can instruct Google to drop just the cache, not the entire page, from the index.
See: www.google.com/support/webmasters/bin/answer.py?answer=164734
The documentation on submitting URLs mentions nothing about wildcards, and the section on "Multiple URLs" says to submit each one separately. I don't think wildcards are supported, possibly by design.
See: www.google.com/support/webmasters/bin/answer.py?answer=63758