How to deal with out-of-date or expired pages
I have a "classified advertisement" website where users can submit advertising posts. Each post is valid for 3 weeks; after that, the post becomes "expired". Expired posts are not physically removed, we just remove the link between the website and the post. So users cannot find expired posts through the website, but they can still reach them by typing the post's URL directly.
After a post expires, users can create a new post with the same content as the expired one in order to extend the advertising period. I have to allow this.
My problem is that Google indexes both the expired posts and the new posts, and reports duplicate content on the website.
So what's the best practice to remove the expired content from search engines?
Send a robots noindex HTTP header, or the equivalent meta tag, on the expired pages. This tells Google to remove the page from its search results:
X-Robots-Tag: noindex
or
<meta name="robots" content="noindex">
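For example, on a site served by Flask you could add the header only once a post has passed its 3-week lifetime. This is a minimal sketch, not your actual stack: Post and get_post are hypothetical stand-ins for your own model and datastore lookup.

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from flask import Flask, abort, make_response

app = Flask(__name__)
AD_LIFETIME = timedelta(weeks=3)  # each post is valid for 3 weeks

@dataclass
class Post:  # hypothetical model; replace with your own
    body: str
    created_at: datetime

def get_post(post_id: int):
    """Hypothetical lookup; replace with your datastore query."""
    return None

@app.route("/posts/<int:post_id>")
def show_post(post_id: int):
    post = get_post(post_id)
    if post is None:
        abort(404)
    response = make_response(post.body)
    # Past the 3-week lifetime: keep serving the page to direct visitors,
    # but tell crawlers to drop it from the index.
    if datetime.now(timezone.utc) - post.created_at > AD_LIFETIME:
        response.headers["X-Robots-Tag"] = "noindex"
    return response

Note that the expired page itself stays reachable by direct URL, which matches your setup; only its presence in the index changes.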
You can also let Google know in advance when a page will expire, using the unavailable_after directive:
X-Robots-Tag: unavailable_after: 7 Jul 2007 16:30:00 GMT
or
<META NAME="GOOGLEBOT" CONTENT="unavailable_after: 25-Aug-2007 15:00:00 EST">
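Since each post's expiry is known at creation time (exactly three weeks later), you could emit unavailable_after up front instead of switching to noindex afterwards. A sketch under the same assumptions, formatting the date like the example above:

from datetime import datetime, timedelta, timezone

AD_LIFETIME = timedelta(weeks=3)

def unavailable_after_header(created_at: datetime) -> str:
    """Build the X-Robots-Tag value announcing the end of the 3-week run.

    Assumes created_at is a timezone-aware UTC datetime, so the
    hard-coded GMT label is accurate.
    """
    expires = created_at + AD_LIFETIME
    return "unavailable_after: " + expires.strftime("%d %b %Y %H:%M:%S GMT")

# A post created now stops being indexable three weeks from today:
print("X-Robots-Tag:", unavailable_after_header(datetime.now(timezone.utc)))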