Accidentally made 100 pages public - can I 404 them and add them back later?

@Shelley277

Posted in: #DuplicateContent #Seo

We're running a website and wanted to offer some vouchers to our signed-up members. Initially it was meant to be only a blank page with unvalidated voucher codes from affiliate networks, but - as stated - only for members.

Accidentally, those pages became visible to everyone and Google indexed them. The first day they ranked well, as I can see in Search Console, but after that it seems they got a penalty. They can now only be found with very specific keywords.

Well, it was planned that they would be made public, but only after adding some content and manually validating all the vouchers and texts.

What I need to know now is whether I can just remove them and start re-adding those particular pages once they have some content, or is it bad to 404 them if I plan to add them back later? If it's bad, how about adding a noindex meta tag and removing it some day?

I am afraid that all this will harm our other pages, too. I also want to get rid of the penalty.




1 Comment


@Angela700

Accidentally, those pages became visible to everyone and Google indexed them. The first day they ranked well, as I can see in Search Console, but after that it seems they got a penalty....


I have a gut feeling Google catches on when unfinished pages are left publicly available for a while. Personally, I test, test, and test again across various web browsers, devices, and even webpagetest.org immediately after making any change to my website. That way, if a major mistake slips through, the odds of Google noticing it drop substantially, and you also minimize downtime and frustration for your visitors.


What I need to know now is whether I can just remove them and start re-adding those particular pages once they have some content, or is it bad to 404 them if I plan to add them back later? If it's bad, how about adding a noindex meta tag and removing it some day?


Because they're supposed to be private, you should set them up at new URLs. For example, if your live private page is at example.com/private/one.php, you could simply rename the file on the server so the URL becomes example.com/private/onethesecond.php. When renaming the files, you also have a few options to keep the new pages out of search engines:


Create a robots.txt file in the document root folder of your site and list each new URL in a 'Disallow:' line. This method is easy; however, robots.txt only blocks crawling, not indexing, and it publicly reveals the paths of your private files to anyone who reads it (including hackers). Since you indicated you're nervous, you can use this as a temporary band-aid solution.
Follow the comments above and add the noindex meta tag to each page that should not be indexed. This is the recommended approach.
Change your script and/or server configuration so that requests from the IP addresses of search-engine crawlers you want to keep out are blocked. This may be difficult if you aren't very tech-savvy, and crawler IP ranges change over time.
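To make the first two options concrete, here is a sketch of what each looks like; the file path is a placeholder for whatever your renamed private page is called:

```
# robots.txt (saved in the document root) - blocks crawling, not indexing
User-agent: *
Disallow: /private/onethesecond.php

<!-- noindex meta tag - placed inside <head> of each private page -->
<meta name="robots" content="noindex">
```

Note that the two mechanisms do different jobs: robots.txt stops crawlers from fetching the page at all, while the meta tag lets them fetch it but tells them not to list it in results. If you use robots.txt alone, a URL can still appear in results (without a snippet) if other sites link to it.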


Search engines like Google will continue to crawl the old URLs. For this reason, you should point the old URLs to error pages with the 410 status code, meaning GONE. The 404 status code, NOT FOUND, only says the document is missing right now and makes no promise about whether it will return, so crawlers keep retrying it; 410 tells them the removal is permanent, and they drop the URL faster.

If you use an Apache server with mod_rewrite enabled, then serving a 410 status for the old pages is relatively easy. Just make an .htaccess file (or use the main Apache configuration file, httpd.conf) and enter the following:

RewriteEngine On
RewriteRule ^theurihere$ - [R=410,L]
RewriteRule ^theurihere$ - [R=410,L]
...
RewriteRule ^theurihere$ - [R=410,L]


and replace each theurihere with a URI that Google is trying to index but shouldn't. By URI, I mean the part after example.com. For example, if the URL to de-index is example.com/one/two, then replace theurihere with one/two. (The `-` means "no substitution"; Apache requires it when a RewriteRule only sets a status code.)
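As a worked example, assuming two hypothetical old private pages at example.com/private/one.php and example.com/private/two.php, the .htaccess in the document root would look like this (dots in the pattern are escaped because RewriteRule patterns are regular expressions):

```
RewriteEngine On
RewriteRule ^private/one\.php$ - [R=410,L]
RewriteRule ^private/two\.php$ - [R=410,L]
```

In .htaccess context, the pattern is matched against the URI without its leading slash, which is why the patterns above start with `private/` rather than `/private/`.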

Just make sure you test everything when you are done. To verify Google is handling it correctly, register your domain with Google Search Console (formerly Webmaster Tools) and check the crawl errors report. If all goes well, you should see the URLs that were meant to be private listed there with status code 410.
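You can also spot-check the status codes yourself before waiting on Search Console. The sketch below is illustrative: the `classify` helper and the URL list are my own naming, and the example URL is a placeholder for your real pages.

```python
# Sketch: spot-check the HTTP status codes of the old private URLs.
from urllib import request
from urllib.error import HTTPError

def classify(status: int) -> str:
    """Map an HTTP status code to what it signals to search engines."""
    if status == 410:
        return "gone"     # permanent removal - crawlers drop the URL quickly
    if status == 404:
        return "missing"  # absent for now - crawlers keep retrying
    if 200 <= status < 300:
        return "ok"       # still live and indexable
    return "other"

def check(url: str) -> str:
    """Fetch a URL and classify its status (HTTPError carries the code too)."""
    try:
        with request.urlopen(url) as resp:
            return classify(resp.status)
    except HTTPError as err:
        return classify(err.code)

if __name__ == "__main__":
    # Replace with the old URLs you pointed at the 410 rules.
    for url in ["https://example.com/private/one.php"]:
        print(url, "->", check(url))
```

If your .htaccess rules are working, each old URL should classify as "gone" rather than "missing" or "ok".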


