
Which content or HTTP status should an unused domain return?

@Bethany197

Posted in: #Domains #Html #Http #Seo

We've recently bought a couple of domains, but their content is still under development.

Which content should we upload to the domains to tell search engines "there is currently nothing here, but check back in the future," without incurring a crawling/ranking penalty on the future content?


3 Comments


 

@Karen161

Use robots.txt to disallow crawling, as RandomBen suggests. Search engine crawlers may not come back very often if they find that all your content is disallowed.

However, once your site is up and ready, update your robots.txt and resubmit it to Google (from Webmaster Tools), Bing, etc.

There should be no penalty, and once you resubmit your new robots.txt the search engines will begin to crawl and index your site properly.



 

@Ann8826881

I see a few possible approaches (in descending order of reliability):


Restrict access by IP: allow only certain IPs or subnets (in case your IP is dynamic) to access the site and block all other requests. This is the most reliable approach.
Display an empty page or some "Coming Soon" text (or just a nice big image and no text at all), and use a subdomain to access your current work-in-progress site (for example: beta.example.com). Unfortunately this is not as reliable as blocking by IP, since a search engine may still detect the subdomain or discover a link to it (Google Desktop search, links in emails sent to a @gmail.com address, someone publishing a link somewhere that then gets indexed, etc.).
Use robots.txt: this will work fine with bots that respect it, but offers no protection otherwise. If your only concern is Google/Bing etc., it should work fine.
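The first approach boils down to a simple membership test: is the client's address inside an allowed network? A minimal sketch in Python (the addresses in the allowlist are hypothetical placeholders from the documentation ranges, not anything from this thread):

```python
import ipaddress

# Hypothetical allowlist: a single office IP plus a subnet
# to cover a dynamic address range.
ALLOWED = [
    ipaddress.ip_network("203.0.113.7/32"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_allowed(client_ip: str) -> bool:
    """Return True if the client IP falls inside any allowed network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED)
```

In practice you would do this check in the web server itself (e.g. an allow/deny rule in your server config) rather than in application code, but the logic is the same.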



 

@Mendez628

I would add a robots.txt file, with the code below, to all of these sites if you are worried about Google attempting to crawl them:

User-agent: *
Disallow: /


Remember, Google needs to know your site exists before it will even scan it. So if you haven't created any links to the site, or a Google Webmaster/Google Analytics account for it yet, Google probably won't know about it. In addition, Google should not penalize you for having had an empty site once you do have content. That is a normal state for many domains before they are set up.
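As a quick sanity check (not part of the answer above), Python's standard-library robots.txt parser confirms that this two-line file blocks every compliant crawler from every URL:

```python
from urllib.robotparser import RobotFileParser

# Feed the two-line robots.txt directly to the stdlib parser.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Every user agent is denied every path under a blanket Disallow.
print(rp.can_fetch("Googlebot", "https://example.com/"))          # False
print(rp.can_fetch("Bingbot", "https://example.com/page.html"))   # False
```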


