How to secure a robots.txt file?

@Phylliss660

Posted in: #Google #RobotsTxt #Seo #Sitemap #UserAgent

I would like user agents to index my relevant pages only, without accessing any directory on my server.

As an initial thought, I had this version in mind:

User-agent: *
Disallow: */*

Sitemap: www.mydomain.com/sitemap.xml

My Questions:

1. Is it correct to block all directories like that, with Disallow: */*?
2. Would search engines still be able to see and index my sitemap if I disallowed all directories?
3. What are the best practices for securing the robots.txt file?


For Reference:

Here is a good tutorial example for a robots.txt file:
# Add this if you want to stop Alexa (ia_archiver) from indexing your site.
User-agent: ia_archiver
Disallow: /

# Add this to stop duggmirror.
User-agent: duggmirror
Disallow: /

# Add this to allow a specific agent (an empty Disallow allows everything).
User-agent: Googlebot
Disallow:

# Add this to allow all agents while blocking specific directories.
User-agent: *
Disallow: /cgi-bin/
Disallow: /*?*

1 Comment

@Sarah324

1. That's going to block your entire website from being crawled: the wildcard pattern */* matches every URL on your site.
2. No. A crawler can still read the Sitemap line in robots.txt itself, but it is not allowed to fetch any of the pages the sitemap lists, so they will not be crawled or indexed normally.
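If the goal is to let crawlers index every page, a minimal robots.txt sketch would look like the following; an empty Disallow allows everything, and the Sitemap directive should be an absolute URL (www.mydomain.com is the placeholder host from the question):

User-agent: *
Disallow:

Sitemap: http://www.mydomain.com/sitemap.xml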
3. There is no such thing as securing your robots.txt: it is a public file by design, and every rule you put in it is visible to anyone, including badly behaved bots that simply ignore it. If you want to keep visitors out of a directory, you need to prevent that using more secure means. Putting a blank index.html file in the directory will easily do the trick, since the server then serves that file instead of a directory listing. If you're running Apache, you can also do it easily using .htaccess.
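For the Apache route, a minimal .htaccess sketch that disables directory listings (assuming your host permits the Options override in .htaccess):

# Stop Apache from generating automatic directory listings;
# a directory with no index file then returns 403 Forbidden.
Options -Indexes

Unlike robots.txt, this is enforced by the server itself, so it protects against every visitor, not just well-behaved crawlers.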


