Robots.txt rule for a specific file also blocks unwanted URLs

@Karen161

Posted in: #Magento #RobotsTxt #Url #WebCrawlers

I am using Magento for one of my sites. Magento has a file named mage (no extension; the file name is just mage). To block this file I wrote robots.txt as:

# Files
User-agent: *
Disallow: /mage


But this also blocks URLs that start with mage, like magenta-color-item.html.

How do I write the robots.txt rule to block only mage, and not URLs that start with mage?


1 Comments


@BetL925

You can add a dollar sign to the end of the string. The $ marks the end of the URL, so the rule matches only that exact path:

# Files
User-agent: *
Disallow: /mage$


This will only block the mage file if it comes straight after the root domain:


example.com/mage

If there are any preceding directories, you must add them to the entry. So to block the file located below:
example.com/somedirectory/mage

You would need to use:

# Files
User-agent: *
Disallow: /somedirectory/mage$
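Note that $ is not part of the original robots.txt standard; it is an extension honored by major crawlers such as Googlebot and Bingbot. As a simplified sketch (ignoring * wildcards and rule-precedence details), the matching behavior works like this; `rule_matches` is a hypothetical helper, not a real library function:

```python
def rule_matches(rule: str, path: str) -> bool:
    """Return True if a Disallow rule blocks the given URL path.

    Simplified model of crawlers that support the "$" end-anchor:
    a trailing "$" requires an exact match; otherwise the rule is
    an ordinary prefix match.
    """
    if rule.endswith("$"):
        return path == rule[:-1]   # "$" anchors the rule to the end of the URL
    return path.startswith(rule)   # default robots.txt behavior: prefix match

# Without the anchor, any path starting with /mage is blocked:
assert rule_matches("/mage", "/mage")
assert rule_matches("/mage", "/magenta-color-item.html")

# With "$", only the exact path /mage is blocked:
assert rule_matches("/mage$", "/mage")
assert not rule_matches("/mage$", "/magenta-color-item.html")
assert rule_matches("/somedirectory/mage$", "/somedirectory/mage")
```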


