: Robots.txt blocking a specific file also blocks unrelated URLs
I am using Magento for one of my sites. Magento ships a file named mage (no extension; the file name is just mage). To block this file I wrote my robots.txt as:
# Files
User-agent: *
Disallow: /mage
But this also blocks URLs that start with mage, such as magenta-color-item.html.
How do I write the rule so it blocks only the mage file, and not URLs that start with mage?
You can add a dollar sign ($) to the end of the string; for crawlers that support it (including Googlebot and Bingbot), this anchors the rule so it matches only URLs that end exactly there:
# Files
User-agent: *
Disallow: /mage$
This will only block the mage file if it comes directly after the root domain:
example.com/mage
If the file sits under other directories, you must add those to the entry. So, to block the file located at:
example.com/somedirectory/mage
You would need to use:
# Files
User-agent: *
Disallow: /somedirectory/mage$
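To see why the $ anchor makes the difference, here is a minimal Python sketch of how a Disallow pattern with Google-style * wildcards and a trailing $ anchor is matched against a URL path. This is an illustrative simplification, not a full robots.txt parser (it ignores Allow rules, user-agent groups, and longest-match precedence):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt Disallow pattern matches a URL path.

    Supports the Google-style '*' wildcard and '$' end-of-URL anchor.
    Patterns always match from the start of the path.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn '*' back into '.*'
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# '/mage' (no anchor) matches anything starting with /mage:
print(rule_matches("/mage", "/mage"))                     # True
print(rule_matches("/mage", "/magenta-color-item.html"))  # True

# '/mage$' matches only the exact path /mage:
print(rule_matches("/mage$", "/mage"))                     # True
print(rule_matches("/mage$", "/magenta-color-item.html"))  # False
```

Note that not every crawler honors $; it is defined in RFC 9309 and supported by the major search engines, but simpler bots may treat the rule as a plain prefix.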