
How to disallow bulk URLs from robots.txt?

@Vandalay111

Posted in: #RobotsTxt #Seo

I built a dynamic sitemap generator on my platform. By mistake, I generated some 300+ wrong URLs along with the right ones, and they have been there for a couple of weeks now. I only recently noticed the mistake, and some of those wrong URLs are already indexed by Google. I spoke with an SEO specialist and he told me to follow two steps:


Add all of those wrong URLs to the Remove URLs tool in Webmaster Tools.
Next, disallow all of those URLs in robots.txt.


I did the first step.
I don't know how to do the second step. I have some 300+ URLs like the ones below:

example.com/equip?category_id=semi-automatic
https://example.com/equip?category_id=automatic
example.com/equip?category_id=other


Currently I can't implement a 404 on these URLs from the coding side; it's quite complicated now. So could anybody please tell me how I can disallow these from robots.txt?





1 Comment


 

@Karen161

First define which search engines should skip your pages, using the User-agent directive (use * for all engines). Then list the relative URLs you want to block, each one preceded by Disallow:

User-agent: *
Disallow: /equip?category_id=semi-automatic
Disallow: /equip?category_id=automatic
Disallow: /equip?category_id=other
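
If all of the unwanted URLs share the /equip?category_id= prefix and none of your valid pages use it, a single prefix rule may be enough, since Disallow values are matched as URL prefixes (this is a sketch that assumes your valid pages live under different paths):

User-agent: *
Disallow: /equip?category_id=

For the full list of 300+ individual URLs, a minimal Python sketch like the one below could generate the Disallow lines from a plain-text export of the bad URLs (the file name urls.txt is hypothetical):

# Minimal sketch: turn full URLs (one per line in the hypothetical urls.txt)
# into robots.txt Disallow rules built from the path and query string only.
from urllib.parse import urlsplit

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

print("User-agent: *")
for url in urls:
    # urlsplit needs a scheme to separate host from path, so add one if missing
    parts = urlsplit(url if "://" in url else "https://" + url)
    rule = parts.path
    if parts.query:
        rule += "?" + parts.query
    print("Disallow: " + rule)

Keep in mind that robots.txt only stops crawling; it does not by itself remove pages that are already indexed, which is why the Remove URLs request is the other half of the advice.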


