Removing Google Webmaster URL Parameters

@Margaret670

Posted in: #Googlebot #GoogleIndex #GoogleSearchConsole

Some time ago I created a post about removing a full site from the Google index. After following much of the advice there, I removed a large number of pages from the Google index, but now I am facing another problem: removing the Google Webmaster Tools URL parameters.

My website's total indexed page count has come down from about 5,000,000 to about 97,000, but the URL Parameters tool still shows the following:

Parameter     URLs        Effect    Crawl
passed        1,274,056   Narrows   No URLs
searchterm    1,269,622   Narrows   No URLs
court         1,265,840   Narrows   No URLs
sel           1,265,502   Narrows   No URLs
page          1,187,018   Narrows   No URLs


But the URL parameter counts are not decreasing. If I re-include the website in the Google index, the indexed page count goes back up quickly, as the parameters are still in Google's system.

Another thing I want to know: if only 97,000 pages are indexed, how can the parameter counts be around 1,200,000 each? I don't understand how.


1 Comment


@Cofer257

To cover your last point first, if you have 97k URLs indexed but 1.2m URLs found with a parameter, then Googlebot is ignoring many URLs (over 1.1m in fact).

As for the main question, Webmaster Tools states that when you set the option "No URLs" it may remove those URLs from the index, so there is no guarantee they will be removed.

You should go through those parameters again and make sure you definitely have the "Effect" setting right. For example, you have marked page as narrowing content, but it sounds like it actually paginates content. Even if a parameter does narrow (i.e. filter) content, Googlebot may have decided it's useful to show those pages in search results.

Finally, if you truly want to remove ALL pages with those parameters from Google then robots.txt is a much better method. Something like the following should work:

User-agent: *
Disallow: /*?passed=
Disallow: /*&passed=


…and so on for each parameter. Remember to be careful and double check what you are blocking, otherwise you could remove many pages from search.
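
If it helps, here is a quick way to sanity-check rules like these before you deploy them. This is a minimal sketch, not Google's actual matcher: the five parameter names come from your URL Parameters table, and the test URLs are made up for illustration. Note that Python's built-in urllib.robotparser does not understand the * wildcard, which is why the matching below is emulated with regular expressions:

import re
from urllib.parse import urlsplit

# Parameter names taken from the URL Parameters table above.
PARAMS = ["passed", "searchterm", "court", "sel", "page"]

# One '?name=' and one '&name=' rule per parameter, mirroring the
# robots.txt rules suggested above.
RULES = [f"/*{sep}{name}=" for name in PARAMS for sep in ("?", "&")]

def rule_to_regex(rule):
    # '*' matches any run of characters; a trailing '$' anchors the
    # pattern to the end of the URL.
    anchored = rule.endswith("$")
    body = rule[:-1] if anchored else rule
    pattern = ".*".join(re.escape(part) for part in body.split("*"))
    return re.compile("^" + pattern + ("$" if anchored else ""))

COMPILED = [rule_to_regex(r) for r in RULES]

def is_blocked(url):
    # Rules are matched against the path plus query string.
    parts = urlsplit(url)
    target = parts.path + ("?" + parts.query if parts.query else "")
    return any(rx.match(target) for rx in COMPILED)

# Hypothetical example URLs - replace with real URLs from your site.
for url in [
    "https://example.com/cases?court=high&page=2",   # expect BLOCKED
    "https://example.com/cases?id=123",              # expect ALLOWED
    "https://example.com/search?searchterm=law",     # expect BLOCKED
]:
    print("BLOCKED" if is_blocked(url) else "ALLOWED", url)

If a URL you want kept in search prints BLOCKED, tighten the rule before uploading the file, since an over-broad pattern could drop far more of your remaining 97,000 pages than you intend.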
