Are bad backlinks causing thousands of 404 and 410 errors in webmaster tools?

@Sue5673885

Posted in: #410Gone #Backlinks #GoogleSearchConsole

Our webmaster tools account is showing 250,000 errors related to weird links from other sites.

These URLs come mostly from nonexistent sites, or are being generated directly by our website.

Here are some examples of these URLs:

oursite.com/&q=videos+caseros+sexo+pornos+gratis&sa=X&ei=R638T8eTO8WphAfF2vG8Bg&ved=0CCAQFjAC%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F5%2Fpage%2F4/page/3


Our site is a popular Spanish adult site, yet we don't use the keywords mentioned in this URL. Apparently this link comes from our own site.

Some more examples:

oursite.com/&q=losmejoresvideosporno&sa=X&ei=U__8T-BnqK7RBdjmhYsH&ved=0CBUQFjAA%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3/page/4


Once again: not our queries, not our URLs.

oursite/tag/tetonas


We think it might be another site following an extremely bad SEO policy based on other sites' branding and keywords:

thirdsite/buscador/tetonas-oursite


The question is: if other sites are generating these URLs, how can we prevent this?

Why is the tag page being generated if we never added a link to the other site?

What should we do with these errors? 301? 410 gone?

I have read all the similar Q&As here, but none of them seems to solve our problem. It is not likely to be a bad ad (we inspected them all). Maybe some old content which Google suddenly decided to recrawl? Maybe a third party's bad SEO policy? Maybe all of the above?


5 Comments


 

@Lee4591628

Since these all appear to be parameter-based, you could try using the URL Parameters feature in Webmaster Tools to tell Google to ignore them.

Most of the foreign parameters (like the ved and ei in your examples) will likely not apply to your site, so you can add those and hopefully cut down on the number of 404s in your crawl errors.



 

@Radia820

404 errors caused by backlinks from external sites don't hurt your website. Google has said this over and over again; they simply ignore them. If the linked URL never existed, you shouldn't worry about it.



 

@Heady270

Those errors are likely not causing any problems for your site. Google's John Mueller says:


404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. It doesn’t matter if there are 100 or 10 million, they won’t harm your site’s ranking.

googlewebmastercentral.blogspot.ch/2011/05/do-404s-hurt-my-site.html


With that in mind, 404 is a very appropriate response for those URLs. Whatever the link is pointing to, your server can't figure it out.

410 would not be an appropriate response. That would mean that you had the resource and it is now gone.

A 301 redirect could be appropriate. I generally redirect away from unknown query parameters. You might consider redirecting oursite.com/&.* to the home page. Of course, then Google would just treat it as a "soft 404" and it would still show up in the webmaster tools error report.
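
If you go that route, here is a minimal .htaccess sketch (assuming Apache with mod_rewrite; the rule is illustrative, not taken from the original post):

RewriteEngine On
# Any path beginning with "&" matches the bogus pattern above; 301 it to the home page
RewriteRule ^&.* / [R=301,L]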

Another possibility would be to redirect those queries to your site search. Redirecting oursite.com/&q=videos+caseros+sexo+pornos+gratis&sa=X&ei=R... to oursite.com/search?q=videos+caseros+sexo+pornos+gratis would actually show results for any content that you do have. So any real users that happened upon the link would be happy. Because site search has to be blocked in your robots.txt file, Googlebot would not crawl any of the URLs after the redirect and would therefore not complain about 404s anymore.
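
A sketch of that search redirect, again assuming Apache mod_rewrite in .htaccess and that your site search lives at /search?q= (adjust to your real search URL):

RewriteEngine On
# Capture the text between "&q=" and the next "&" and hand it to site search;
# NE stops Apache from re-escaping the "+" separators in the redirect target
RewriteRule ^&q=([^&]+) /search?q=$1 [R=301,L,NE]

Pair it with the robots.txt rule that keeps Googlebot from crawling the redirect target:

Disallow: /search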

Another option might be to just block the URLs in robots.txt outright. You could use a wildcard match that is understood by Googlebot:

Disallow: /*&q=


Webmaster tools might still complain about all the ones that it found before you put that rule into place, but Googlebot would never crawl new ones.



 

@Ann8826881

What you don't want to do is submit a 250k+ list to Google's disavow tool; they will never look at it. From what I understand, the disavow tool has been overused anyway and doesn't bear much fruit.

I would recommend trying to find a pattern in the bad URLs. Check the "linked from" tab to see which sites are sending this bad traffic. We once had a thousand errors coming from just 4 domains. In that situation you can use disavow to tell Google "don't pay attention to any links coming from this site to ours."

If they stem from your own site, then Google is indexing search results or pagination pages. In that case you should probably use the canonical tag (a minimal sketch follows), define your URL parameters in GWMT, and check for internal links to bogus pages.
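
For the canonical part, a minimal sketch using the tag path from the question (your canonical targets will differ):

<!-- On paginated or parameter-polluted views of a tag page, point crawlers at the clean URL -->
<link rel="canonical" href="https://oursite.com/tag/tetonas/">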

I'm afraid there is no "quick fix" here, but rather a myriad of things that you need to do in order to slowly untangle this web. I would be willing to bet your situation consists of a little of both (bad back links and architectural issues.)



 

@Sims2060225

Google Webmaster Tools warns against using 301s to deal with nonexistent URLs. They recommend letting these return 404 and drop out of Google's index naturally.

As frustrating as it can be (I know, because I've had the same issue), if you 301 a bad incoming backlink to a page you DO have, it can signal to Google that these links are or were real. Only use a 301 if a legitimate page existed and the URL changed, or a 410 if you had a real page that you decided to drop without moving the content.
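
To make that concrete, a hypothetical Apache mod_alias sketch (the paths are invented placeholders, not URLs from this thread):

# A legitimate page that moved to a new URL: 301
Redirect 301 /old-videos/ https://oursite.com/videos/
# A real page you deliberately removed with no replacement: 410
Redirect gone /discontinued-section/

Anything that never existed simply stays a 404.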


