Gibberish 404 Crawl Errors in Google Webmaster Tools

@Kristi941

Posted in: #GoogleSearchConsole

I am seeing a bunch of 404 errors in the Crawl Errors section of Google Webmaster Tools with "gibberish" URLs that look like:

https://example.com/baqmZ9/P12V0aYabJIjnAhH5mYp/4w967mBzt9iuEWqTtRPxtuUZwaKK3WiHB6vhUJ0nGTUthJCFUThfurFOwe4FT3yJ/QjoVjrT5/k6+tQ==/
https://example.com/asKiRlXposNi466jsUy5g0G6nhezKqoUBl5AOIpGs4KxiSNraYmnxTUaQXtadWXRM7AZyECaVVwaNcKKn99RakcYQU3NELl6lv8KuC6Czzg==/


Where example.com is my site's domain name.

There are around 400 of them, and they do not show linking information like most other crawl errors do.

Any ideas as to what may be causing this?


2 Comments


@Heady270

The == at the end makes me think that this is something that has been base64 encoded. The equals signs are used as padding at the end of base64 output to bring its length up to a multiple of four characters.
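
As a quick illustration (a minimal Python sketch, not part of the original answer), the padding is easy to see with the standard library's base64 module: the output length is always a multiple of four characters.

import base64

# Base64 output is always a multiple of four characters long;
# '=' pads the final group when the input isn't a multiple of three bytes.
for raw in (b"a", b"ab", b"abc"):
    print(raw, base64.b64encode(raw))
# b'a'   -> b'YQ=='
# b'ab'  -> b'YWI='
# b'abc' -> b'YWJj'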

Google could be picking it up from any number of sources:


Badly formatted data URI images.
JavaScript code containing base64 data in strings (Google does scan JavaScript, and it will crawl things in JavaScript that look like they could be URLs; see the sketch after this list).
Other sites that are linking to your site incorrectly.
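
Purely as a hypothetical illustration of the JavaScript point above (this pattern is invented for the example; it is not Googlebot's actual logic), a naive URL extractor that treats slash-containing strings as candidate paths would pick up base64 blobs too, since base64 output can contain / and + characters:

import re

# A JavaScript snippet with one base64 string (taken from the question's
# example URL) and one real relative URL.
js_source = 'var payload = "QjoVjrT5/k6+tQ==";  var next = "/blog/page-2";'

# Hypothetical slash-based URL guessing: both strings match, so both
# could end up queued for crawling, and the base64 one will 404.
candidates = re.findall(r'"([^"\s]*/[^"\s]*)"', js_source)
print(candidates)  # ['QjoVjrT5/k6+tQ==', '/blog/page-2']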


Here is what Google's John Mueller (who works on Webmaster Tools and Sitemaps) has to say about 404 errors that appear in Webmaster Tools:


HELP! MY SITE HAS 939 CRAWL ERRORS!!1

I see this kind of question several times a week; you’re not alone - many websites have crawl errors.


1. 404 errors on invalid URLs do not harm your site's indexing or ranking in any way. It doesn't matter if there are 100 or 10 million, they won't harm your site's ranking. googlewebmastercentral.blogspot.ch/2011/05/do-404s-hurt-my-site.html

2. In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How do you tell? Double-check the origin of the crawl error. If there's a broken link on your site, in your page's static HTML, then that's always worth fixing. (thanks +Martino Mosna)

3. What about the funky URLs that are "clearly broken"? When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those "URLs" and find a 404, that's great and expected. We just don't want to miss anything important (insert overly-attached Googlebot meme here). support.google.com/webmasters/bin/answer.py?answer=1154698

4. You don't need to fix crawl errors in Webmaster Tools. The "mark as fixed" feature is only to help you, if you want to keep track of your progress there; it does not change anything in our web-search pipeline, so feel free to ignore it if you don't need it. support.google.com/webmasters/bin/answer.py?answer=2467403

5. We list crawl errors in Webmaster Tools by priority, which is based on several factors. If the first page of crawl errors is clearly irrelevant, you probably won't find important crawl errors on further pages. googlewebmastercentral.blogspot.ch/2012/03/crawl-errors-next-generation.html

6. There's no need to "fix" crawl errors on your website. Finding 404's is normal and expected of a healthy, well-configured website. If you have an equivalent new URL, then redirecting to it is a good practice. Otherwise, you should not create fake content, you should not redirect to your homepage, and you shouldn't robots.txt-disallow those URLs -- all of these things make it harder for us to recognize your site's structure and process it properly. We call these "soft 404" errors. support.google.com/webmasters/bin/answer.py?answer=181708

7. Obviously - if these crawl errors are showing up for URLs that you care about, perhaps URLs in your Sitemap file, then that's something you should take action on immediately. If Googlebot can't crawl your important URLs, then they may get dropped from our search results, and users might not be able to access them either.


@LarsenBagley505

I have an instant fix for you, and I'm going to assume your server is Apache-based.

Create a .htaccess file in the document root of your website (or open the existing one) and add the following to the bottom, after all other rewrite rules:

RewriteEngine On
# Any path beginning with "uselessurl" returns "410 Gone"
# (NC = case-insensitive match, L = stop processing further rules)
RewriteRule ^uselessurl(.*)$ - [R=410,L,NC]


Replace uselessurl with whatever comes after example.com/ in the URLs that you are sure are not supposed to be valid (the ones Google reports as 404s). With the code above left as-is, any URL that starts with example.com/uselessurl will return a 410 error.

When Google sees a 410 error, it will report it once and then won't report it again, even if you mark it as fixed, because 410 means "Gone". A 404 means "Not Found", with the possible implication that the link might become available later.
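
If you want to confirm the rule is live, you can request one of the affected paths and check the status code. A minimal Python sketch (the URL below is a hypothetical stand-in; substitute one of your real gibberish paths):

import urllib.error
import urllib.request

# Send a HEAD request; once the rule matches, urlopen raises HTTPError 410.
req = urllib.request.Request("https://example.com/uselessurl/anything", method="HEAD")
try:
    urllib.request.urlopen(req)
    print("request succeeded: the rule did not match this path")
except urllib.error.HTTPError as err:
    print(err.code)  # expect 410 Gone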
