Google stores user input from site URL

@Shanna517

Posted in: #GoogleSearchConsole #Url

I use URLs both with parameters and without.
For example: www.wwwsitus.com/code/googlemap.php http://www.wwwsitus.com/code/?q=reg&foo=verif

I then type other, made-up parameters/URLs at that path, as follows: www.wwwsitus.com/code/googlemapper.php http://www.wwwsitus.com/code/?what=yeryter&next=unxrnyie

Then a 404 page appears.
However, I see that Google Webmaster Tools stores the URL (googlemapper.php) and reports it as an error to fix. What should I do about this? If there are 1000 mistyped URLs like this, fixing them all will drive me crazy.


@BetL925

The 404 errors are not from user input. Rather, they are from links that Google has found while crawling the web. If you click on the 404 errors in the report, you can see where Googlebot found each URL. You may find that many of them are bad links to your site. Some of them are probably Googlebot being dumb.

You have several options:

Do Nothing

It is fine to leave these errors. The list of errors in Google Webmaster Tools is there for your use as a webmaster. Google does not use it to make judgments about your site. Fixing errors in that report does not improve your rankings, and leaving them will never get your site penalized.

Here is what Google's John Mueller (who works on Webmaster Tools and Sitemaps) has to say about 404 errors that appear in Webmaster Tools:


HELP! MY SITE HAS 939 CRAWL ERRORS!!1

I see this kind of question several times a week; you’re not alone - many websites have crawl errors.


1. 404 errors on invalid URLs do not harm your site's indexing or ranking in any way. It doesn't matter if there are 100 or 10 million, they won't harm your site's ranking. googlewebmastercentral.blogspot.ch/2011/05/do-404s-hurt-my-site.html

2. In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How do you tell? Double-check the origin of the crawl error. If there's a broken link on your site, in your page's static HTML, then that's always worth fixing. (thanks +Martino Mosna)

3. What about the funky URLs that are "clearly broken?" When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those "URLs" and find a 404, that's great and expected. We just don't want to miss anything important (insert overly-attached Googlebot meme here). support.google.com/webmasters/bin/answer.py?answer=1154698

4. You don't need to fix crawl errors in Webmaster Tools. The "mark as fixed" feature is only to help you, if you want to keep track of your progress there; it does not change anything in our web-search pipeline, so feel free to ignore it if you don't need it. support.google.com/webmasters/bin/answer.py?answer=2467403

5. We list crawl errors in Webmaster Tools by priority, which is based on several factors. If the first page of crawl errors is clearly irrelevant, you probably won't find important crawl errors on further pages. googlewebmastercentral.blogspot.ch/2012/03/crawl-errors-next-generation.html

6. There's no need to "fix" crawl errors on your website. Finding 404's is normal and expected of a healthy, well-configured website. If you have an equivalent new URL, then redirecting to it is a good practice. Otherwise, you should not create fake content, you should not redirect to your homepage, you shouldn't robots.txt disallow those URLs -- all of these things make it harder for us to recognize your site's structure and process it properly. We call these "soft 404" errors. support.google.com/webmasters/bin/answer.py?answer=181708

7. Obviously - if these crawl errors are showing up for URLs that you care about, perhaps URLs in your Sitemap file, then that's something you should take action on immediately. If Googlebot can't crawl your important URLs, then they may get dropped from our search results, and users might not be able to access them either.



Redirect the URLs

If you redirect those URLs to something other than your home page, they will no longer appear in that report. Redirecting them to the home page would cause Google to label them as "soft 404" errors, and Google would still include them in the report.

This is an especially appropriate response for URLs that are broken versions of real URLs you can identify. For example, if googlemap.php is the correct URL, you should redirect googlemapper.php to it, as in the sketch below.
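Since your site appears to run PHP, one way to implement the redirect is a small stub placed at the old URL. This is only a sketch under that assumption; the domain and file names are taken from your question.

<?php
// Hypothetical stub saved at /code/googlemapper.php (the mistyped URL):
// send a permanent 301 redirect to the real page, so the URL drops out
// of the crawl-error report over time.
header('Location: http://www.wwwsitus.com/code/googlemap.php', true, 301);
exit;

On Apache, a Redirect 301 line in .htaccess achieves the same thing without a PHP stub.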

Restrict Googlebot with robots.txt

Many of these pages are probably not appropriate for inclusion in a search engine anyway. If you keep Google from crawling them, Googlebot will stop finding the 404 pages. You could use something like the following in robots.txt to keep Googlebot out (a Disallow rule must sit inside a User-agent group, and this pattern blocks every URL that starts with /code/?):

User-agent: Googlebot
Disallow: /code/?


Use Google Webmaster Tools Parameter Handling Options

If some of the parameters on these URLs don't actually change the page much, you can tell Googlebot to ignore them. For example, your next parameter sounds like it isn't all that important to the content of the page. Maybe it just changes a single button or some navigation. If that is the case, you can find settings under "Crawl" -> "URL Parameters" that let you tell Googlebot that a parameter doesn't affect the content of the page and that it should only crawl one representative URL for all values of that parameter.
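A complementary, code-level way to get a similar consolidation (my addition, not part of the parameter settings above) is to declare one canonical URL from the page itself, so every parameter variant points at a single representative URL. A minimal PHP sketch, using the example URL from your question:

<?php
// Hypothetical snippet for the <head> of the page served at /code/:
// whatever extra parameters the request carries, declare one canonical
// URL so Google consolidates the parameter variants onto it.
$canonical = 'http://www.wwwsitus.com/code/?q=reg';
echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '">';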
