Why do many valid links in my blog show error 500 in Google Webmaster Tools?

The majority of the links in the screenshot below are valid, such as:
http://wayneye.com/Blog/HTML5-Web-Socket-In-Essence
http://wayneye.com/Blog/There-Is-No-The-Hope-Project

And so on. Even when I try "Fetch as Googlebot", the result is 200 OK.
Please note: I do not block any content for Googlebot in my robots.txt, nor in my HTML meta tags.
Possible causes for the 500 error you're seeing:
Your robots.txt file starts with an illegal character. When validated here, the file contains a character that's hidden when viewed as UTF-8 in a text editor such as TextMate, but visible when viewed as ISO-8859-1 (see image below). To fix this, delete the first line, retype it, save the file, and revalidate it.
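The hidden character described above is typically a UTF-8 byte-order mark (BOM, bytes `EF BB BF`), which many editors render invisibly in UTF-8 but show as `ï»¿` in ISO-8859-1. As a minimal sketch (the helper names and sample content are illustrative, not from the original post), you can detect and strip it programmatically instead of retyping the line:

```python
# Sketch: detect and strip a UTF-8 byte-order mark (BOM) from the start
# of a robots.txt file. The BOM bytes are EF BB BF; they are invisible
# in most UTF-8 editors but appear as garbage when read as ISO-8859-1.

def has_utf8_bom(data: bytes) -> bool:
    """Return True if the raw file content begins with a UTF-8 BOM."""
    return data.startswith(b"\xef\xbb\xbf")

def strip_utf8_bom(data: bytes) -> bytes:
    """Return the content with any leading UTF-8 BOM removed."""
    return data[3:] if has_utf8_bom(data) else data

if __name__ == "__main__":
    # Hypothetical robots.txt content with a leading BOM.
    sample = b"\xef\xbb\xbfUser-agent: *\nDisallow:\n"
    cleaned = strip_utf8_bom(sample)
    print(has_utf8_bom(sample))    # True
    print(has_utf8_bom(cleaned))   # False
```

After cleaning, re-save the file and revalidate; crawlers that choke on the leading bytes should then parse the first directive correctly.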
Your site may have simply returned a 500 error when Google last crawled it (i.e., the server was down). If this is the case, the errors should not be repeated when Google next crawls your site, assuming it stays up.