Google Search Console: Severe health issues, some important page is blocked by robots.txt

@Pierce454

Posted in: #GoogleSearchConsole #RobotsTxt

I have just added a new property to Google Search Console. I've added it as both example.com and www.example.com.

I receive the following error message: "Severe health issues are found in your property. Is robots.txt blocking important pages? Some important page is blocked by robots.txt."

The "some important page" links to the homepage. The error only appears for example.com and not example.com. Everything with DNS is resolving and there are no issues. The redirect from www works as well. Google Fetch and Render works as well.

I have removed my robots.txt file, but the issue remains. The robots.txt I was previously using:

User-agent: *
Disallow:

Sitemap: example.com/sitemap.xml

I cannot share the domain, which I appreciate makes things more difficult to debug. But is there any reason why I would receive this error when there is no robots.txt file to block Google from crawling?
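One quick check outside Search Console is the raw HTTP status the server returns for /robots.txt: a 404 is harmless (Google simply crawls everything), but a 5xx response makes Google treat the whole site as temporarily disallowed. A minimal sketch in Python 3, with example.com standing in for the real (undisclosed) domain:

import urllib.error
import urllib.request

# example.com stands in for the real domain, which isn't shared here.
url = "http://example.com/robots.txt"
try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print(resp.status)  # 200 means a robots.txt is still being served
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    # 404/410 is fine: Google treats a missing robots.txt as "crawl everything".
    # A 5xx here, by contrast, makes Google stop crawling the whole site.
    print("HTTP error:", e.code)
except urllib.error.URLError as e:
    # A failure at this level (e.g. DNS) matches the kind of error Google reports.
    print("Connection/DNS error:", e.reason)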

EDIT:

The crawl errors persist:

"Google couldn't access your site because of a DNS error."
Lookup error: "Your DNS server did not recognize your hostname."
Total DNS errors: "Couldn't communicate with the DNS server."

I have used the Fetch as Google tool and it fetches and renders the site with no issues. I have used the robots.txt Tester and there are no issues there either. I have also checked whether Google has cached the site, which it has. I have been in contact with my DNS provider and they have assured me that there are no issues on their side. I can tell from the Site Errors graph that it was updated yesterday, and it shows Errors/Attempts: 0/1. I've never had this happen before on a completely new property. Any suggestions?
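Since Googlebot resolves DNS with its own resolvers, a successful local lookup doesn't fully rule out a problem on Google's side, but it is a quick sanity check that both hostnames resolve at all. A minimal sketch in Python 3, again with example.com as a placeholder:

import socket

# example.com / www.example.com stand in for the real hostnames.
for host in ("example.com", "www.example.com"):
    try:
        infos = socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP)
        addrs = sorted({info[4][0] for info in infos})
        print(host, "->", ", ".join(addrs))
    except socket.gaierror as e:
        print(host, "-> lookup failed:", e)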


2 Comments


@Voss4911412

There is no need to put the URL of your sitemap in the robots.txt file. I suggest you remove it from robots.txt and submit the sitemap in Google Search Console (Webmaster Tools) instead. Also, this isn't the correct syntax for robots.txt: a Disallow rule tells bots not to crawl a page, and leaving it empty might cause problems, in my opinion.

If you are comfortable with user agents and crawlers crawling every page of your website, use:

User-agent: *
Allow: /

And if you want to stop user agents and crawlers from crawling certain pages, do it like this:

User-agent: *
Disallow: /admin/
Disallow: /search/
Disallow: /login/
Disallow: /register/
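If it helps, you can sanity-check rules like these locally with Python's standard-library robots.txt parser before deploying them. A minimal sketch using the illustrative rules above, with example.com as a placeholder domain:

from urllib.robotparser import RobotFileParser

# Illustrative rules from above; example.com is a placeholder domain.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /search/
Disallow: /login/
Disallow: /register/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for url in ("http://example.com/", "http://example.com/admin/settings"):
    verdict = "allowed" if parser.can_fetch("*", url) else "disallowed"
    print(url, "->", verdict)

Feeding the original file (with the empty Disallow:) through the same check shows how a standards-compliant parser interprets it.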


@Heady270

Google Search Console has a built-in tool for testing your robots.txt file: the robots.txt Tester.

Use that tool to figure out more about what is causing the problem. My guess is that Googlebot isn't seeing the correct version of your robots.txt. It may be using an out-of-date copy, or DNS may not have propagated yet and it is still seeing the parked domain's robots.txt.

The tester will also let you check individual URLs and will highlight any rule that is causing a page to be disallowed.
