
Do robots.txt and sitemap.xml need to be physical files?

@Jennifer507

Posted in: #Files #GoogleSearchConsole #RobotsTxt #XmlSitemap

I have both set up in my routes:

Route::get('/robots.txt', function () {
    // robots.txt contents here
    return response('...', 200)->header('Content-Type', 'text/plain');
});

Route::get('/sitemap.xml', function () {
    // sitemap.xml contents here
    return response('...', 200)->header('Content-Type', 'application/xml');
});


I can access both URLs perfectly through the browser, but Google Search Console tells me they are not detected. Do they need to be physical files in the root folder in order to be detected?


1 Comment


@Chiappetta492

It is entirely possible for sitemap.xml and robots.txt to exist at your site root (whether as physical files or as routes) and still be returned to bots with a 404 Not Found status.

I recommend searching Google for an HTTP header status checker. Run your sitemap and robots.txt URLs through it and see what status code is reported. If the status is 404, that is your issue; you need a 200 (OK).
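If you would rather check from code than use an online tool, here is a minimal standalone PHP sketch using the built-in get_headers(); the example.com URLs are placeholders for your own domain:

&lt;?php
// Minimal sketch: print the HTTP status line each URL returns.
// example.com is a placeholder; substitute your own domain.
$urls = [
    'https://example.com/robots.txt',
    'https://example.com/sitemap.xml',
];

foreach ($urls as $url) {
    $headers = get_headers($url); // element 0 holds the status line
    echo $url . ' => ' . ($headers ? $headers[0] : 'request failed') . PHP_EOL;
    // A healthy URL prints something like "HTTP/1.1 200 OK";
    // "HTTP/1.1 404 Not Found" means bots are being turned away.
}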

There can be many reasons why a URL returns a 404 status even though the content exists and loads in a browser. A lot of the time it comes down to conflicting rewrite rules: the most likely culprit is something in your .htaccess file changing the response status before the request ever reaches your routes.
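For context, the .htaccess that Laravel ships in its public/ directory looks roughly like the sketch below (details vary by version). It is what makes route-based robots.txt and sitemap.xml possible in the first place:

&lt;IfModule mod_rewrite.c&gt;
    RewriteEngine On

    # Requests that match a real directory or file are served as-is...
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f

    # ...everything else is handed to Laravel's front controller,
    # where Route::get('/robots.txt', ...) can answer it.
    RewriteRule ^ index.php [L]
&lt;/IfModule&gt;

A rule added above those conditions that matches .txt or .xml requests, or a stale physical robots.txt sitting in public/, will short-circuit your routes and can explain why Googlebot sees a 404 or a blank response while your browser test looks fine.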
