Do robots.txt and sitemap.xml need to be physical files?
I have both set up in my routes:

Route::get('/robots.txt', function () {
    // robots.txt contents here
});

Route::get('/sitemap.xml', function () {
    // sitemap.xml contents here
});
I can access them perfectly through the browser, but Google Search Console reports that they are not detected. Do they need to be physical files in the root folder in order to be detected?
No, they do not have to be physical files; what matters is the HTTP status code the server sends. It is entirely possible for sitemap.xml and robots.txt URLs to render their content in a browser while the server is still returning them to bots with a 404 Not Found status.
I recommend searching Google for an HTTP header status checker. Run your sitemap and robots.txt URLs through it and look at the status code being reported. If the status is 404, that is your problem; you need the status to be 200 (OK).
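You can also check the status codes yourself from the command line with curl; this is a quick sketch, with example.com standing in for your own domain:

```shell
# Print only the HTTP status code each URL returns.
# Replace example.com with your own domain (placeholder).
curl -s -o /dev/null -w "%{http_code}\n" "https://example.com/robots.txt"
curl -s -o /dev/null -w "%{http_code}\n" "https://example.com/sitemap.xml"
```

Anything other than 200 here means Google's crawler is being turned away even though your browser shows content.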
There are many reasons a URL can return a 404 status even though the content exists. Most of the time it comes down to conflicting code; the most likely culprit is a rewrite rule in your .htaccess file that intercepts the request and changes its status before your route ever runs.
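If the issue turns out to be the routes themselves, you can make the status and content type explicit so crawlers treat the responses exactly like static files. A minimal sketch, assuming Laravel's response() helper; the robots.txt body and the 'sitemap' view name are placeholders, not your actual content:

```php
Route::get('/robots.txt', function () {
    // Explicit 200 status and text/plain so crawlers see a normal robots.txt.
    return response("User-agent: *\nDisallow:", 200)
        ->header('Content-Type', 'text/plain');
});

Route::get('/sitemap.xml', function () {
    // Serve the sitemap with an XML content type ('sitemap' view is a placeholder).
    return response(view('sitemap'), 200)
        ->header('Content-Type', 'application/xml');
});
```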