Firebase hosting, sitemaps and robots.txt
I tried searching the internet about how Firebase handles indexing and how the owner of content running on Firebase Hosting can influence crawling by Googlebot.
Personally, I am running a website built with Polymer + Firebase as the backend (so basically a SPA) with about 150 visitors per week, and my website has common navigation with 5 links:
/price-list
/contact
/projects
/about-us
/home
The projects page also has nested pages:
/projects/interior
/projects/architecture
Google has indexed my website far more than it should, and currently results like this appear when googling:
/projects/interior/[[id]]/[[name-of-interior-project]]
So back to my question: how can I tell Googlebot not to crawl specific pages? Is it enough to just upload a robots.txt or sitemap file along with the other files, like I would normally do with conventional hosting?
Thanks!
1 Answer
Yes, you can block access to specific URLs with a robots.txt rule. If a page has a real, unique URL and Google is indexing it, it doesn't matter that the site is a SPA whose links load without a page refresh by default.
Disallow: /projects/interior/
This should match /projects/interior/[id]/[name] but not /projects/interior itself, because Disallow rules work as a simple path-prefix match.
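For example, a minimal robots.txt along those lines could look like this (the your-site.web.app hostname and the Sitemap line are placeholders; adjust them to your own site, and add a similar Disallow line for /projects/architecture/ if those detail pages should be blocked too):

# Apply to all crawlers
User-agent: *
# Block the generated project detail pages, but not /projects/interior itself
Disallow: /projects/interior/
# Optional: point crawlers at your sitemap (placeholder URL)
Sitemap: https://your-site.web.app/sitemap.xml

With Firebase Hosting there is nothing special to configure for this: a robots.txt placed in the directory that firebase.json declares as "public" is deployed with the rest of your files and served at /robots.txt, the same as on conventional static hosting.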
You can test robots.txt rules with Google Search Console to be sure.