
Firebase hosting, sitemaps and robots.txt

@Michele947

Posted in: #Firebase #Googlebot #Seo

I tried searching the internet for how Firebase Hosting handles indexing and how the owner of content running on Firebase Hosting can influence crawling by Googlebot.

Personally, I am running a website built with Polymer, with Firebase as the backend (so basically an SPA), with around 150 visitors per week. My website has a common navigation with five links:


/price-list
/contact
/projects
/about-us
/home


The projects page also has nested pages:


/projects/interior
/projects/architecture


Google has indexed far more of my website than it should, and currently results like this appear when googling:

/projects/interior/[[id]]/[[name-of-interior-project]]

So, back to my question: how can I tell Googlebot not to crawl specific pages? Is it enough to just upload a robots.txt or sitemap file along with the other files, as I would normally do with common hosting?

Thanks!


1 Comment


@Reiling115

Yes, you can block crawling of specific URLs with a robots.txt rule. As long as a page has a real, unique URL that Google is indexing, it doesn't matter that the site is an SPA whose links load without a page refresh by default.

User-agent: *
Disallow: /projects/interior/


This should match /projects/interior/[id]/[name] but not /projects/interior itself.
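
As for the second half of your question: yes, Firebase Hosting serves any static file placed in the configured public directory from the root of your site, so uploading robots.txt (and a sitemap) along with your other files and redeploying is enough. A rough sketch, assuming the default "public" directory name in firebase.json:

public/index.html
public/robots.txt
public/sitemap.xml

with firebase.json pointing at that directory:

{
  "hosting": {
    "public": "public"
  }
}

and a redeploy via firebase deploy --only hosting.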

You can test robots.txt rules with the robots.txt Tester in Google Search Console to be sure.
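
For the sitemap itself, a minimal sitemap.xml listing only the pages you want indexed goes in the same public directory. A sketch following the sitemaps.org protocol, with example.com standing in for your domain:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/home</loc></url>
  <url><loc>https://example.com/price-list</loc></url>
  <url><loc>https://example.com/contact</loc></url>
  <url><loc>https://example.com/about-us</loc></url>
  <url><loc>https://example.com/projects</loc></url>
  <url><loc>https://example.com/projects/interior</loc></url>
  <url><loc>https://example.com/projects/architecture</loc></url>
</urlset>

You can also submit it under Sitemaps in Search Console so Google picks it up faster.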
