SEO: Pages are blocked because Google failed to get a resource that is blocked by robots.txt

@Marchetta884

Posted in: #Googlebot #GoogleMaps #RobotsTxt #Seo #WebCrawlers

I would like Google to index all the pages on my Angular site.

I used ngMeta in my application to dynamically apply page titles and descriptions. Are dynamic meta tags really a problem? When I search for my pages on Google, the results don't show my dynamic page descriptions.
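
For context, this is roughly how the tags get set (a simplified sketch, assuming the AngularJS version of ngMeta; the route, controller name, and text are placeholders, not my actual code):

    // Route-level meta, picked up by ngMeta when the route changes
    angular.module('app', ['ngRoute', 'ngMeta'])
      .config(function ($routeProvider, ngMetaProvider) {
        $routeProvider.when('/coaches/:state/:city/:activity', {
          templateUrl: 'coaches.html',
          controller: 'LessonsCtrl',
          meta: { title: 'Coaches', description: 'Find coaches near you' }
        });
        ngMetaProvider.setDefaultTitle('example.com');
      })
      .run(function (ngMeta) {
        ngMeta.init(); // start watching for route changes
      });

    // Controller-level, for values that come from the URL
    angular.module('app').controller('LessonsCtrl', function ($routeParams, ngMeta) {
      ngMeta.setTitle('Soccer lessons in ' + $routeParams.city);
      ngMeta.setTag('description', 'Soccer coaches in ' + $routeParams.city);
    });

Since these tags are only set when the JavaScript runs, my understanding is that Googlebot only sees them if it can fetch and execute every script the page needs.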

I also have some dynamic URLs. For example:

www.example.com/coaches/co/denver/soccer-lessons

In this URL, co, denver, and soccer-lessons are dynamic values. I would like those URLs indexed as well. I created and submitted a sitemap file with all the possible URLs, and I also submitted every possible URL combination manually in Google's Webmaster Tools under the Fetch as Google option.
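
For reference, the sitemap is built from the possible combinations roughly like this (a minimal Node.js sketch; the state/city/activity lists and the file name are placeholders, not my real data):

    const fs = require('fs');

    // Placeholder lists -- the real values come from the database
    const states = ['co', 'ca'];
    const cities = { co: ['denver', 'boulder'], ca: ['la'] };
    const activities = ['soccer-lessons', 'tennis-lessons'];

    // Enumerate every state/city/activity combination as a URL
    const urls = [];
    for (const state of states) {
      for (const city of cities[state]) {
        for (const activity of activities) {
          urls.push(`https://www.example.com/coaches/${state}/${city}/${activity}`);
        }
      }
    }

    // Write a minimal sitemap.xml with one <url> entry per combination
    const sitemap =
      '<?xml version="1.0" encoding="UTF-8"?>\n' +
      '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
      urls.map(u => `  <url><loc>${u}</loc></url>`).join('\n') +
      '\n</urlset>\n';

    fs.writeFileSync('sitemap.xml', sitemap);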

I am also using the Google Maps API, Hotjar, Facebook Pixel, and Google Analytics on the site.

Google's Webmaster Tools console says that Googlebot couldn't get all the resources for this page:

    AuthenticationService.Authenticate is blocked by robots.txt
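
If it helps, my understanding is that fixing this means no longer disallowing the path that serves that resource. A hypothetical robots.txt sketch (the /api/ path is a placeholder; I haven't confirmed where AuthenticationService.Authenticate is actually served from):

    User-agent: *
    Disallow: /api/
    # The more specific Allow rule overrides the Disallow above for this path,
    # so crawlers can fetch the resource the page needs to render
    Allow: /api/AuthenticationService.Authenticate
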
I am not sure whether this is the issue, but of the 115 pages submitted, only 4 were indexed successfully by Google.

Can anyone help? Thanks in advance!
