: SEO - Pages are blocked because Google failed to get a resource that is blocked by robots.txt
I would like Google to index all the pages on my Angular site.
I used ngMeta in my application to apply page titles and descriptions dynamically. Are dynamic meta tags really a problem? When I search for my pages on Google, it doesn't show my dynamic page descriptions.
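Dynamic tags like these are injected by JavaScript after the page loads, so a crawler that does not execute scripts (or that fails to fetch a script blocked by robots.txt) only ever sees the static index.html. As a rough sketch of what ngMeta-style libraries do on each route change (the function name and escaping below are my own illustration, not ngMeta's actual API):

```typescript
// Hypothetical helper mimicking what ngMeta-style libraries do on each
// route change: rewrite the <title> and the description <meta> tag at
// runtime. A crawler that never runs this script only sees the static HTML.
function renderMetaTags(title: string, description: string): string {
  // Escape characters that would break the HTML attribute or element content.
  const esc = (s: string) =>
    s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/"/g, "&quot;");
  return (
    `<title>${esc(title)}</title>\n` +
    `<meta name="description" content="${esc(description)}">`
  );
}
```

In a live Angular app the equivalent work happens against the DOM on navigation; prerendering or server-side rendering (e.g. Angular Universal) is the usual way to make such tags visible to crawlers that do not execute JavaScript.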
I have some dynamic URLs also.
For example
www.example.com/coaches/co/denver/soccer-lessons
In this URL, co, denver, and soccer-lessons are dynamic values. I would like those URLs indexed as well. I have created and submitted a sitemap file with all the possible URLs, and I also submitted every possible URL combination manually in Google Webmaster Tools under the Fetch as Google option.
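For reference, each of those dynamic URLs is just an ordinary `<url>` entry in the sitemap; the `lastmod` date below is illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/coaches/co/denver/soccer-lessons</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <!-- one <url> entry per generated parameter combination -->
</urlset>
```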
Also, I am using the Google Maps APIs, Hotjar, Facebook Pixel, and Google Analytics on my site.
Google's webmaster console says that Googlebot couldn't get all the resources for this page.
AuthenticationService.Authenticate is blocked by robots.txt
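That message usually means robots.txt disallows a path the page depends on at render time (a script or an API endpoint called during page load). The fix is to let crawlers fetch those resources. A hedged sketch, assuming the blocked endpoint lives under a path like /api/ (substitute whatever path the Webmaster Tools report actually shows):

```
# robots.txt
User-agent: *
# A broad rule like this would block the API calls the Angular app needs:
# Disallow: /api/
# Explicitly allow the resource the render depends on:
Allow: /api/AuthenticationService.Authenticate
```

After changing robots.txt, re-run Fetch as Google to confirm the page now renders with all its resources.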
I am not sure whether this is the issue, but of the 115 pages submitted, only 4 were successfully indexed by Google.
Can anyone help? Thanks in advance for your help!