
Parts of my site are not getting indexed after changing the URLs, adding redirects, and blocking the old URLs in robots.txt

@Alves908

Posted in: #Google #GoogleIndex #GoogleSearchConsole #Indexing #RobotsTxt

I have a website, and I later optimized the URLs and changed the link structure.

I blocked the old URLs with robots.txt, set up redirects from the old to the new URLs, and then did a Fetch and Render, but some of the pages are not getting indexed in Google.

I recently updated all the meta tags on the site and did a Fetch and Render for that a week ago, but it shows a partial status, and only two of the links whose meta tags were updated got indexed; the other links are getting indexed with the old meta tags.

As I already said, some of the links are not getting indexed at all, and the same problem persists even after the meta tags have been updated.

Is it because of blocking the old URLs in robots.txt or something else?


1 Comment


 

@Gonzalez347

I blocked the old URLs with robots.txt, set up redirects from the old to the new URLs


Search engines cannot follow the redirects because you're blocking the old URLs with robots.txt. Googlebot never requests a blocked URL, so it never sees the 301 response and cannot pass the old pages' signals on to the new URLs.
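
To let Google discover the new URLs via the old ones, the old paths need to stay crawlable and simply return 301s. Here is a minimal sketch, assuming an Apache server; the .htaccess directives and paths below are made-up examples, not taken from your site:

# .htaccess: keep the old URLs crawlable and 301-redirect them to the new structure
Redirect 301 /old-page.html https://www.example.com/new-page/
Redirect 301 /old-section/ https://www.example.com/new-section/

Then remove the corresponding Disallow rules from robots.txt so Googlebot can actually request the old URLs and see those redirects.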


Fetch as Google shows partial status


There can be many reasons.


Blocked resources on your own site. Sounds likely in your case.
Blocked external resources. Very common, but usually not an issue. If an external resource is blocked, there's nothing you can do about it, and there usually isn't any need to fix it either.
The page is too big for Google to handle at the moment. Google might be saving its resources for more important things if your pages are heavy.
A slow server causing timeouts.



and only two of the links whose meta tags were updated got indexed


The Fetch as Google tool is primarily a diagnostic tool; you should not expect to get pages indexed just by using it.

The probable causes are that you're preventing Googlebot from crawling (check your robots.txt), or that the content is of poor quality.
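
As a hypothetical illustration of how a robots.txt can cause this (the /product paths below are made up, not from your site): rules match by URL prefix, so a rule written for the old URL structure can also block the new URLs.

User-Agent: *
# Meant to block only the old /product.php?id=123 URLs,
# but this prefix also matches the new /product/blue-widget/ pages:
Disallow: /product

Search Console's robots.txt tester can confirm whether a specific new URL is blocked by a rule like this.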


and the other links are getting indexed with the old meta tags.


No, they were already indexed; they didn't get indexed again. Google simply hasn't recrawled them yet, which is why the old meta tags still show.


As I already said, some of the links are not getting indexed at all, and the same problem persists even after the meta tags have been updated.


Updating meta tags is not a magic solution. Duct tape is, though, but you can't duct-tape a website.




Is it because of blocking the old URLs in robots.txt or something else?


The former.

A few things about robots.txt that you should know.


/robots.txt is used to prevent crawling, not indexing.
/robots.txt is not a magical solution.
There's rarely any need to block crawling. Exceptions are situations where search engines would otherwise find an effectively infinite number of pages, e.g. a calendar (see the example after this list).
It does absolutely nothing for security.
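
For that kind of exception, a minimal sketch (the /calendar/ path is hypothetical) would be:

User-Agent: *
Disallow: /calendar/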


Sounds like a pretty useless file? It is; just delete it, or replace its contents with:

User-Agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml


