Google displays search results even after removing URLs in Search Console

@Connie744

Posted in: #Google #Googlebot #GoogleSearch

I am developing an AngularJS application, and since it is not easy to optimize it for search engines, I made it public and let Google index it by registering a "Property" in their "Search Console", so I could see the results and improve the page. The page was indexed successfully and Google does display search results.

Now I want to take the page out of public again to finish the project, and therefore I want Google to remove it from the search results. However, this does not seem to be easy.

First, I adjusted the robots.txt to deny any indexing. Even after Google had fetched the new robots.txt, the search results were still visible.
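(The robots.txt change was roughly a blanket crawl block like this:)

User-agent: *
Disallow: /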

Then I tried to set up temporary URL removals in their Search Console. They were accepted, but the URLs were not removed.

Then I set up the site so that Googlebot no longer gets any access to the server - it receives a 404 for every URL it requests. This has been the case for almost four weeks, but Google still displays the search results.
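(The question does not say what server is involved; as a purely hypothetical sketch, blocking Googlebot by user agent in front of the app could look like the following Express/TypeScript middleware, with illustrative names and paths:)

// Hypothetical sketch: return 404 to Googlebot, serve the app to everyone else.
import express from "express";

const app = express();

app.use((req, res, next) => {
  const userAgent = req.headers["user-agent"] || "";
  if (/Googlebot/i.test(userAgent)) {
    // Googlebot gets a 404 for every URL it requests.
    res.status(404).send("Not Found");
    return;
  }
  next();
});

// The built AngularJS app is served from "dist" (illustrative path).
app.use(express.static("dist"));

app.listen(3000);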

Two weeks ago, I deleted the "Property" from the Search Console of my Google account and deleted Google's webroot verification HTML file (googlexxxxxxxxxxxxxxx.html - well, Googlebot gets a 404 anyway), but they still display search results...

How can I get Google to completely and quickly remove my site from all search results?


2 Comments


@Samaraweera270

You could submit the URL removal in Search Console a few more times, and maybe Google will then see your request.


@BetL925

Google indexes many sites that don't have a Search Console account. Deleting the property from Search Console does not affect whether Googlebot will crawl your site and include it in the search index.

If you disallow crawling via robots.txt, Googlebot will never be able to see that those pages are 404. Google needs to be able to crawl them to see that you have removed the content.

After you have done the temporary removal, allow crawling again so that Googlebot can see the 404 errors. Google should crawl all of your pages within a couple of weeks. Pages are removed about 24 hours after Googlebot finds a 404 error. If Google can't crawl the site and see the 404 errors, it may keep the URLs in the search index indefinitely.
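To allow crawling again, a robots.txt that disallows nothing is enough, for example:

User-agent: *
Disallow: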

Instead of serving 404 errors just to Googlebot in an attempt to get the pages removed, you should tell Google explicitly that you do not want the pages indexed. The way to do that is to include a robots meta noindex tag in the head section of every page:

<meta name="robots" content="noindex">


Once you have put in those tags, you can safely let Googlebot crawl your entire site and none of it will appear in the search results.
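Since this is an AngularJS single-page application, and assuming every route is served from the same index.html, a single tag in that file's head should cover all of the indexed URLs:

<head>
  <meta name="robots" content="noindex">
  <!-- rest of the existing head stays as-is -->
</head>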

To speed up the no-index process for all the URLs on your site, Google has a Noindex: directive in robots.txt. Change your robots.txt file to:

User-Agent: *
Noindex: /


Even if you do that, you should still add the noindex meta tags, because Google says the Noindex robots.txt directive is experimental and may go away at any time.
