Google stopped indexing my site. Is there a substitute for an XML sitemap?
My site has a page that links to all my ad entries. Each URL and its content is different. Google was indexing the individual URLs until entry 4570. As I can see in the Google Webmaster Tools index status, it also stopped crawling the new entries (70,000) at that point.
I would like to understand why Google stopped. I added a sitemap at about that time, but the sitemap generator doesn't include the individual ad URLs.
I had to change a page name and put a redirect in my .htaccess file:
# The target of Redirect must be an absolute URL, not a bare hostname
Redirect permanent /aerzte/ https://www.example.de/arzt/
About one week later, Google stopped adding new URLs. Could one of the above be the reason?
Is there another way, without a sitemap, to get Google to index these URLs?
P.S.: I can submit URLs with Fetch as Google, and those URLs are indexed immediately.
Create an account at Google Webmaster Tools. In about a week they should be able to tell you exactly where the problem pages on the site are, under the status for the domain in question.
They'll say things like "Googlebot couldn't access your site" or "Too many redirects", and they'll also list the 404 errors they receive.
If you've made major changes to your site, then they'll actually queue the site for recrawling and they'll start all over (but it takes a couple of weeks for the changes to be reflected). They'll also indicate their crawl rate (which you should be able to change).
Update
Something most people don't know is that Google will still index your site based on your internal links (and offsite links) even if you don't supply a sitemap. One of my sites ranks on page one for a search with 1.8 million results, and it has no sitemap at all.
Without seeing your site, it sounds like you are being hit with a low-quality content penalty, and that's why Google is no longer indexing you. Submitting URLs with Fetch will work in the short run, but if a penalty is the cause, I would expect those URLs to drop out again over time.
The size of your sitemap file (70,000 entries) may point to another problem: the sitemap protocol caps each XML file at 50,000 URLs, and if your file contains more than that, Google may simply refuse to process it. One way to be sure is to check the server's access log and see whether Googlebot is still requesting the sitemap.
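If the 50,000-URL cap is the issue, a sitemap index is the standard fix: several sitemap files of up to 50,000 URLs each, plus one index file that lists them. Here is a minimal Python sketch, assuming your 70,000 URLs sit in a plain text file, one per line (the names urls.txt, sitemap-N.xml, and sitemap_index.xml are placeholders):

    # Split a large URL list into <=50,000-entry sitemap files
    # and write a sitemap index that points to them.
    from xml.sax.saxutils import escape

    MAX_URLS = 50000                  # per-file limit from the sitemap protocol
    BASE = "https://www.example.de/"  # where the sitemap files will be served

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]

    # One <urlset> file per chunk of at most 50,000 URLs.
    for n, chunk in enumerate(chunks, start=1):
        with open("sitemap-%d.xml" % n, "w") as out:
            out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                out.write("  <url><loc>%s</loc></url>\n" % escape(url))
            out.write("</urlset>\n")

    # The index file references every sitemap file written above.
    with open("sitemap_index.xml", "w") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            out.write("  <sitemap><loc>%ssitemap-%d.xml</loc></sitemap>\n" % (BASE, n))
        out.write("</sitemapindex>\n")

Submit only the index file in Webmaster Tools; Google picks up the individual sitemap files from it.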