How to improve Google search index status for my domain?
I have a robots.txt for my Google-hosted (GAE) webapp with the following contents:
User-agent: *
Disallow:
I also have a sitemap.xml.gz (currently 49.3 kB) that is generated by a Python handler:
import os
from datetime import datetime, timedelta

import webapp2

class SiteMap(webapp2.RequestHandler):
    def get(self):
        days = 60
        # Prefer the Host header; fall back to the server name.
        url = os.environ.get('HTTP_HOST') or os.environ['SERVER_NAME']
        i = ('<?xml version="1.0" encoding="UTF-8"?>'
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
        for ad in Ad.gql(
                'where published = True and modified > :1 order by modified desc',
                datetime.now() - timedelta(days=days)):
            i = ('%s<url><loc>http://%s/vi/%d.html</loc>'
                 '<lastmod>%s</lastmod><changefreq>daily</changefreq>'
                 '<priority>0.8</priority></url>'
                 % (i, url, ad.key().id(),
                    # filters.slugify(ad.title),
                    ad.modified.date()))
        i = '%s</urlset>' % i
        body = compressBuf(i)
        # Gzip is a transfer encoding, not a content type: serve XML with
        # Content-Encoding: gzip, and report the *compressed* length.
        self.response.headers['Content-Type'] = 'application/xml'
        self.response.headers['Content-Encoding'] = 'gzip'
        self.response.headers['Content-Length'] = str(len(body))
        self.response.out.write(body)
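The compressBuf helper is not shown in the question; assuming it is a plain gzip wrapper, a minimal sketch might look like this (the name and exact behavior are inferred, not confirmed by the source):

import gzip
import io

def compressBuf(data):
    """Gzip-compress a text buffer and return the compressed bytes."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode='wb') as f:
        f.write(data.encode('utf-8'))
    return buf.getvalue()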
Google Webmaster Tools says there are "no manual actions needed for junk removal" for the content, but looking at the graph of what is indexed, I still think it doesn't index as much as it should, and I don't understand the drops.
Can you tell me why the index count fluctuates the way it does on the graph, and whether there is something I can do to improve PageRank and/or search index status? I don't target any specific keywords; I just want Google to index everything.
This seems to be the way Google works. You cannot speed up Google, and patience is something even an already patient man learns.
Google will fetch a few pages to test download speed and then fetch a larger number for a period. Based on what I have seen over the years, Google will fetch in chunks of as many as roughly 40,000-50,000 pages per day under normal circumstances until the bulk of the pages are read. From there, it begins to taper off less and less; it seems to take forever and will drive you nuts, but it will finish. There appears to be an algorithm governing this, with a lower limit as well. This means that for larger sites the tailing off takes a bit longer, but it will get done, I promise.
You have done what you can. You submitted a sitemap, and I assume that your site can be crawled and downloads in a reasonable time. About the only thing you can do is wait.
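One sanity check worth running on the submitted sitemap itself is to confirm that the gzipped file decompresses and parses as valid sitemap XML. A small sketch (the helper name check_sitemap and the sample URL are illustrative, not from the question):

import gzip
import xml.etree.ElementTree as ET

NS = '{http://www.sitemaps.org/schemas/sitemap/0.9}'

def check_sitemap(gz_bytes):
    """Decompress a gzipped sitemap and return its <loc> URLs,
    raising an error if the payload is not valid XML."""
    root = ET.fromstring(gzip.decompress(gz_bytes))
    return [loc.text for loc in root.iter(NS + 'loc')]

# Tiny hand-made sitemap for demonstration:
sample = ('<?xml version="1.0" encoding="UTF-8"?>'
          '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
          '<url><loc>http://example.com/vi/1.html</loc></url>'
          '</urlset>')
gz = gzip.compress(sample.encode('utf-8'))
print(check_sitemap(gz))  # ['http://example.com/vi/1.html']

If this raises or returns an empty list for the real sitemap.xml.gz, the file being served is malformed rather than slow to be crawled.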