Is it practical to generate a million-page sitemap?
I was recently asked to create a sitemap for an older Apache website with over 20,000 database-generated pages. However, halfway through the project I found out that the client had neglected to mention that the total inventory was actually over 12 million pages.
The client wouldn't listen to my concerns about the impracticality of a sitemap that large, and kept insisting that he needed all of the pages cataloged to get more page views. He cited the fact that it is possible to create multiple sitemaps, and that the limit for a single sitemap is 50,000 URLs.
I tried out several freeware sitemap generators, and the best option ran out of memory and crashed at 20,000 pages. Most commercial sitemap generators topped out at around 5,000 pages. I could find only one product claiming to be capable of crawling more than 1 million pages, but I wasn't being paid enough to spend months waiting for a crawler to slog through 12 million of them.
I cancelled the job and tried providing the client with some alternative SEO resources that would do a lot more to increase page views, but it really bothers me that I was caught so off guard by a freelance request. I try to specialize in SEO and WordPress maintenance, and I'm accustomed to handling larger sites with a few hundred pages, or even a few thousand pages. In WordPress it usually takes me less than 10 minutes to generate a sitemap.
I did try researching the subject further, and apparently it is possible to write a custom script that will crawl and catalog over a million pages. It doesn't really seem to be a productive use of time or resources, nor is it particularly beneficial to SEO. But it has been done before.
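Digging into it, a script for a site like this would not even need a crawler: since the pages are database-generated, it could enumerate the URLs straight from the database and stream them into 50,000-URL sitemap files. A minimal sketch in Python, assuming a hypothetical pages table with a slug column and example.com standing in for the real domain:

```python
import gzip
import sqlite3  # stand-in driver; the real site would use its own database
from xml.sax.saxutils import escape

URLS_PER_FILE = 50_000            # protocol limit per sitemap file
BASE = "https://www.example.com"  # placeholder domain

def write_sitemaps(db_path):
    """Stream page slugs from the database into numbered, gzipped sitemaps."""
    conn = sqlite3.connect(db_path)
    cur = conn.execute("SELECT slug FROM pages ORDER BY id")  # hypothetical schema
    file_no, count, out = 0, 0, None
    for (slug,) in cur:
        if count % URLS_PER_FILE == 0:  # start a new chunk every 50,000 URLs
            if out is not None:
                out.write(b"</urlset>")
                out.close()
            file_no += 1
            out = gzip.open(f"sitemap-{file_no:04d}.xml.gz", "wb")
            out.write(b'<?xml version="1.0" encoding="UTF-8"?>'
                      b'<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
        out.write(f"<url><loc>{escape(BASE + '/' + slug)}</loc></url>".encode())
        count += 1
    if out is not None:
        out.write(b"</urlset>")
        out.close()
    conn.close()
    return file_no  # number of sitemap files written
```

At 12 million rows that is 12,000,000 / 50,000 = 240 files, written in minutes rather than the months a crawler would need.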
I guess I am just curious whether it is common in the web industry to encounter websites requiring massive sitemaps of more than 20,000 pages, or whether I was simply too inexperienced to deal with the programming requirements. 12 million seemed like an unreasonable number to me, and I couldn't see any advantage such a sitemap would have for the website in question.
You should submit a sitemap of all pages you believe should be in Google's search index. If you have millions of pages, you'll need to use the sitemap index, which is a collection of individual sitemap files: sitemaps.org/protocol.php. Each sitemap must contain no more than 50,000 URLs and be no larger than 10MB.
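To make that concrete, the index is itself just a small XML file listing the chunked sitemaps. A sketch, assuming the files are numbered sitemap-0001.xml.gz onward and hosted at the site root (example.com is a placeholder):

```python
from xml.sax.saxutils import escape

BASE = "https://www.example.com"  # placeholder domain

def write_index(n_files, path="sitemap-index.xml"):
    """Write a sitemap index that points at n_files chunked sitemap files."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for i in range(1, n_files + 1):
            loc = escape(f"{BASE}/sitemap-{i:04d}.xml.gz")
            f.write(f"  <sitemap><loc>{loc}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```

The index is then the single URL you submit; the individual files never have to be registered one by one.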
To get the most out of this protocol I'd suggest creating sitemaps that map to a category or page type (or a combination) so you can determine indexation rates. You often find that some pages or categories are better indexed than others. It's then your job to figure out why.
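One way to act on that is to route each URL into a per-category file series at generation time, so Search Console reports submitted-versus-indexed counts per category. A small sketch (the category column and naming scheme here are assumptions):

```python
from collections import defaultdict

def group_by_category(rows):
    """rows: iterable of (category, slug) pairs, e.g. from a hypothetical
    'SELECT category, slug FROM pages' query. Each category's list can then
    be chunked into its own sitemap-<category>-NNNN.xml.gz series."""
    groups = defaultdict(list)
    for category, slug in rows:
        groups[category].append(slug)
    return groups
```

Categories whose indexed count lags far behind the submitted count are where to start digging.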
www.quora.com/If-I-have-a-website-with-millions-of-unique-pages-should-I-submit-a-partial-sitemap-to-Google