Sitemap structure for network of subdomains
I am working on a project that's a network of subdomains across 2 domains (domain.gr & domain.com.cy), similar to Hubpages: each user gets a profile under a different subdomain, along with a personal blog. I think there is something wrong with our sitemap submission, because sometimes it takes weeks for new profiles to get indexed. We use a single Webmaster Tools account to manage the whole network, and we don't want to create a separate account for each subdomain since there are already more than 1,000 of them.
Following this post goo.gl/KjMCjN, I ended up with a structure of 5 sitemaps:
1st sitemap: a sitemap index pointing to the others.
2nd sitemap: all user profiles under domain.gr.
3rd sitemap: all user profiles under domain.com.cy.
4th sitemap: all posts under *.domain.gr (as a news sitemap, goo.gl/8A8S9f).
5th sitemap: all posts under *.domain.com.cy (also a news sitemap).
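The 1st sitemap above is a sitemap index that just lists the other four. As a rough sketch of how it could be generated (the file names and URLs below are illustrative assumptions, and Google only accepts cross-host sitemaps like this when all the hosts are verified in the same Webmaster Tools account):

```python
# Sketch: build the top-level sitemap index (sitemap #1) that points to the
# other four sitemaps. File names below are assumptions, not from the post.
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Return a sitemap-index XML string listing each child sitemap URL."""
    root = ET.Element("sitemapindex", xmlns=NS)
    for url in sitemap_urls:
        sm = ET.SubElement(root, "sitemap")
        ET.SubElement(sm, "loc").text = url
    return ET.tostring(root, encoding="unicode")

children = [
    "https://domain.gr/sitemap-profiles.xml",      # 2nd: profiles on domain.gr
    "https://domain.com.cy/sitemap-profiles.xml",  # 3rd: profiles on domain.com.cy
    "https://domain.gr/sitemap-posts.xml",         # 4th: posts on *.domain.gr
    "https://domain.com.cy/sitemap-posts.xml",     # 5th: posts on *.domain.com.cy
]
print(build_sitemap_index(children))
```

With this layout you only ever submit the index file once; new profiles just get appended to the child sitemaps it points to.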
Now my questions:
Should we create separate news sitemaps, or just list all posts in the 2nd & 3rd sitemaps?
Does link ordering matter? E.g. should the most recently created user come first in the sitemap, or does order make no difference as long as the lastmod date is correct?
Does anyone know how Hubpages submits their sitemaps in Webmaster Tools, so that we could follow their approach?
Is there any alternative or better way to get this kind of structure indexed?
PS: Our network is multilingual: Greek & English are available. We use hreflang tags in the head of each page to indicate the country target of each version.
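On the hreflang point: Google also accepts hreflang annotations inside sitemaps, via `xhtml:link` elements on each `<url>` entry, which can be convenient when everything is managed from one account. A minimal sketch follows; the language-to-domain mapping and the URLs are assumptions for illustration, not from the post:

```python
# Sketch: emit one <url> sitemap entry carrying hreflang alternates via
# xhtml:link, so language targeting lives in the sitemap as well as (or
# instead of) the page <head>. URLs and language mapping are assumptions.
from xml.etree import ElementTree as ET

ET.register_namespace("xhtml", "http://www.w3.org/1999/xhtml")
XHTML = "{http://www.w3.org/1999/xhtml}"

def url_entry(alternates):
    """alternates: list of (hreflang, href); the first href becomes <loc>."""
    url = ET.Element("url")
    ET.SubElement(url, "loc").text = alternates[0][1]
    # Each entry must list ALL alternates, including itself.
    for lang, href in alternates:
        ET.SubElement(url, XHTML + "link",
                      rel="alternate", hreflang=lang, href=href)
    return ET.tostring(url, encoding="unicode")

entry = url_entry([
    ("el", "https://user1.domain.gr/"),      # Greek version (assumed)
    ("en", "https://user1.domain.com.cy/"),  # English version (assumed)
])
print(entry)
```

Note that hreflang annotations must be reciprocal: the entry for the .com.cy URL would list the same pair of alternates with its own URL as the `<loc>`.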
2 Comments
A sub-folder is much better for SEO. Just read this thread for example.
However, if you are finding it takes a long time to get your pages/sub-domains indexed, could it be a problem with the authority of your website in general? E.g. does Google trust your domain and crawl the site often?
What I'm trying to say is:
1. Determine whether Google visits your website often and trusts you, using tools such as Google Webmaster Tools: check the crawl report section to see how often Googlebot comes back to your site.
2. Consider moving to sub-folders instead of sub-domains, and 301 redirect the sub-domains on the go-live date. Google has said that it tries to treat sub-domains as part of the same website as the primary domain, but this isn't necessarily the case. Ideally, if you trust the content, have it in a sub-folder.
For point 2, the perfect example is Blogspot. All of the blogs created there are sub-domains, not sub-folders, and Google treats each one as its own authority. They aren't treated better than other websites just because they're on Blogspot; being on a sub-domain is exactly why each blog has to build its own authority.
To avoid the need to submit multiple sitemaps for each subdomain with different accounts, you should have implemented /folders instead of .subdomains. It is also much more efficient for SEO.
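If the site does move to sub-folders, every old subdomain URL needs a 301 target under the new scheme. A minimal sketch of that mapping, assuming the new layout is `domain.gr/<user>/...` (an assumption for illustration, not something the post specifies):

```python
# Sketch of the subdomain -> sub-folder move suggested above: a helper that
# maps an old profile URL (user1.domain.gr) to its new 301 Location
# (domain.gr/user1/...). The target URL scheme is an assumption.
from urllib.parse import urlsplit, urlunsplit

def subfolder_target(url, root="domain.gr"):
    """Return the 301 Location for a subdomain URL, or None if not ours."""
    parts = urlsplit(url)
    host = parts.hostname or ""
    if host == root or not host.endswith("." + root):
        return None  # already on the root domain, or a foreign host
    user = host[: -len("." + root)]
    new_path = "/" + user + parts.path
    return urlunsplit((parts.scheme, root, new_path, parts.query, ""))

print(subfolder_target("https://user1.domain.gr/blog/post-1"))
# -> https://domain.gr/user1/blog/post-1
```

In practice this rule would live in the web server's rewrite config rather than application code, but the mapping logic is the same.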