Can internal linking create excessive web server load?
SE sites allow tags chaining like this:
/questions/tagged/seo+google+search-engines+research+keywords
This site has more than 700 tags. If up to 5 tags can be chained, there are on the order of
700 × 700 × 700 × 700 × 700 = 168,070,000,000,000
links that Google can crawl. Can it be harmful?
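For scale, here is a rough back-of-the-envelope count (a sketch only; the 700-tag figure and the 5-tag limit come from the question, and whether the site canonicalises tag order in the URL is an assumption):

```python
# Rough upper bound on chained-tag URLs, assuming ~700 tags and chains of
# up to 5 tags. These are illustrative numbers, not measurements of the site.
from math import comb

TAGS = 700
MAX_CHAIN = 5

# If tag order matters and repeats are not filtered, every ordered chain is a URL.
ordered = sum(TAGS ** k for k in range(1, MAX_CHAIN + 1))
print(f"ordered chains of 1-5 tags:   {ordered:,}")   # ~168.3 trillion

# If the server canonicalises the URL (order-independent, no repeated tags),
# only distinct tag sets remain.
unordered = sum(comb(TAGS, k) for k in range(1, MAX_CHAIN + 1))
print(f"distinct tag sets of 1-5 tags: {unordered:,}")  # still ~1.4 trillion
```

Either way the crawlable URL space is astronomically larger than the number of actual posts.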
Can internal linking create excessive web server load?
Not unless your server is already struggling. Googlebot (unlike many poorly-conceived spiders and some recursive downloading agents) throttles its requests, and it even lets webmasters specify how often it should visit pages via Google Webmaster Tools.
Can it be harmful?
Yes, because there are plenty of poorly-conceived spiders and recursive downloading agents out there which can bring your server to its knees if you've created interlinked quagmires (particularly so if these are resource-intensive dynamically-generated pages).
I am having a hard time seeing how your site's users benefit from this feature if there are hundreds of tags.
Wouldn't a site search feature be better-suited to users' needs?
Why is there a need to map up to five categories' union to a unique URI?
My thought is that if Google sees a tag with the URL twicks-website.com/posts-with-awesome-tag,
it will visit that URL only once during an indexing session; if it sees the same link elsewhere on the site, there is no need to fetch it again. It WILL visit and index each of the 700 tags, but there is no reason for it to visit any single tag more than once... doing so would only add load to your server and cost Google more money.
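To illustrate the "fetch each URL only once per session" behaviour described above, here is a minimal sketch of a deduplicating crawl loop. The fetch_page and extract_links callables are hypothetical placeholders, not any real crawler's API, and this is not how Googlebot is actually implemented:

```python
from collections import deque

def crawl(seed_urls, fetch_page, extract_links, max_pages=10_000):
    """Breadth-first crawl that never requests the same URL twice in a session."""
    visited = set()
    queue = deque(seed_urls)
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue              # already fetched this session; skip it
        visited.add(url)
        html = fetch_page(url)    # hypothetical helper: download the page
        for link in extract_links(html):  # hypothetical helper: find outgoing links
            if link not in visited:
                queue.append(link)
    return visited
```

The visited set is what keeps a well-behaved crawler from hammering the same tag page repeatedly, though it does nothing to shrink the sheer number of distinct chained-tag URLs it could eventually discover.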