Friendly URLs: is there a maximum length for search engines?
People at Stack Overflow have been working closely with the Google team to help make the Panda algorithm more efficient, so I guess they have learned a lot from Google in return.
Thus they may have designed very clever friendly URLs to maximize page rank.
I've seen some very long URLs on Stack Overflow from time to time (I can't find an example now), but after a certain number of characters there were only numbers, as if to say: "past this length, search engines will ignore the rest, so let's just put numbers."
I've put a lot of work into my framework to generate very friendly URLs, and my website can produce URLs like:
www.mysite.fr/recherche/region/provence-alpes-cote-d-azur/departement/bouches-du-rhone/categorie-de-metiers/paramedical/
They're very long, and I'm wondering whether the URL above might be confused with, say, this one:
www.mysite.fr/recherche/region/provence-alpes-cote-d-azur/departement/bouches-du-rhone/categorie-de-metiers/art/
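For illustration, here is a minimal sketch of how a routing layer could slugify labels into URL segments like the ones above. The helper names (`slugify`, `build_search_url`) are hypothetical, not the asker's actual framework:

```python
import re
import unicodedata

def slugify(text):
    """Strip accents, lowercase, and replace runs of non-alphanumerics with hyphens."""
    text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode("ascii")
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def build_search_url(base, **filters):
    """Build a /recherche/key/value/ path from filter pairs, slugified."""
    parts = []
    for key, value in filters.items():
        parts.append(slugify(key))
        parts.append(slugify(value))
    return base.rstrip("/") + "/recherche/" + "/".join(parts) + "/"

url = build_search_url(
    "http://www.mysite.fr",
    region="Provence-Alpes-Côte d'Azur",
    departement="Bouches-du-Rhône",
)
# url -> "http://www.mysite.fr/recherche/region/provence-alpes-cote-d-azur/departement/bouches-du-rhone/"
```

Note that accented characters ("Côte", "Rhône") are folded to plain ASCII before hyphenation, which is how the URLs in the question end up with "cote-d-azur" and "bouches-du-rhone".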
It's not the search engines' place to decide how long URLs should be; they just have to index them.
That said, extremely long URLs stuffed with keywords might start getting you flagged as potentially suspicious. They don't come up (for me) as much anymore, but think back to when search results for certain queries tended to be overwhelmed with matches from e.g. ohiorefrigeratorrepairparts.com/kenmore/etc/etc...
On the whole, sure, your URLs are "friendly", but they're also trying a bit too hard. You're making the mistake of thinking that your URL structure has to precisely match your site/content structure. Keeping URLs concise is another factor in friendliness; for more on that, see @Litso's response. There's a Google Webmaster Help video in which Matt Cutts says a keyword's directory depth has basically no effect, for whatever that's worth. But I also recall another video, which I can't find right now, in which he said that after a certain number of directories deep, the spider stops paying attention. [If anyone else has a link (or can refute), feel free to edit.]
From a couple other angles...
The HTTP spec says there is no technical limit on length (section 3.2.1):
The HTTP protocol does not place any a priori limit on the length of a URI.
... and in pretty much any practical situation, browsers will keep up with that unless you really, really force the issue. (Who cares whether search engines handle huge URLs if browsers don't?) That research is about five years old, but it largely holds up.
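If you want a conservative sanity check against real-world limits rather than the spec, the usual reference points are legacy Internet Explorer's cap of 2083 characters and server request-line limits around 8 KB (Apache's default LimitRequestLine is 8190 bytes). A trivial sketch:

```python
# Practical limits on URL length (the HTTP spec itself sets none):
# - Internet Explorer historically capped URLs at 2083 characters.
# - Many servers reject request lines over ~8 KB
#   (e.g. Apache's default LimitRequestLine is 8190 bytes).
IE_MAX_URL = 2083

def url_is_safe_everywhere(url):
    """Conservative check: stay under the strictest well-known browser limit."""
    return len(url) <= IE_MAX_URL

long_url = ("http://www.mysite.fr/recherche/region/provence-alpes-cote-d-azur/"
            "departement/bouches-du-rhone/categorie-de-metiers/paramedical/")
# The example URL is ~120 characters, far below the cap, so length itself
# is not the asker's problem here.
```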
Friendly for whom, exactly? For visitors, these URLs aren't very friendly.
I'd advise you to keep them much shorter, around six keywords at most.
Sources:
support.google.com/webmasters/bin/answer.py?hl=en&answer=76329
www.seomoz.org/blog/11-best-practices-for-urls