Keyword Densities - do they still matter?

@Jamie184

Posted in: #Seo

Let's say I have a standard-issue web page of 500 - 600 words and I'm priming the page for two specific keywords (for the purposes of this example). We know these words need to feature within the context of the page, but how many times should they be used in the body text? How many times is too many? How few is too few? Does it even make any difference any more? Conventionally I have learned and used a density of 2 to 7%, and I've typically seen SEO organisations aim for roughly 4% density for two-word phrases.

If I want this page to perform well in the search engines, what are the best practices in relation to keyword densities? What strategies have worked for you on real-world projects?
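
For reference, here is a quick way to check what a given density figure works out to on a page of this size - a minimal sketch in Python, where the keyword_density helper and the convention of dividing phrase occurrences by total word count are my own assumptions (different tools count multi-word phrases differently):

import re

def keyword_density(text, phrase):
    """Rough density: phrase occurrences / total words * 100.
    Some tools instead use (occurrences * words in phrase) / total words."""
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    if not tokens:
        return 0.0
    target = " ".join(re.findall(r"[a-z0-9']+", phrase.lower()))
    hits = len(re.findall(r"\b" + re.escape(target) + r"\b", " ".join(tokens)))
    return hits / len(tokens) * 100

# 12 uses of a two-word phrase in a 300-word text comes out to 4%
# under this convention.
print(keyword_density("seo tips " * 12 + "filler " * 276, "seo tips"))  # 4.0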


4 Comments


@Sherry384

Here is an updated view for 2014 with a brief historical explanation.

In the early days of search engines (Google seems to be the focus here, so I will speak to Google), the method of determining a site's and a page's subject was to evaluate several things for clues: the links to a page, the title tag, the header tags (primarily the h1 tag), for a period the description meta tag, and the content, particularly the first 200k or so of text (no one really knew the cutoff for sure). Later, the entire text was examined. The only real mechanism was to count the words found on a particular page and its HTML elements and weigh them accordingly, determining a topic by recognizing words and phrases and associating them with other usages. With this, the strength of the supposed topic was weighed using the literal count of the recognizable topic keywords and phrases. And since Google used machine learning methods, anything not easily recognizable would simply be weighed by word count at the very least. This was a good method for a time.

Then came the various methods of gaming this system.

In 2008, with the introduction of Scholar, Google realized the value of citations and began to recognize the power of hidden connections in documents that had not been recognized before. This caused a shift in how Google would ultimately recognize topics and the value of content. Scholar, as it existed, was moved into the regular Google algorithms, and a shift from traditional weighting to something more analytical, focusing on relationships and recognizable patterns in context, began to take shape. This was a process that evolved and improved significantly over a number of years.

There is an argument over whether keyword density was ever really a factor, raising the notion that keyword density was a myth. Both sides of the argument had merit. For a while it was necessary to maintain a certain keyword density to compete, but too much density was (1) easily recognized for what it was, and (2) counterproductive, yielding too little gain, if any. The magic number has been argued over, of course. The count a keyword needed to reach an appropriate density, and so be recognized as having value, varied radically depending upon the number of words on the page, leaving open speculation as to what density was optimal for any word. The fact of the matter was that for any word to achieve the recommended percentage density (according to the SEO experts at the time), the word would have to be used an inordinate number of times. The longer the page, the more ridiculous the word count would have to be.
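
To put a rough number on that: at the 4% figure mentioned in the question, a 600-word page would need the phrase about 0.04 × 600 = 24 times, and a longer article of, say, 2,000 words would need it around 80 times - well past the point where the text still reads naturally.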

But now the game has changed.

After much earned criticism, and some not so well deserved, Google found its way to removing keyword density as a factor, though some elements of the original algorithm will still seem familiar.

Keyword density is no longer a factor, though keywords remain important. Instead of density, what matters is the use of keywords in links, title tags, h1 tags (as well as other header tags), and throughout the content. But because citation indexing is now prevalent and has matured to such a high state, more natural writing of the content is required. Each word is evaluated in context with other words, some by proximity and others by relational word/phrase indexing, so that the topic and value of a page can be better determined, if and only if the content was written to be read by humans rather than evaluated by machines. Google moved from evaluating content like a machine, exclusively with mathematical algorithms, to evaluating content closer to how humans actually evaluate it, which was the goal. In short, these days, even if you could get away with it, keyword density has no place in how content is evaluated anymore. Sure, there are some gotchas and bumps along the road. No system is perfect. But it is improving.


@Lee4591628

In general terms keyword density DOES matter for Google ranking. You can easily test it by creating two identical pages, each linked exactly once from the same page (just to let Google know they exist and index them).
Then, in one of them, use the keyword/keyphrase two or three times more often than in the other. You will see the first one rank higher, maybe by just one position, but still higher.

Keyword stuffing, on the other hand, does NOT help in my own experience, and even if it did, it's bad SEO practice that might get your site penalized - why risk that?!


@Mendez628

Keyword stuffing has been abused so much that Google will penalize sites where keywords are repeated too often. Tracking keyword density is not merely useless - it is actually harmful.


The most valuable locations on the page are the page title and the h1 tag - make sure to use the keywords there.
Repeat the keywords in the page text maybe 2-3 more times.
Use synonyms - this will both prevent Google from penalizing you for keyword stuffing and get you some more "long tail" searches.
Write for humans, not for search engines - after all, the whole point of getting someone to the site is to get that someone to take some action. Don't get so obsessed with SEO that you forget that (SEO-heavy text is likely to make people immediately reach for the back button).


@Angie530

Keyword density is close to irrelevant, as evinced by the success of so-called Google bombing across most major search engines: links (and their anchor text) matter more than on-site content. It doesn't hurt to use the keywords you want to rank for across your site, but their density is far less important than incoming links.
