Why do SEO-based code tips not appear to affect ranking?

@Berryessa370

Posted in: #Google #Seo

I've been researching various methods for SEO where pages have precise titles, keywords are highlighted with h tags, and the page ticks the many boxes stated as good page markup for SEO.
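
For concreteness, this is the sort of "box-ticking" markup I mean; the title, headings, and content below are invented purely for illustration:

    <head>
      <!-- Precise, descriptive title containing the target phrase -->
      <title>Executive Search in Japan | Example Recruiting Firm</title>
    </head>
    <body>
      <!-- Main keyword highlighted in a single h1 -->
      <h1>Executive Search in Japan</h1>
      <!-- Supporting topics in h2 subheadings -->
      <h2>Our Retained Search Process</h2>
      <p>Plain, crawlable text that uses the key terms naturally.</p>
    </body>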

However, when looking at some of the top-ranked sites on Google for key terms, they have terrible SEO-based markup: really long page titles, no heading tags, limited appearance of keywords in the text, and so on. SEO analysis services rate them lower than other sites, yet these sites rank really high. Even with a low number of back-links they rank highly, so I don't understand how these sites earn their position when they appear inferior to the sites below them, which have better markup and more links.

I don't want to cause trouble by mentioning specific sites or keywords, but searching Google for 'executive search', it makes no sense why the roughly 5th-placed site should rank so highly, especially with all the added .swfs. The same applies to the top result for 'Japan Executive Search'.

My main point is that these sites seem not to follow the important structural rules stated in SEO page-rating applications and general suggested best practice, nor do they show large numbers of back-links.

It makes me feel like there is no point bothering to write decent markup if it really doesn't matter.

Can anyone explain how sites with such markup and so few back-links can outrank well-written, well-structured sites with greater linkage?

Sorry if this is a fuzzy question; I want to avoid singling out any sites as examples, but it really has me perplexed that sites which appear to ignore the suggested best practices rank so well.


4 Comments


 

@Murphy175

All the above answers are great and relevant, and I'm very grateful for them. I went away, educated myself, did more research, and found the reason.

What appeared to enable the badly coded sites with fewer links to rank higher came down, in this case, to two factors:


1. The age of the site.
2. The quality, not quantity, of the back-links.


The top-ranking sites for the keywords all happened to be featured on a huge single-page site dedicated to the topic, dating from the late 1990s. That site references the top-ranking sites and passes them a lot of reputation, as the linking site is old, unaffiliated, and very specific to the content.

Therefore, in the battle for high-quality back-links, a small group of sites which have been around a long time were being boosted significantly by this site over the rest.

The badly coded number 1 was also picking up more quality from a Chamber of Commerce site, which passed it a large amount of reputation.

The deciding factor, I believe, is age, together with the aged back-links from a specific and highly reputed site. As that site is effectively abandoned, there is also no way for new sites to get themselves linked from it.

When I checked the back-links of the top-end competition, it showed that a few high-reputation back-links were making the difference, and high-reputation, content-specific, aged back-links are a tricky thing to acquire.



 

@Odierno851

It's important to recognize that just because some sites that rank well do not use "SEO" best practices (such as those from Google's SEO Starter Guide), it does not mean these techniques are ineffective and a waste of time.

Keep in mind that Google (and most other search engines) use hundreds of factors when evaluating the relevance of a website for a query. Among them are factors based on the content that search engines are able to extract from your pages. The best practices for on-page "SEO" generally aim to make extracting information from your pages as easy and as straightforward as possible. Search engines may be able to find your page's heading when it's hidden away in a Flash file, but they will definitely find it much more easily (and often value it accordingly) if it's text within an h1 element, as sketched below. The same applies to many of the other factors: the goal is to make it as easy as possible for search engines to recognize the relevant information. Doing that consistently makes sure that at least some of those "hundreds of factors" are covered.
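
As a minimal illustration (the heading text and file name here are invented), compare a heading that crawlers can read directly with the same heading buried in a Flash object:

    <!-- Easy to extract: the heading is plain text in an h1 element -->
    <h1>Executive Search Services</h1>

    <!-- Hard to extract: the same heading rendered inside a Flash movie -->
    <object data="header.swf" type="application/x-shockwave-flash"
            width="600" height="80"></object>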

With regard to clean markup, there's also an indirect element involved: clean markup often renders better across browsers and devices (phones, tablets, netbooks, etc.) than "spaghetti markup." Search engines may not notice that (though you'll see it in Google's Instant Previews, for example), but users will. If users are faced with a site that doesn't work well for them, they're not going to recommend it (by linking, sharing, liking, or +1'ing) to their friends. You can double-check the rendering of your content across browsers using a service like browsershots.org.



 

@Lee4591628

SEO rests on three main pillars: markup, quality content, and relationships.

Focusing too heavily on only one aspect doesn't ensure your desired boost; balanced focus on all three does.

There are no bullet-proof ways to rank higher, but experience tells me that having good relationships (back-links from other good websites; poorly performing, low-traffic websites don't count) and fresh, quality content produces the fastest results.

Markup plays two roles: it makes crawling and indexing your content easy (and demonstrates that you care about writing good pages), and it serves as the ultimate tie-breaker.

Suppose you and a rival each publish two articles weekly, both awesome. You are also equally well connected, being recommended by almost the same websites. You are positioned for a draw, so whoever makes indexing and crawling easier wins.

Note that crawlability doesn't depend only on HTML code. It also comes from good design (yes, Google, Yahoo, and Bing recognize breadcrumbs, for example; see the sketch below) and from serving clean, easy URLs.
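
As one way to make a breadcrumb trail machine-readable, it can be marked up with schema.org microdata; the page names and URLs below are placeholders:

    <ol itemscope itemtype="https://schema.org/BreadcrumbList">
      <li itemprop="itemListElement" itemscope
          itemtype="https://schema.org/ListItem">
        <a itemprop="item" href="https://www.example.com/jobs/">
          <span itemprop="name">Jobs</span></a>
        <meta itemprop="position" content="1">
      </li>
      <li itemprop="itemListElement" itemscope
          itemtype="https://schema.org/ListItem">
        <a itemprop="item" href="https://www.example.com/jobs/executive/">
          <span itemprop="name">Executive Search</span></a>
        <meta itemprop="position" content="2">
      </li>
    </ol>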

And what if everything (title, headings, asides, breadcrumbs, forms...) is alright for both you and your rival? Then: who uses semantics (RDF, microformats)? Who has the best accessibility?
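
As a small example of that semantic edge, the hCard microformat marks up contact details with agreed class names so machines can parse them; the person and company here are invented:

    <div class="vcard">
      <span class="fn">Jane Doe</span>,
      <span class="title">Managing Partner</span> at
      <span class="org">Example Executive Search</span>
      <a class="url" href="https://www.example.com/">example.com</a>
    </div>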



 

@Moriarity557

Maybe the "seo page rating applications" you mentioned don't actually know how Google's currently ranking pages? Have you considered that they might be just re-hashing the same information that's been floating around for years in the hope people will pay for their software?

The "general suggested best practice" on SEO is mainly about making it a bit easier for Google to find, index, and process your pages; only Google knows how it's actually then ranked. Unfortunatley, you'll find many of the so-called "SEO Experts" are in fact just selling snake-oil, which sometimes appears to work, but other times doesn't.

Google's official guide should tell you most of what you need to know. The rest is mostly a case of good content that people actually want to read, mixed with a sprinkling of luck....


