Keyword Density Checker

Calculating the ideal keyword density percentage for SEO

What should the keyword density be?

There is no single optimal or universal keyword density percentage. Every search query is unique, so search engines compare (or normalise) documents against other top-performing documents to set thresholds for each query. Some keywords, for example "credit card", naturally appear as two-word phrases, while other terms are spread more diffusely through the copy. In addition, highly credible sites with strong visibility, usage data and link profiles can often get away with a higher repetition rate than smaller, less credible sites.
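For reference, the raw figure that density checkers report is simply the share of the page's words taken up by the phrase. Below is a minimal sketch of that conventional calculation (occurrences × words in the phrase ÷ total words × 100); the tokenisation and sample copy are illustrative assumptions, not how any particular engine or tool counts words.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of words in `text` accounted for by `phrase`
    (occurrences * words_in_phrase / total_words * 100)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    total, n = len(words), len(phrase_words)
    if total == 0 or n == 0:
        return 0.0
    occurrences = sum(
        1 for i in range(total - n + 1) if words[i:i + n] == phrase_words
    )
    return occurrences * n / total * 100

page_copy = (
    "A rewards credit card can offset fees. Compare credit card offers "
    "before applying, and read the credit card terms carefully."
)
print(keyword_density(page_copy, "credit card"))  # 30.0 (3 occurrences x 2 words, of 20 words)
```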

As a general rule, if keyword frequency is measured against a trusted internal content corpus (e.g. an internal site search or a curated database of known, trusted content authors), higher is generally better.

For a broad corpus of external content (such as a general web search, where many people are motivated to try to game the system), lower is usually better.

Google Web Classifier

When Google released the first Penguin update in April 2012, it also introduced some page classifiers that penalised some pages with too many repeated words.

Lazy, uninformed, cheap outsourced writing tends to be quite repetitive, partly because people who pay per word for cheap content give writers an incentive to add words, no incentive to trim fat and no incentive to do thorough research. Google's guidelines for remote quality raters state that raters should give poor ratings to thin, repetitive content with little useful information.

Nowadays, the main purpose of such analysis tools is not to push keyword density higher, but to reduce the focus on the core keywords by including alternative word forms, acronyms, synonyms and other supporting terms.

High density: the benefit of aggressive repetition (in terms of improving rankings for the core keyword) is relatively small, and high keyword density increases the likelihood that the page will be filtered out.

Low density (with variation): the benefit of greater word variation (in terms of improving rankings for a broad range of related terms) is significant, and low keyword density on the core terms reduces the risk of the page being filtered.
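To make the contrast concrete, the sketch below counts how often a core phrase and a handful of related variants appear in two versions of page copy; the phrase list and sample sentences are invented for illustration. Repetitive copy scores only on the core phrase, while varied copy covers a broader basket of related terms.

```python
import re
from collections import Counter

def phrase_counts(text, phrases):
    """Count case-insensitive, whole-word occurrences of each phrase in the text."""
    counts = Counter()
    lowered = text.lower()
    for phrase in phrases:
        pattern = r"\b" + re.escape(phrase.lower()) + r"\b"
        counts[phrase] = len(re.findall(pattern, lowered))
    return counts

repetitive_copy = "Credit card, credit card, credit card: apply for a credit card today."
varied_copy = ("Compare rewards cards and balance transfer offers, check issuer fees, "
               "and read the credit card agreement before applying.")

related_terms = ["credit card", "rewards cards", "balance transfer", "issuer", "agreement"]

for label, copy in (("repetitive", repetitive_copy), ("varied", varied_copy)):
    print(label, dict(phrase_counts(copy, related_terms)))
```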


Good vs. optimal vs. too high keyword density

Raw keyword density is a rather poor indicator of relevance, partly because of how easily it is spammed.

Early search technology was not very sophisticated because of hardware and software limitations. Those limitations forced early search engines, such as Infoseek, to rely heavily on page titles and other on-page signals to assess relevance. Over the last 15 years, search engines have become far more powerful thanks to Moore's Law, which has allowed them to incorporate additional information into their relevance algorithms. Google's biggest advantage over earlier competitors was its analysis of link data.

Other ranking factors

Search engines can give considerable weight to domain age, site authority, link anchor text, localisation and usage data.

Each major search engine has its own algorithm for determining these weightings.

Each search engine also has its own vocabulary system to help it understand related terms.

Some engines may give more weight to the domain-wide and off-page factors above, while others may give slightly more weight to on-page content.

The title of a page usually carries more weight than most other text on the page.

Meta keyword tags, comment tags and other less visible page elements may be given less weight than the visible page text. For example, most major search engines give no weight at all to the meta keywords tag.

Page text that is bolded, linked or placed in a headline may carry more weight than plain text.
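As a toy illustration of element weighting, the sketch below scores a term by where it appears on a page. The weights are invented purely for the example (search engines do not publish theirs); the point is only that a title or headline match counts for more than body copy, and that meta keywords count for nothing.

```python
# Illustrative only: these element weights are invented for the sketch.
ELEMENT_WEIGHTS = {
    "title": 3.0,          # page title text
    "h1": 2.0,             # headline text
    "bold": 1.5,           # bolded / emphasised text
    "body": 1.0,           # plain body copy
    "meta_keywords": 0.0,  # ignored by most major engines
}

def weighted_term_score(term: str, elements: dict) -> float:
    """Sum occurrences of `term` in each page element, scaled by that element's weight."""
    term = term.lower()
    return sum(
        ELEMENT_WEIGHTS.get(element, 1.0) * text.lower().count(term)
        for element, text in elements.items()
    )

page = {
    "title": "Keyword Density Checker",
    "h1": "Keyword density and SEO",
    "body": "Keyword density is a rough relevance signal at best.",
    "meta_keywords": "keyword density, keyword density, keyword density",
}
print(weighted_term_score("keyword density", page))  # 6.0: title + h1 + body count, meta ignored
```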

Weighting is relative.

If the entire page copy is wrapped in an H1 tag, nothing stands out, so the H1 markup adds little or no extra weight to the text.

Tactics such as bolding H1 text are probably best avoided, as it is doubtful that they make the page appear any more important.

An excessive focus on density is flawed in a number of ways.

When people focus too much on density, they tend to write content that people don't want to read or link to.

Many queries are rare in nature: around 20-25% of search queries are unique. When webmasters rewrite page copy to arbitrarily increase density, they often strip out the modifiers that make the page relevant to many three-, four-, five- and six-word search queries.
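A quick way to see why modifiers matter: the sketch below enumerates the distinct three-to-six-word windows that contain a core term in two snippets of copy (the snippets are invented for illustration). Copy rich in modifiers exposes far more long-tail phrase combinations than copy that merely repeats the core term.

```python
import re

def long_tail_phrases(text: str, core: str, min_len: int = 3, max_len: int = 6) -> set:
    """Return the distinct 3-6 word windows of `text` that contain the core term."""
    words = re.findall(r"[a-z']+", text.lower())
    phrases = set()
    for n in range(min_len, max_len + 1):
        for i in range(len(words) - n + 1):
            window = " ".join(words[i:i + n])
            if core in window:
                phrases.add(window)
    return phrases

with_modifiers = "best low interest credit card for students with no annual fee"
stripped = "credit card credit card"

print(len(long_tail_phrases(with_modifiers, "credit card")))  # 13 distinct long-tail windows
print(len(long_tail_phrases(stripped, "credit card")))        # only 3
```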

Semantically focused algorithms may take supporting terms into account when determining the relevance of a page. If you removed your target keyword phrase from the page copy, could search engines still mathematically model what that phrase is and how your page relates to it from the supporting text alone? If so, your rankings will be more stable and you are likely to rank for a wider range of related keywords.
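One way to approximate that thought experiment is to compare the page's supporting vocabulary (with the target phrase removed) against candidate topics using TF-IDF cosine similarity. The sketch below uses scikit-learn for this; it is a rough stand-in for the far more sophisticated semantic models search engines actually use, and the sample texts are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Page copy with the target phrase "credit card" deliberately removed.
page_without_phrase = (
    "Compare annual fees, interest rates, rewards points and balance "
    "transfer offers from major card issuers before you apply."
)

candidate_topics = [
    "credit card offers and interest rates",
    "garden furniture and outdoor decking",
]

# Fit TF-IDF on the page plus the candidate topics, then compare vectors.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([page_without_phrase] + candidate_topics)
scores = cosine_similarity(matrix[0], matrix[1:]).ravel()

for topic, score in zip(candidate_topics, scores):
    print(f"{score:.2f}  {topic}")
# The supporting vocabulary alone should score the credit-card topic higher.
```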

Get a competitive advantage today

Your biggest competitors have been investing in their marketing strategy for years.

Now you can find out exactly where they rank, pick your top keywords and track new opportunities as they arise.

Find out your competitors' Google and Bing rankings with this free tool today.

