Keyword Density

TL;DR. Keyword density is the percentage of times a target keyword appears in a page's total word count. It was a useful proxy in the early 2000s. Today it is a hygiene check against keyword stuffing, not a ranking lever.

What is Keyword Density?

Also known as: KW density, Keyword frequency

Keyword density is the number of times a target keyword or phrase appears in a webpage's body content, divided by the total word count and expressed as a percentage. It is the oldest on-page SEO metric still in active use. Marketers in 2002 treated it as a primary ranking lever. Most modern SEOs treat it as a hygiene check.

The metric answers a single question. How often does the page mention the term it wants to rank for, relative to the rest of the copy? A 1,000-word article with 10 mentions of "keyword density" has a 1 percent density.

Density is a frequency ratio. It is not a quality signal. Google has not used raw frequency as a primary ranking factor in over a decade.

How to calculate keyword density

The formula is simple. Two inputs. One percentage out.

Keyword density = (keyword occurrences / total words) × 100

Run the count over the visible body text only. Skip the navigation, footer, and boilerplate. Most SEO crawlers pull the main content block automatically.

Worked example. A 1,200-word product page mentions "running shoes" 14 times. Density equals (14 / 1,200) × 100, or 1.17 percent. That is inside the typical 0.5 to 2 percent range most ranking pages occupy.
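The arithmetic above takes a few lines of Python. A minimal sketch, not any particular tool's implementation:

```python
def keyword_density(occurrences: int, total_words: int) -> float:
    """Return keyword density as a percentage of total body words."""
    if total_words <= 0:
        raise ValueError("total_words must be positive")
    return occurrences / total_words * 100

# The worked example: 14 mentions of "running shoes" in 1,200 words.
print(round(keyword_density(14, 1200), 2))  # 1.17
```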

Three counting rules trip up first-timers:

  • Count exact matches and close stems. "Running shoe" and "running shoes" both count.
  • Plural and possessive forms count once each.
  • Anchor text in internal links counts. Image alt text counts. URL slugs do not.
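The first two counting rules can be approximated with a regex. The trailing-s pattern below is a toy stemmer for illustration; real crawlers use proper stemming:

```python
import re

def count_stem_matches(text: str, phrase: str) -> int:
    """Count a phrase plus simple plural/possessive variants.

    Toy stemming: each word in the phrase may carry a trailing
    "s" or "'s", so "running shoe" also matches "running shoes"
    and "running shoe's".
    """
    words = [re.escape(w) + r"(?:'s|s)?" for w in phrase.lower().split()]
    pattern = r"\b" + r"\s+".join(words) + r"\b"
    return len(re.findall(pattern, text.lower()))

body = "These running shoes beat last year's running shoe lineup."
print(count_stem_matches(body, "running shoe"))  # 2
```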

Most teams use a tool. Yoast, Surfer, and the free SEO Review Tools density checker all return density figures plus a flag if the page sits outside the typical band.

Why keyword density used to matter (and why it doesn't much anymore)

Keyword density was a real ranking signal in early search. AltaVista, Inktomi, and early Google all weighted lexical match heavily. Pages with the target term repeated five or six times beat pages with two or three. Marketers caught on. SEO consultants in 2003 published "ideal" density ranges between 2 and 5 percent.

The arms race ended fast. Search engines began penalizing obvious repetition. Three Google updates closed the door:

Update  | Year | What it changed
Florida | 2003 | First major demotion of keyword-stuffed pages
Panda   | 2011 | Site-wide quality signal, demoted thin and over-optimized content
BERT    | 2019 | Bidirectional language model, started reading meaning over frequency

By the time MUM rolled out in 2021, raw density had stopped predicting rank in any consistent way. Per Ahrefs, correlation studies between exact-match keyword density and ranking position have hovered near zero since 2016.

The metric stuck around because it is easy to measure. Marketers love a number. Density is a number. That alone keeps it on most on-page checklists, even though the predictive power is gone.

What Google looks at instead in 2026 (entities, semantic relevance, helpful content)

Google's ranking systems read meaning, not term frequency. Three signals replaced density as the dominant on-page levers.

Entities and the knowledge graph. Google parses the entities a page covers, not just the keywords. A page about "running shoes" should also mention pronation, midsole foam, drop, and brand names. The knowledge graph rewards topical coverage, not repetition.

Semantic relevance and search intent. BERT and MUM evaluate whether the page actually answers the query. A 1,500-word piece that uses the target keyword once and matches search intent outranks a 600-word piece with 12 forced mentions. Per Google Search Central, content should be written for people first, not for crawlers.

The helpful content system and E-E-A-T. Google's helpful content classifier and the E-E-A-T framework reward original value. First-hand experience, named authors, primary citations, and a clear "last reviewed" date now carry more weight than any lexical metric. On-page SEO work in 2026 is closer to journalism than to keyword math.

The shift is permanent. Optimizing density above 2 percent in 2026 risks demotion. Optimizing topical coverage and clarity earns rankings.

Avoiding keyword stuffing

Keyword stuffing is repeating a term so often it disrupts readability or signals manipulation to Google. It is a confirmed spam policy violation.

Google's spam policies list four common stuffing patterns:

  • Lists of cities, product variants, or phone numbers with no surrounding context.
  • Blocks of text that repeat the same word or phrase to the point of nonsense.
  • Hidden text in white-on-white CSS, off-screen positioning, or zero-pixel fonts.
  • "Doorway" pages built around tiny variations of one query.

Stuffing triggers two response paths. The SpamBrain algorithmic system demotes the page silently. A manual action from a Google reviewer can demote the entire domain. Recovery from a manual action takes weeks of cleanup and a reconsideration request.

Three habits keep density honest:

  1. Write the page for the reader. Edit for SEO once.
  2. Use semantic variants. "Keyword frequency," "keyword usage," "term repetition" all serve the same topic.
  3. Run the page through a density checker before publishing. Anything above 2.5 percent gets a rewrite.
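Habit 3 is trivially scriptable. A sketch; the 2.5 percent ceiling mirrors the checklist and is the only assumption:

```python
DENSITY_CEILING = 2.5  # percent; the checklist's rewrite threshold

def needs_rewrite(keyword_count: int, total_words: int) -> bool:
    """Flag a draft whose primary-keyword density exceeds the ceiling."""
    return keyword_count / total_words * 100 > DENSITY_CEILING

print(needs_rewrite(14, 1200))  # False: 1.17 percent, inside the band
print(needs_rewrite(40, 1200))  # True: 3.33 percent, rewrite it
```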

A page that reads naturally to a human almost never trips the stuffing threshold.

Real-world example: a content audit with numbers

A direct-to-consumer mattress brand audits its 40 highest-traffic blog posts. The trigger: organic traffic flat for six months, despite weekly publishing.

The audit pulls keyword density across all 40 URLs. Three buckets emerge:

  • 9 pages above 3 percent density on the primary keyword. Average position: 27.
  • 24 pages between 0.5 and 2 percent. Average position: 11.
  • 7 pages below 0.4 percent. Average position: 19.
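The bucketing step can be sketched as a function. The cutoffs mirror the audit bands above (with the small gaps between bands closed); the URLs and densities are hypothetical:

```python
def density_bucket(density_pct: float) -> str:
    """Sort an audited URL into one of the three audit bands."""
    if density_pct > 3.0:
        return "over-optimized"   # rewrite candidates
    if density_pct >= 0.5:
        return "in band"          # leave alone
    return "under-covered"        # may need more topical depth

# Hypothetical URLs, one per bucket.
audit = {"/best-mattress": 3.4, "/mattress-sizes": 1.1, "/pillow-care": 0.3}
for url, density in audit.items():
    print(url, density_bucket(density))
```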

The team rewrites the 9 high-density pages. They cut redundant repetition, replace exact-match phrases with semantic variants, and add two original data tables per post. Average density after rewrite: 1.2 percent. Total word count goes up 22 percent.

Twelve weeks later, the same 9 pages move from average position 27 to position 9. Organic clicks across the cluster climb from 4,100 to 11,800 per month. No new backlinks. No new pages.

The lesson holds. Density above the band hurts. Density inside the band stops mattering once the content is genuinely helpful. The lift came from depth, not from a magic ratio.

Keyword density tools

Most on-page audits surface density as a side metric. Three categories cover the workflow.

Free density checkers.

  • SEO Review Tools density checker. Browser-based. Returns density figures plus a flag if the page sits outside the typical band.

On-page editors with density built in.

  • Yoast SEO. WordPress plugin. Flags density outside 0.5 to 2.5 percent and suggests semantic variants.
  • Surfer SEO. Cloud editor. Compares the draft against the top 20 ranking pages on the same query and recommends a target frequency band.

Site-wide audit crawlers.

  • Screaming Frog. Pulls word count and primary keyword frequency for every crawled URL. Free up to 500 URLs.
  • Ahrefs Site Audit. Surfaces over-optimized pages alongside other on-page issues.

The right answer for most teams is one editor, one crawler. Yoast or Surfer in the writing seat. Screaming Frog or Ahrefs for the quarterly check. Density sits inside a broader keyword research and on-page workflow, not as a standalone metric.

Frequently asked questions

What is a good keyword density in 2026?

There is no official target. Most well-ranking pages land between 0.5 and 2 percent for the primary keyword, calculated as occurrences divided by total words, multiplied by 100. Google has not published a recommended density. Per Google Search Central, repeating a keyword to manipulate ranking violates spam policy.

Does keyword density still affect Google rankings?

Barely. Google uses semantic models like BERT and MUM to read meaning, not raw frequency. A page with one mention of the target keyword and rich topical coverage often outranks a page with 30 mentions and thin context. Density still acts as a guardrail against keyword stuffing, not a positive signal.

How do you calculate keyword density?

Divide the number of times the keyword appears by the total word count, then multiply by 100. A 1,000-word page with 8 mentions of the target term has 0.8 percent density. Most SEO tools, including Yoast and Surfer, compute this automatically and flag values above roughly 2.5 percent.

What is keyword stuffing?

Keyword stuffing is repeating a term unnaturally to game rankings. Examples include hidden text, lists of cities or product variants, and paragraphs that read like a thesaurus exercise. Google's spam policies treat stuffing as a manual action trigger and a target for the SpamBrain algorithmic system.

Should I use an exact-match keyword in every heading?

No. Use the primary keyword in the title, the H1, and one or two H2s where it reads naturally. Force it into every heading and the page reads like SEO spam. Moz recommends natural language and semantic variants over exact-match repetition in 2026.
