SEO Metrics: Every Metric Explained (and When to Care)

A comprehensive reference guide to every SEO metric — what it measures, why it matters, and when to act on it. Organized by category: visibility, traffic, engagement, technical, conversions, and backlinks.

Climer Team · March 5, 2026 · 14 min read

There are more than 200 factors Google is believed to consider when ranking pages. The number of metrics you can track in your analytics stack easily reaches the hundreds. Most teams track far too many and act on far too few.

This guide covers every major SEO metric — organized by category, with a clear explanation of what each one measures, why it matters, and when you should actually pay attention to it. Use it as a reference, not a checklist: not every metric belongs in your reporting stack.


What makes an SEO metric worth tracking

Before the list, the filter: a metric is worth tracking if it changes in response to something you do, and if it tells you what to do differently when it moves in the wrong direction.

Metrics that fail this test — impressions without CTR context, domain authority scores, total indexed page counts — are background noise. They may be interesting, but they don't drive decisions.

The useful SEO metrics fall into six categories: visibility, traffic, engagement, technical health, conversions, and backlinks. A complete metrics stack covers all six.


Visibility metrics

Impressions — how many times your pages appeared in Google search results for any query. Impressions measure reach, not results. A page can earn 10,000 impressions with 50 clicks if the query is irrelevant to what the page offers, or if the ranking position is too low to generate meaningful clicks. Track impressions alongside CTR; on their own they're an incomplete signal.

Average position — the mean SERP ranking for a page or keyword, reported by Google Search Console. Useful as a trend metric, not as an absolute. A page averaging position 7.3 that moves to 5.1 is a meaningful improvement worth investigating. A page averaging position 7.3 for six months is signaling that it's stuck near the bottom of page 1 — often the highest-leverage optimization opportunity in your content library.

Keyword rankings — the tracked search positions for specific target keywords across devices and locations. This is the primary leading indicator in SEO: ranking movements precede traffic changes by roughly two to six weeks. Track rankings at three levels: your target keywords, position movers (keywords that gained or lost three or more positions since last period), and page-1 boundary keywords sitting at positions 11–20.
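The three tracking levels above reduce to a simple bucketing pass over two ranking snapshots. A minimal sketch, using hypothetical keywords and positions (the function name and data are illustrative, not from any specific rank tracker):

```python
# Hypothetical ranking snapshots: keyword -> (previous position, current position).
rankings = {
    "seo reporting tool": (14, 9),
    "rank tracker": (6, 7),
    "ai seo audit": (22, 18),
    "white label seo": (12, 12),
}

def bucket_keywords(rankings, move_threshold=3):
    """Split keywords into position movers (gained or lost >= move_threshold
    positions) and page-1 boundary keywords (currently at positions 11-20)."""
    movers, boundary = [], []
    for kw, (prev, curr) in rankings.items():
        if abs(prev - curr) >= move_threshold:
            movers.append(kw)
        if 11 <= curr <= 20:
            boundary.append(kw)
    return movers, boundary

movers, boundary = bucket_keywords(rankings)
# "seo reporting tool" and "ai seo audit" are movers; "ai seo audit" and
# "white label seo" sit in the page-1 boundary zone.
```

A keyword can land in both buckets, as "ai seo audit" does here: it moved four positions and still sits just off page 1, which makes it exactly the kind of near-win worth prioritizing.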

Share of voice — your percentage of total possible organic clicks for a defined keyword set, compared to competitors. SOV requires a rank tracker, not just GSC, and a defined competitor set. It's the only metric that tells you whether you're winning or losing relative to the competition — not just whether your absolute numbers are growing.
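The SOV calculation itself is straightforward once you have a CTR-by-position model. A minimal sketch; the CTR curve below is an illustrative assumption, not a published benchmark, and real rank trackers fit their own curves:

```python
# Illustrative (assumed) click-through rates by SERP position.
CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def share_of_voice(keyword_data):
    """keyword_data: list of (monthly_search_volume, your_position or None).
    Returns your estimated clicks as a share of the maximum possible clicks
    (i.e. ranking #1 for every keyword in the set)."""
    your_clicks = total_clicks = 0.0
    for volume, position in keyword_data:
        total_clicks += volume * CTR_BY_POSITION[1]
        if position in CTR_BY_POSITION:
            your_clicks += volume * CTR_BY_POSITION[position]
    return your_clicks / total_clicks

# Three keywords: ranked #1, ranked #3, and not ranking in the top 5.
sov = share_of_voice([(1000, 1), (500, 3), (2000, None)])
```

The same function run against a competitor's positions for the identical keyword set gives you the relative comparison that makes SOV useful.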

SERP feature ownership — whether your pages appear in featured snippets, image packs, video carousels, People Also Ask boxes, or local packs for target queries. Earning a featured snippet for a query you're already ranking for increases CTR meaningfully. Losing a featured snippet you previously held will show as a position and CTR drop even if your underlying ranking hasn't changed.

AI Overview citation rate — whether your content is cited as a source in Google's AI-generated answer boxes. This is the newest visibility metric and increasingly consequential. Research from Seer Interactive found that organic CTR dropped 61% for queries that triggered AI Overviews — but sites whose content appeared as cited sources reversed that decline. Tracking AI citation rate requires either manual monitoring or a tool like Climer that queries AI systems on your behalf.


Traffic metrics

Organic sessions — the total volume of visits arriving from organic search. This is the headline traffic metric. Track it month-over-month and year-over-year, and always segment branded versus non-branded: branded traffic growth reflects awareness of your company name, while non-branded growth reflects new audience acquisition.

Organic click-through rate (CTR) — clicks divided by impressions, from Google Search Console. CTR tells you how compelling your title and meta description are relative to the competition. Benchmarks vary by position and query type, but a useful rule: if your page ranks in the top three but earns below 10% CTR, the meta title or description is the problem — not the ranking.

AI Overviews have made average CTR benchmarks less reliable as universal standards. For queries that trigger AI Overviews, the CTR curve has shifted downward significantly across all positions. The right CTR benchmark is your own baseline by query type: compare this month's CTR to last month's for the same query set.
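The top-3-below-10% rule above is easy to automate against a Search Console export. A minimal sketch, with hypothetical query rows (the function name and sample data are illustrative):

```python
def ctr_flags(gsc_rows, min_top3_ctr=0.10):
    """gsc_rows: (query, clicks, impressions, avg_position) tuples, e.g. from
    a GSC performance export. Returns queries ranking in the top 3 whose CTR
    falls below the threshold -- the 'title/description problem' pattern."""
    flagged = []
    for query, clicks, impressions, position in gsc_rows:
        ctr = clicks / impressions if impressions else 0.0
        if position <= 3 and ctr < min_top3_ctr:
            flagged.append((query, round(ctr, 3)))
    return flagged

rows = [
    ("buy widgets", 40, 1000, 2.1),     # top 3 but 4% CTR -> flagged
    ("widget reviews", 300, 1000, 1.5), # top 3 with 30% CTR -> fine
    ("cheap widgets", 50, 2000, 8.0),   # low CTR but not top 3 -> ranking issue
]
flagged = ctr_flags(rows)
```

Run this monthly against the same query set and the flagged list becomes your meta-title rewrite queue.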

Branded vs. non-branded traffic split — the ratio of sessions from people searching your company or product name versus people searching generic terms. Most analytics setups require a custom filter in GA4 or Search Console to separate these accurately. For early-stage companies, non-branded growth is the more important signal — it means you're acquiring audiences who didn't know you existed.
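The custom filter is usually just a regex over query strings. A minimal sketch, assuming "climer" as the brand term (in GA4 or Search Console you would apply the same pattern as a query filter):

```python
import re

# Hypothetical brand term; word boundaries avoid matching the brand name
# inside unrelated words.
BRAND_PATTERN = re.compile(r"\bclimer\b", re.IGNORECASE)

def split_branded(queries):
    """Partition search queries into branded and non-branded lists."""
    branded = [q for q in queries if BRAND_PATTERN.search(q)]
    non_branded = [q for q in queries if not BRAND_PATTERN.search(q)]
    return branded, non_branded

branded, non_branded = split_branded([
    "climer pricing",
    "seo reporting tool",
    "Climer vs ahrefs",
    "rank tracker",
])
```

In practice the pattern should also cover common misspellings of the brand name, since those queries are branded intent even when the spelling is off.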

New vs. returning organic visitors — a GA4 metric that indicates whether SEO is expanding your audience or repeatedly sending the same people back to your site. For content-heavy sites, a growing proportion of new visitors is a health signal. For SaaS companies, understanding how returning organic visitors behave (and whether they convert at higher rates) matters for attribution.

Organic traffic by page — which specific pages drive the most organic traffic, from Search Console. Tracking top performers month-over-month surfaces both wins (new content building traction) and problems (previously strong pages starting to decline). A page that earns zero impressions four weeks after publishing may have an indexing issue; one that earns impressions but no clicks has a title or description problem.


Engagement metrics

Engagement rate — the percentage of sessions classified as "engaged" by GA4. An engaged session lasts longer than 10 seconds, has a conversion event, or has two or more page views. Engagement rate replaced bounce rate as Google Analytics 4's primary engagement metric. Industry benchmarks average around 56%, but the more useful comparison is across your own content: pages with significantly below-average engagement rates often indicate content-intent mismatch.
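GA4's engaged-session definition is simple enough to express directly, which is useful when you're reconciling GA4 numbers with server-side analytics. A minimal sketch (function names and sample sessions are hypothetical):

```python
def is_engaged(duration_seconds, had_conversion, page_views):
    """GA4's engaged-session definition: longer than 10 seconds, OR a
    conversion event, OR two or more page views."""
    return duration_seconds > 10 or had_conversion or page_views >= 2

def engagement_rate(sessions):
    """sessions: iterable of (duration_seconds, had_conversion, page_views)."""
    sessions = list(sessions)
    if not sessions:
        return 0.0
    engaged = sum(1 for s in sessions if is_engaged(*s))
    return engaged / len(sessions)

rate = engagement_rate([
    (5, False, 1),   # bounced: short, no conversion, one page
    (30, False, 1),  # engaged: over 10 seconds
    (3, True, 1),    # engaged: converted
    (8, False, 3),   # engaged: multiple page views
])
```

Note that the conditions are OR-ed: a three-second session that fires a conversion event still counts as engaged.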

Average engagement time — the mean time users actively interact with your site during a session, from GA4. This replaces the old "time on page" metric, which was unreliable because it couldn't measure the last page of a session. Average engagement time is a better proxy for content quality than session duration.

Scroll depth — how far down a page users scroll before leaving. Typically tracked as a percentage of page height (25%, 50%, 75%, 90%). For long-form content, low scroll depth combined with low engagement time suggests the content isn't holding attention past the introduction — the page may be answering the query in the first paragraph and losing readers before reaching the conversion CTA.

Pages per session from organic — how many pages organic visitors view during a single visit. Higher numbers indicate strong internal linking and content relevance. Very high pages per session (6+) combined with long session time usually signals engaged readers; very high pages per session combined with low engagement time can indicate confusing navigation.



Technical health metrics

Core Web Vitals — Google's three primary site experience signals, measured against real user data from the Chrome User Experience Report (CrUX):

| Metric | Measures | Good | Needs Improvement | Poor |
| --- | --- | --- | --- | --- |
| LCP (Largest Contentful Paint) | Loading speed | < 2.5s | 2.5–4.0s | > 4.0s |
| INP (Interaction to Next Paint) | Interactivity | < 200ms | 200–500ms | > 500ms |
| CLS (Cumulative Layout Shift) | Visual stability | < 0.1 | 0.1–0.25 | > 0.25 |

Google assesses the 75th percentile of real-user visits: a page passes only when at least 75% of its visits score "Good" on all three metrics in CrUX data. As of 2025, only 48% of mobile pages and 56% of desktop pages pass all three — LCP is the most common failure point, with only 62% of pages scoring Good on that metric alone.

Pages with Core Web Vitals in the "Poor" band face a measurable ranking disadvantage relative to comparable pages with Good scores. Use Search Console's Core Web Vitals report (which uses field data) rather than Lighthouse scores (which are lab data) for the metric Google actually uses.
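The thresholds in the table above translate directly into a pass/fail check you can run against your own CrUX field data. A minimal sketch; function names are hypothetical, and the inputs are the 75th-percentile field values:

```python
def cwv_band(metric, value):
    """Classify a 75th-percentile field value into Google's three bands,
    using the thresholds from the table above."""
    good_max, poor_min = {
        "LCP": (2.5, 4.0),   # seconds
        "INP": (200, 500),   # milliseconds
        "CLS": (0.1, 0.25),  # unitless layout-shift score
    }[metric]
    if value < good_max:
        return "Good"
    if value <= poor_min:
        return "Needs Improvement"
    return "Poor"

def page_passes(lcp_s, inp_ms, cls):
    """A page passes Core Web Vitals only when all three values are Good."""
    checks = [("LCP", lcp_s), ("INP", inp_ms), ("CLS", cls)]
    return all(cwv_band(m, v) == "Good" for m, v in checks)
```

Because passing requires all three, a single slow metric — almost always LCP — fails the page even when the other two are comfortably Good.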

Crawl coverage — the ratio of submitted pages that Google has crawled and indexed. A significant gap between submitted and indexed pages typically points to duplicate content, thin content being filtered out, or sitemap misconfiguration. Track crawl coverage trend after deployments — a noindex tag accidentally applied to a category or product page shows up as a sudden drop here.

Crawl errors — 404s, redirect chains, server errors, and soft 404s surfaced in Search Console's Index Coverage report. These are better monitored via alerts than monthly reviews. A 404 spike typically traces to a specific deployment or URL structure change; a server error spike may indicate a hosting or caching problem.

Index ratio — pages indexed divided by total pages submitted. An index ratio significantly below 1.0 isn't automatically a problem — not every page on your site should be indexed. The diagnosis depends on which pages are excluded: intentional noindex on duplicate or parameter pages is healthy; valuable content pages being excluded by Google is a problem worth investigating.

Page speed by real user data — LCP and INP scores from CrUX (real user data), not Lighthouse (lab data). Lighthouse scores are useful for debugging; CrUX data is what affects rankings. A page that scores 95 in Lighthouse but serves a low-bandwidth audience on slow mobile connections may still fail Core Web Vitals.

Redirect chains — the number of hops between the original URL and the final destination. Two-hop redirects (A → B → C) bleed link equity and slow page load times. Audit redirect chains during technical reviews and collapse multi-hop chains to single redirects.
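Collapsing chains is mechanical once you've exported a source-to-target redirect map from a crawler. A minimal sketch, assuming a plain dict of redirects (the export format is hypothetical):

```python
def collapse_chains(redirects):
    """redirects: dict mapping each source URL to its immediate target.
    Returns a new map where every source points straight at its final
    destination, so A -> B -> C becomes A -> C and B -> C."""
    collapsed = {}
    for source in redirects:
        target, seen = redirects[source], {source}
        # Follow hops until we reach a URL that doesn't redirect; the `seen`
        # set guards against redirect loops.
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        collapsed[source] = target
    return collapsed

flat = collapse_chains({"/a": "/b", "/b": "/c", "/x": "/y"})
```

The output is the redirect map you actually want deployed: every legacy URL resolves in a single hop.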


Conversion metrics

Organic conversions — goal completions (sign-ups, purchases, demo requests, form submissions) attributed to organic search traffic in GA4. This is the metric that closes the loop between SEO activity and business outcomes. Without it, you're reporting on activity rather than results. Organic conversions require conversion event setup in GA4 — they don't appear automatically.

Organic conversion rate — organic conversions divided by organic sessions. Benchmarks vary significantly by industry and conversion type. The more useful comparison: your organic conversion rate versus your paid conversion rate and your overall site conversion rate. A much lower organic conversion rate than other channels may indicate a mismatch between your organic audience's intent and your site's offer.

Organic-attributed revenue — for e-commerce and SaaS with tracked purchases or trial conversions, the total revenue attributable to organic search. The attribution model used significantly affects this number. Last-touch attribution undervalues organic search because SEO content often initiates journeys that close through direct or paid channels. GA4's data-driven attribution gives a more accurate picture.

Organic-influenced pipeline — for B2B teams with CRM connections to analytics, the value of sales opportunities where organic search was an initial or key touchpoint. This metric requires more setup than basic conversion tracking, but it's what revenue leaders and CFOs respond to when evaluating SEO budget.

Cost per organic conversion — total SEO spend divided by organic conversions for a period. This enables direct comparison with paid channel CPCs and CPAs. Research from First Page Sage found SEO-sourced leads average $31 per conversion — significantly below paid channel averages for most B2B categories.
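Both conversion formulas above are simple ratios, shown here with hypothetical numbers for one month (function names are illustrative):

```python
def organic_conversion_rate(conversions, sessions):
    """Organic conversions as a share of organic sessions."""
    return conversions / sessions if sessions else 0.0

def cost_per_organic_conversion(seo_spend, conversions):
    """Total SEO spend (content, tools, labor) over conversions for the period."""
    return seo_spend / conversions if conversions else float("inf")

# Hypothetical month: $6,000 SEO spend, 8,000 organic sessions, 120 conversions.
rate = organic_conversion_rate(120, 8000)      # 0.015, i.e. 1.5%
cpa = cost_per_organic_conversion(6000, 120)   # $50 per conversion
```

Computing the same CPA for paid channels from ad spend and paid conversions gives the direct comparison the paragraph above describes.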


Backlink metrics

Referring domains — the number of unique domains linking to your site. This is the primary external authority signal for Google: a single domain can link to you hundreds of times, but each unique domain counts once toward your link profile. Track referring domain count as a trend; steady growth from relevant, authoritative sources is the healthy pattern.

Domain Rating (DR) / Domain Authority (DA) — aggregate authority scores from Ahrefs (DR) and Moz (DA) respectively. These are useful for competitive benchmarking — comparing your link profile strength against competitors ranking for your target keywords — but unreliable as absolute targets. A DR 40 site can outrank a DR 70 site with stronger topical authority and better content relevance.

New vs. lost backlinks — the net change in your link profile over a period. Monitoring for link losses from high-authority domains is as important as tracking new acquisition. A significant drop in DR or DA without a technical explanation often means high-authority inbound links were removed or the linking page was deleted.

Anchor text distribution — the breakdown of anchor text used by sites linking to you. An anchor text profile that's heavily weighted toward exact-match keywords is a risk signal — it can look manipulative to Google. A healthy profile includes branded anchors, partial-match anchors, URL anchors, and natural phrase anchors.
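Auditing that distribution is a classification pass over your exported anchor texts. A rough sketch — the brand term, keyword list, and bucket rules here are hypothetical simplifications of what backlink tools do:

```python
from collections import Counter

def classify_anchor(anchor, brand="climer", keywords=("seo reporting tool",)):
    """Assign an anchor text to one of five rough buckets. Branded is checked
    first, then exact-match before partial-match, then URL-style anchors."""
    text = anchor.lower().strip()
    if brand in text:
        return "branded"
    if text in keywords:
        return "exact-match"
    if any(kw in text for kw in keywords):
        return "partial-match"
    if text.startswith(("http://", "https://", "www.")):
        return "url"
    return "natural"

def anchor_distribution(anchors):
    """Share of each bucket across all anchors linking to you."""
    counts = Counter(classify_anchor(a) for a in anchors)
    total = sum(counts.values())
    return {bucket: n / total for bucket, n in counts.items()}

dist = anchor_distribution([
    "Climer",
    "seo reporting tool",
    "best seo reporting tool guide",
    "https://example.com",
    "this article",
])
```

The number to watch is the exact-match share: if it dominates the distribution, the profile starts to look manipulative in the way the paragraph above describes.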

Link velocity — the rate at which new referring domains are added over time. Sudden spikes in link acquisition can trigger algorithmic scrutiny. Steady, consistent link growth is preferable to periodic spikes from link-building campaigns, though the most important factor is link quality, not velocity.


AI visibility metrics

AI visibility is the newest category in SEO measurement, and it's becoming impossible to ignore as more search queries are answered by AI systems rather than traditional blue-link results.

AI citation count — how many times your content is cited as a source in ChatGPT, Perplexity, Claude, Gemini, and Google AI Overviews for target queries. This is directional, not precise — AI model responses vary by query phrasing and are probabilistic. Consistent citation tracking requires running the same queries across AI platforms periodically.

AI Overview appearance rate — the percentage of your target keywords that trigger Google AI Overviews in which your content is cited. This is a more tractable measurement than full AI citation tracking because it's limited to Google Search Console queries and can be cross-referenced with your existing GSC data.

Brand mention sentiment in AI responses — whether AI systems describe your product accurately and positively when users ask about your category. This is more qualitative than quantitative, but it matters: if ChatGPT describes a competitor's tool as "the industry standard" when answering a query about your product category, you have an AI visibility problem regardless of your traditional rankings.

Climer's AI Radar feature tracks brand and competitor citations across AI platforms over time, which turns AI visibility from a spot-check into a trackable trend.


Metrics to stop tracking

Raw impressions without CTR context. Impressions can grow while clicks stay flat. They're a diagnostic, not a KPI.

Domain Authority as a primary goal. DA and DR are third-party calculations, not Google signals. Teams that optimize for DA end up acquiring low-quality links that boost the score without improving rankings.

Total indexed pages. More indexed pages is not inherently good. Thin, duplicate, or low-quality pages in the index are a quality liability.

Social shares. Social shares don't drive rankings. They're a content distribution metric.

Lighthouse performance scores as field data proxies. Lighthouse scores are lab-based. Core Web Vitals in Search Console use real user data from CrUX, which is what Google uses for ranking signals. A high Lighthouse score that doesn't translate to CrUX "Good" status is a false comfort.


Building a metrics stack

The minimum viable SEO metrics stack that covers all six categories:

Google Search Console (free) — visibility metrics (impressions, positions, CTR), crawl and indexation data, Core Web Vitals from real user data, and structured data errors.

Google Analytics 4 (free) — traffic metrics (organic sessions, branded/non-branded split), engagement metrics (engagement rate, average engagement time), and conversion tracking.

Rank tracker — keyword position tracking for your target terms beyond what GSC provides. Options range from SE Ranking and Mangools at the affordable end to Ahrefs and SEMrush for full competitive analysis.

Backlink monitor — Ahrefs, Moz, or SEMrush for referring domain tracking, DR/DA, and link acquisition/loss monitoring.

AI visibility tool — for teams that take AI search seriously. Climer's workspace dashboard consolidates keyword rankings, traffic trends, content performance, and AI citation tracking in one place — eliminating the context-switching between separate platforms that makes comprehensive metric reviews tedious.

