Vanity Metrics: Examples & Actionable Replacements
Vanity metrics are the numbers that look great in a board deck and rarely change a single decision. Followers, page views, total signups, app downloads, hours logged, calls made: all classic examples. They move easily, they go up over time even when the underlying business is flat, and they tend to dominate dashboards precisely because they make everyone feel good.
This guide goes deep on what counts as a vanity metric and why teams keep tracking these numbers anyway. It covers actionable replacements by channel and the cases where a "vanity" number is actually doing useful work. For the broader framework around what qualifies as a real KPI, see the KPI framework guide; this is the deep dive on the most common failure mode.
Quick Answer: What Is a Vanity Metric?
A vanity metric is a number that looks meaningful but does not drive a decision. The term was coined by Eric Ries in The Lean Startup (2011). He used it to describe metrics that "make us feel good but offer no clear guidance for what to do." The classic test: if the metric improved 50% next quarter, would the business demonstrably grow? If the answer is "not necessarily," the metric is vanity.
"Vanity metrics... numbers that make us feel good but offer no clear guidance for what to do." - Eric Ries, The Lean Startup (2011)
Real KPIs answer "what should we do next?" Vanity metrics answer "are we still growing?" The first runs the business; the second decorates the dashboard. The classifier widget in our KPI framework guide tests this directly: paste in your metric and the four-check rubric returns a verdict.
Why Teams Keep Tracking Them Anyway
If vanity metrics are so well-known, why do they keep ending up on dashboards? The answer is mostly psychological and political, not analytical.
They are easy to gather. Follower counts, page views, and impressions come free with the platform. Real KPIs (cohort retention, conversion by source, gross margin per project) require setting up the measurement and choosing what counts. Easy beats useful in most reporting cycles.
They are emotionally safe. A vanity metric that goes up tells the team they are succeeding without testing whether they actually are. A real KPI can go down, which forces a hard conversation. Teams that are tired or under pressure tend to gravitate to metrics that do not threaten their narrative of progress.
They impress executives and investors. "We hit 100,000 users" is easier to sell upstairs than "monthly retention dropped from 38% to 34%." A board member who sees a hockey-stick chart on followers feels reassured even when nothing meaningful is happening underneath. The pressure flows downward. Teams track the vanity number because that is what the boss wants to see, not because it answers a real question.
They let teams confuse correlation with causation. Vanity metrics correlate with real outcomes during good periods, which is why they stay on dashboards: followers and revenue both grew last year, so followers must matter. The link breaks under stress: a competitor launches, the algorithm changes, and followers stay flat while revenue collapses. By then it is too late to switch.
Knowing why vanity metrics persist is half the work. The other half is replacing them.
Vanity Metrics by Channel: Swap This for That
The fastest way to clean up a dashboard is to walk it channel by channel and swap the vanity number for the actionable replacement. The table below shows the swaps we see most often across teams, including the agency angle most public lists skip.
| Channel | Vanity Metric | Actionable Replacement |
|---|---|---|
| Social media | Followers, likes, impressions | Engaged followers who clicked through and converted; reply-to-impression ratio on key posts |
| Content / SEO | Page views, total traffic, time on page | Page views from organic search that produced an MQL; conversion rate of top-3 landing pages |
| Email | Open rate, total subscribers | Click-through to revenue-driving page; signup-to-paid conversion within 30 days |
| Paid ads | Impressions, total ad spend, clicks | Cost per acquired customer; return on ad spend (ROAS); LTV-to-CAC ratio |
| Product / SaaS | Total signups, app downloads, MAU | Activation rate (users who reach the "aha" milestone); week-4 retention; product-qualified leads |
| Sales | Calls made, demos booked, leads in CRM | Win rate by segment; pipeline coverage to quota; average deal size by channel |
| Customer support | Tickets closed, agent volume | First-contact resolution rate; CSAT after resolution; ticket reopens within 7 days |
| Agency / services | Total clients, hours logged, projects in flight | Project gross margin; billable utilization; net revenue retention; client NPS |
The pattern is consistent: vanity metrics measure exposure or activity, actionable metrics measure outcome. Likes become engaged-followers-who-converted. Page views become MQLs from organic search. Calls made become win rate by segment. Each swap forces the question "what is this work supposed to produce?" and tracks the answer instead of the activity.
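To make the paid-ads swaps in the table concrete, here is a minimal sketch of computing CAC, ROAS, and the LTV-to-CAC ratio. All figures are invented for illustration; plug in your own attribution data.

```python
# Hypothetical paid-ads figures for one period -- invented for illustration.
ad_spend = 12_000.0         # total spend
impressions = 900_000       # the vanity number: big, but decides nothing
new_customers = 60          # customers attributed to the campaign
revenue_from_ads = 30_000.0
avg_customer_ltv = 1_500.0  # lifetime value per acquired customer

cac = ad_spend / new_customers       # cost per acquired customer
roas = revenue_from_ads / ad_spend   # return on ad spend
ltv_to_cac = avg_customer_ltv / cac  # LTV-to-CAC ratio

print(f"CAC: ${cac:,.2f}")           # CAC: $200.00
print(f"ROAS: {roas:.1f}x")          # ROAS: 2.5x
print(f"LTV:CAC: {ltv_to_cac:.1f}")  # LTV:CAC: 7.5
```

Note that impressions never enter any of the three calculations: that is the swap in one line.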
"The single metric that best captures the core value that your product delivers to customers and is the key to driving sustainable growth." - Sean Ellis, on the North Star Metric
Sean Ellis's framing of the North Star Metric is the cleanest replacement test. Pick the one number that, if it kept rising, would mean the business is genuinely working. At Airbnb that number was nights booked; at Facebook, daily active users. At an agency, it might be project gross margin or net revenue retention. Whatever it is, the surrounding metrics either feed it or are vanity.
How to Spot a Vanity Metric in 30 Seconds
You do not need a long audit to identify vanity. Three quick tests usually do it.
The 50% test. Imagine the metric improved 50% next quarter. Would the business definitely be in better shape, or could it improve while revenue, retention, and margin all stayed flat? If the answer is "could go either way," it is vanity.
The next-step test. If the number drops 30% next month, does the team know what to do? A real KPI has a clear playbook attached. A vanity metric leaves people shrugging or scrambling for spin.
The aggregate-vs-cohort test. Most vanity metrics are aggregates that hide what is happening to specific groups. "10,000 active users" sounds healthy until you split it: 9,000 are last month's free trials cooling off, 1,000 are paying. The cohort view exposes the truth; the aggregate hides it.
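The cohort split in that example takes only a few lines to run. A minimal sketch, with invented user records standing in for your analytics export:

```python
from collections import Counter

# Invented user records: (user_id, cohort) -- illustration only.
users = (
    [(f"u{i}", "free_trial_last_month") for i in range(9_000)]
    + [(f"u{i}", "paying") for i in range(9_000, 10_000)]
)

aggregate = len(users)  # the vanity view: "10,000 active users"
by_cohort = Counter(cohort for _, cohort in users)

print(aggregate)                            # 10000
print(by_cohort["free_trial_last_month"])   # 9000 cooling off
print(by_cohort["paying"])                  # 1000 -- the number that pays the bills
```

The aggregate and the cohort view come from the same data; the only difference is whether you group before you report.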
Run any candidate metric through those three tests before adding it to a dashboard. Most fail at least one.
These tests work because they expose the gap between what a metric describes and what the team actually controls. Vanity metrics describe a state; actionable metrics describe an outcome the team is responsible for. Page views describe a state; MQLs from organic search describe an outcome marketing owns. Demos booked describe a state; win rate by segment describes an outcome sales owns. The shift in language is small, but the shift in accountability is large, which is exactly why the cleanup is uncomfortable.
When Vanity Metrics Are Actually Useful
The argument so far has been one-sided. The honest counter is that vanity metrics earn their place in two specific situations, and pretending otherwise reads as preachy.
"Vanity metrics aren't the ultimate measure of your success... at the beginning of a new product, process, or activity they do provide insight." - Jeff Gothelf, In Defense of Vanity Metrics
Early-stage signal. When you launch something new, you do not have conversion or retention data yet because no one has had time to convert or churn. Page views, signups, demo bookings, and downloads tell you whether the offer is even resonating. Once you have a real cohort to measure, those numbers should drop off the dashboard.
Brand-awareness phases. Some campaigns are explicitly about being seen, not converting this quarter. PR pushes, conference launches, and category-creation efforts use reach metrics (impressions, mentions, share of voice) as legitimate KPIs because awareness is the outcome. The trap is letting "awareness" stay on the board after the campaign ends.
The shared rule: a vanity metric is appropriate when no actionable alternative exists yet, and only until one does. The moment you have real data on what those impressions or signups produce, the vanity number gets retired.
Common Mistakes
The patterns below show up across teams that intend to do better on metrics and slowly drift back to vanity. Most of them come from social pressure rather than analytical confusion.
- Adding a vanity metric "just for the report." A metric on the dashboard "because the board likes to see it" is a problem the team will pay for later. It crowds out attention from the metrics that matter and trains everyone to expect feel-good numbers. Either the metric drives a decision or it gets cut.
- Confusing engagement metrics with conversion. Likes, comments, time on page, and shares are engagement; they describe how people interact with content. Conversion describes whether that interaction produced an outcome the business wanted. Most "engagement KPIs" are vanity until they are paired with the conversion metric they are supposed to predict.
- Tracking growth without a denominator. "We grew 40% this month" sounds great until you remember the base was 10. Always pair growth percentages with the absolute number, the cohort size, and the baseline before declaring victory. A growth rate without a denominator is the most common vanity metric dressed in respectable language.
- Defending a vanity metric with "it correlates sometimes." A weak correlation with real outcomes in good periods is not a reason to track a metric. The test is whether the team would change behavior if the metric moved. If a 30% drop in followers next quarter would not change a single decision, the metric is not predictive enough to track.
- Replacing one vanity metric with another. Swapping monthly active users for daily active users gives you a smaller vanity metric, not a real KPI. Both still tell you "people opened the app." A real replacement is something like "weekly users who completed the core action" (a billable transaction, a saved file, a sent message), not a smaller version of the same volume metric.
- Letting compensation ride on a vanity metric. Tying bonuses, OKRs, or performance reviews to vanity metrics is the fastest way to corrupt the team's behavior. People optimize for what is rewarded; if you reward followers, you get followers, often at the expense of the underlying business. Reward the actionable replacement, not the headline number.
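The growth-without-a-denominator mistake is pure arithmetic, so it is worth making the fix concrete. A small sketch (all figures invented) of a reporting helper that refuses to emit a percentage without its base:

```python
def growth_report(previous: int, current: int) -> str:
    """Pair the growth rate with the absolute numbers it came from."""
    pct = (current - previous) / previous * 100
    return f"{pct:+.0f}% ({previous} -> {current}, {current - previous:+d} absolute)"

print(growth_report(10, 14))        # +40% (10 -> 14, +4 absolute) -- base of 10
print(growth_report(5_000, 5_200))  # +4% (5000 -> 5200, +200 absolute)
```

The 40% line looks impressive only until the denominator is printed next to it, which is the entire point of the rule.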
Of these, the compensation mistake does the most damage. As soon as a bonus depends on follower growth or signup volume, the team will manufacture follower growth or signup volume, often at the cost of the underlying business. Tie compensation to the actionable replacement and the dashboard cleans itself up.
What We Recommend
At Rock we run an annual exercise we call the vanity sweep. Every team takes its current dashboard and runs each metric through the 50% test, the next-step test, and the cohort test. Anything that fails gets cut, or demoted to a "context" panel that nobody is graded on. The point of the sweep is not deletion. It is redirecting attention.
The cleanup creates room for two or three actionable KPIs the team will actually try to move that quarter. From there, every task on the board has to connect to one of those metrics. Activity that does not link to a real KPI is either work that should be killed or work the team has not learned to measure yet. The board below shows what that looks like for one quarter: two KPIs picked, six real tasks tagged by which metric each task moves.
Two KPIs, Six Real Tasks
Two KPIs the team is moving this quarter: MQLs from organic search (blue) and Project gross margin (green). Every task on the board moves one of them. Drag to Done as the team ships, or add your own.
The shape of that board is the deliverable. Two KPIs the team has agreed are real. Six (or more) activities that connect to one of those KPIs by name. No task on the board is housekeeping or "we should track this." Every card is work that, when shipped, will move one of the two numbers the team has committed to. That is the whole game once vanity is cleared away.
Pair this with the broader strategy stack and the measurement layer becomes coherent. SWOT, Strategic Choice Cascade, and PESTEL set the strategic direction. The OKR framework drives the change you are committing to this quarter. The KPI framework defines the standards you hold day to day. The OKR vs KPI guide covers the operational handoff. For function-specific application, see marketing KPIs, sales KPIs, and agency KPIs; the billable hours guide covers the operational input below all of them. This article is the discipline that keeps any of those measurements honest.
Run the vanity sweep with your team. Rock combines chat, tasks, and notes in one workspace. One flat price, unlimited users. Get started for free.