KPI Framework: Examples, Types & How to Choose Yours


KPIs are the most common performance-management tool in modern teams, and the most commonly misused. Most metrics teams call KPIs are not actually KPIs at all. They are result indicators, vanity metrics, or process measures dressed up in performance-management language.

This guide explains what genuinely qualifies as a KPI and how to choose a set that actually drives decisions. It includes examples by function (including agencies) and the mistakes that turn KPI dashboards into wallpaper. Use the classifier below to test whether the metric you have in mind is a real KPI before you build a scorecard for it.

Is It a KPI or a Vanity Metric?

Type a metric you are considering, then check the boxes that apply. The widget classifies it as a real KPI, a vanity metric, or a process measure, and gives you a starting scorecard if it qualifies.


Quick Answer: What Is a KPI?

A Key Performance Indicator (KPI) is a quantitative measure of performance against a specific business outcome. A real KPI has four properties. It is tied to a business outcome (revenue, retention, quality, cost). It is actionable (a deviation triggers a clear next step). It is measurable continuously (daily or weekly, not annually). And it has a single named owner. KPIs are most useful when capped at five to seven per team and reviewed on a fixed cadence.
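Those four properties can be expressed as a quick programmatic check. A minimal Python sketch; the decision rules below are an illustrative assumption, not the classifier widget's actual logic:

```python
def classify_metric(tied_to_outcome: bool, actionable: bool,
                    measurable_weekly: bool, has_owner: bool) -> str:
    """Rough classification from the four KPI properties.

    Assumed rules: all four checks pass -> real KPI; tied to an
    outcome but failing another check -> process measure; not tied
    to a business outcome at all -> vanity metric.
    """
    if not tied_to_outcome:
        return "vanity metric"
    if actionable and measurable_weekly and has_owner:
        return "real KPI"
    return "process measure"
```

For example, a metric that is tied to revenue and measurable weekly but has no named owner comes back as a process measure, not a KPI.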

The framework was popularized through Kaplan and Norton's Balanced Scorecard in the early 1990s and refined by David Parmenter into the modern "winning KPIs" methodology. Both authorities agree on the same point: a small number of well-chosen KPIs beats a 30-tile dashboard every time.

"What you measure is what you get." - Robert S. Kaplan and David P. Norton, The Balanced Scorecard, Harvard Business Review (1992)

KPI vs Metric: What's Actually a KPI?

Every KPI is a metric, but not every metric is a KPI. A metric is any quantitative measurement (page views, hours billed, ticket count). A KPI is a metric that is explicitly tied to a strategic outcome and used to drive decisions. The distinction matters because tracking everything as a "KPI" dilutes attention away from the metrics that actually move the business.

"An organization operating without its critical success factors, known by all staff, is aimless." - David Parmenter, Key Performance Indicators (4th ed., Wiley)

Parmenter's central insight is that KPIs flow from critical success factors. If the team cannot articulate what it must do well to win in its market, no amount of measurement will fix the problem. The work is upstream: identify the two or three things this team must execute on, then pick the metrics that prove those things are happening. KPIs without that grounding become vanity dressed in dashboards.

Types of KPIs

KPIs come in several overlapping categories. Knowing which category a given KPI sits in helps you decide how often to review it, who should own it, and what kind of action a deviation should trigger.

Leading: Predicts future performance; can be acted on early to change the outcome. Example: number of qualified opportunities in pipeline this week.
Lagging: Confirms what already happened; reflects past results, harder to influence. Example: closed-won revenue last month.
Input: Resources put into a process (time, budget, people, raw materials). Example: hours billed per consultant this week.
Process: Activity executed during the work itself. Example: average time to respond to a support ticket.
Output: What the process produces (volume or quality). Example: number of features shipped this sprint.
Outcome: Impact in the world (the result you actually care about). Example: net revenue retention; customer satisfaction.
Strategic: Top-level; tracks progress against organizational goals (executive view). Example: annual revenue; market share; gross margin.
Operational: Day-to-day; tracks process health for a function or team. Example: cost per acquisition; first-contact resolution rate.

The most useful distinction in practice is leading vs lagging. Leading indicators (pipeline coverage, ticket queue depth, response time) move first; the team can act on them this week. Lagging indicators (closed revenue, churn, gross margin) confirm what already happened and are harder to influence after the fact. Healthy KPI sets mix both: leading metrics for daily action, lagging metrics for monthly accountability.

How to Choose KPIs That Drive Decisions

The hardest part of working with KPIs is not building dashboards. It is deciding which five to seven metrics deserve the team's attention. The process below is the one we use, refined across teams that have ended up with bloated 30-metric dashboards and worked their way back to a useful set.

  1. Start with the outcome the team is responsible for A KPI exists to track an outcome the team owns. Skip "what is easy to measure" and ask "what would make this team's work clearly successful?" Revenue, retention, gross margin, response time, and quality scores all qualify. Followers, page views, and hours spent typically do not.
  2. Pick the metric, not the activity For each outcome, pick one quantitative measure. "Average response time" not "we will respond faster." Numbers can be percentages, ratios, dollar values, time durations, or NPS scores. They cannot be feelings, alignment, or "improved."
  3. Set a target band, not just a target A KPI needs both a target (where we want it to be) and a threshold (what triggers attention). "Project gross margin above 35%" is a target; "alert if any project drops under 30%" is the threshold. Without the threshold, the metric becomes wallpaper.
  4. Assign one owner, not a committee Each KPI needs a single named owner whose phone goes off when the metric leaves its band. Shared ownership across three people usually means none of them owns it on the day it slips. The owner is not the executor; the owner is the person accountable for the trend.
  5. Cap the set and review on a fixed cadence Cap each team at five to seven KPIs. Review weekly for fast-moving metrics (response time, lead flow), monthly for slower ones (margin, retention). Once a quarter, recalibrate: drop the ones the team has stopped acting on, and replace them with metrics that match what the team is actually working on now.
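The scorecard those five steps produce can be sketched as a small data structure. A minimal sketch; the field names are illustrative, not a prescribed schema, and the 35%/30% band comes from the margin example in step 3:

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    owner: str              # single named owner (step 4)
    target: float           # where we want the metric to be (step 3)
    alert_threshold: float  # deviation that triggers attention (step 3)
    higher_is_better: bool = True

    def check(self, value: float) -> str:
        """Return an action signal when the metric leaves its band."""
        breached = (value < self.alert_threshold if self.higher_is_better
                    else value > self.alert_threshold)
        if breached:
            return f"ALERT: {self.name} at {value} -> notify {self.owner}"
        return "ok"

# Target above 35%, alert under 30% (the band from step 3).
margin = KPI("Project gross margin %", owner="Dana",
             target=35.0, alert_threshold=30.0)
```

Here `margin.check(36.0)` returns "ok", while `margin.check(28.0)` returns an alert naming the owner; the point is that the threshold, not the target, is what triggers action.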

The discipline that makes this work is the willingness to drop metrics. Most teams add KPIs over time and never remove them; the dashboard quietly bloats from 7 to 12 to 25 over a year. Run a quarterly cull: any KPI the team has not acted on in 90 days gets demoted to a process measure or removed entirely.
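The quarterly cull is a simple filter over the roster. A sketch, assuming each KPI record carries a `last_action` date for the last time the team acted on a deviation:

```python
from datetime import date, timedelta

def quarterly_cull(kpis: list, today: date, stale_days: int = 90):
    """Split a KPI roster into keepers and candidates for demotion.

    Any KPI the team has not acted on in `stale_days` (90 per the
    article) gets flagged for demotion to a process measure or removal.
    """
    cutoff = today - timedelta(days=stale_days)
    keep = [k for k in kpis if k["last_action"] >= cutoff]
    demote = [k for k in kpis if k["last_action"] < cutoff]
    return keep, demote
```

Running this once a quarter is what keeps the dashboard from drifting back to 25 tiles.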

KPI Examples by Function

Examples make the concept concrete. The table below shows the KPIs we see most often by function, written to the rules above (outcome-focused, measurable continuously, single-owner, with a target band rather than a vague aspiration). Treat them as starting points; the right set for your team depends on what specifically you are responsible for moving this year.

Marketing: marketing-qualified leads (MQLs) per month; cost per acquisition (CPA) by channel; signup-to-paid conversion rate; return on ad spend (ROAS)
Sales: pipeline coverage (4x quota target); average deal size and sales cycle length; win rate by segment; quota attainment per rep
Customer Success: net revenue retention (NRR) above 100%; customer satisfaction (CSAT) above 4.5/5; Net Promoter Score (NPS); time to first value under 7 minutes
Product: day-30 retention rate; active users (weekly or monthly); feature adoption for shipped features; time-to-first-action for new accounts
Engineering: PR-to-production cycle time; bug rate per shipped feature; production uptime above 99.9%; mean time to recovery (MTTR)
Agency: project gross margin above 35%; billable utilization 65 to 75%; average response time on client tickets under 30 minutes; scope creep rate (variance vs original SOW)
Finance: gross profit margin and operating margin; working capital ratio; days sales outstanding (DSO); cash runway in months

The agency row deserves a closer look because most public KPI lists skip this audience. Service businesses live and die on three numbers: project gross margin, billable utilization, and client retention (often expressed as NRR or NPS). Add a response-time KPI for client communication and a scope-creep rate for delivery discipline, and a ten-person agency has a complete operational dashboard. The temptation is to add another fifteen metrics; the discipline is to leave them off.
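The core agency numbers reduce to simple arithmetic. A sketch using the bands from the table above (margin above 35%, utilization 65 to 75%); the function names are illustrative:

```python
def project_gross_margin(revenue: float, delivery_cost: float) -> float:
    """Gross margin as a percentage of project revenue."""
    return (revenue - delivery_cost) / revenue * 100

def billable_utilization(billable_hours: float, available_hours: float) -> float:
    """Share of available hours billed to clients, as a percentage."""
    return billable_hours / available_hours * 100

def agency_health(margin_pct: float, utilization_pct: float) -> list:
    """Flag any number outside the bands named in the article."""
    flags = []
    if margin_pct < 35:
        flags.append("margin below 35% band")
    if not 65 <= utilization_pct <= 75:
        flags.append("utilization outside 65-75% band")
    return flags
```

A $100,000 project delivered for $60,000 in cost runs at 40% gross margin; a consultant billing 120 of 160 available hours sits at 75% utilization, at the top of the healthy band.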

Vanity Metrics: KPIs You Should Ignore

The term "vanity metric" was popularized by Eric Ries in The Lean Startup. A vanity metric moves easily, looks impressive in reports, and almost never tells the team what to do next. Followers, page views, app downloads, total signups, hours logged, total customer count: these all qualify in most contexts. They climb over time even when nothing is working, and they dip only when something temporary and unrelated to the underlying business changes.

"The only metrics that entrepreneurs should invest energy in collecting are those that help them make decisions." - Eric Ries, The Lean Startup (2011)

The fix is not to track fewer metrics in absolute terms. The fix is to replace each vanity metric with the underlying outcome it should drive. Total signups becomes "signup-to-paid conversion within 30 days." Followers becomes "engaged followers who clicked through and converted." Page views becomes "page views from organic search that produced a marketing-qualified lead." Each replacement turns a wall-decoration metric into a number the team can debate and act on.
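The signups replacement, for instance, can be computed directly from signup records. A minimal sketch, assuming each record carries a signup date and an optional paid-conversion date:

```python
from datetime import date

def signup_to_paid_30d(signups: list) -> float:
    """Share of signups that converted to paid within 30 days.

    Each record is assumed to look like
    {"signed_up": date, "paid": date or None}.
    """
    if not signups:
        return 0.0
    converted = sum(
        1 for s in signups
        if s["paid"] is not None and (s["paid"] - s["signed_up"]).days <= 30
    )
    return converted / len(signups)
```

Unlike a raw signup count, this number can fall while signups rise, which is exactly the conversation a real KPI is supposed to force.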

Ries's broader argument in The Lean Startup is that the wrong metric is worse than no metric. A vanity number creates the appearance of progress and discourages the harder conversation about whether the underlying business is actually working. The same logic applies inside established companies: a KPI dashboard full of vanity metrics is comforting, but it is also a slow path to surprise.

How to Review and Recalibrate KPIs

A KPI dashboard that is built once and never revisited becomes wallpaper. The cadence that delivers results has three layers, mirroring the rhythm we recommend for OKRs in the OKR vs KPI guide.

Weekly: a 15-minute team scan of the KPI board. Anything outside its band gets a comment from the owner with a planned action. Most weeks, this is a 5-minute conversation.

Monthly: a deeper review of the trend lines. Look for KPIs that are drifting steadily even if they have not crossed the threshold yet. Adjust thresholds if the band no longer reflects realistic performance.

Quarterly: the full recalibration. Drop KPIs the team has not acted on in 90 days. Replace any that no longer match current priorities. Promote earned outcomes from the OKR layer if the new performance level should hold permanently.

Each layer takes proportional time. The weekly scan is fast because most weeks nothing is wrong. The quarterly recalibration is slower because it requires actually deciding what the team is and is not responsible for in the next quarter.

Common Mistakes

The patterns below show up repeatedly across teams that adopt KPIs and lose the value within two quarters. Most of them come from treating KPI tracking as a reporting exercise rather than a decision-making system.

  1. Tracking everything you can measure A 30-tile dashboard is not five times better than a 6-tile dashboard. It is worse, because no one knows where to look. Cap the set at five to seven KPIs per team. The discipline of cutting is what makes the remaining ones matter.
  2. Mistaking vanity metrics for KPIs Followers, page views, app downloads, and total signups all move easily but rarely tell the team what to do next. Replace each one with the underlying outcome it should drive (revenue from those signups, conversion from those visitors, deals from those leads).
  3. No threshold, no action A KPI without a threshold becomes a number on a dashboard nobody opens. Each KPI needs a defined band where the metric is normal and a deviation rule that triggers a specific action. Without that, the team watches the trend without doing anything about it.
  4. Shared ownership across three people When a KPI is "owned by the marketing team" instead of one named lead, no one is accountable on the day it slips. Each KPI needs a single owner whose reputation rides on the trend. The owner is the escalation path, not the executor.
  5. Setting it once and never revisiting KPIs that worked last year are not automatically the right KPIs this year. As the business changes, the set should change with it. Review the full KPI roster every quarter; drop the ones the team has stopped acting on, and replace them with metrics that match current priorities.
  6. Confusing financial result indicators with KPIs David Parmenter's distinction matters: most "KPIs" teams track are actually result indicators (monthly revenue, quarterly margin) measured too rarely to drive daily action. Real KPIs are non-financial, watched daily or weekly, and tied to the team activities that produce the financial outcomes.

The biggest of these, by some margin, is the vanity-metrics trap. If a team's headline KPI moves up steadily for six months while underlying business outcomes do not improve, the metric is wrong, not the business. The classifier widget at the top of this article exists specifically to test this before you commit a metric to the dashboard.

What We Recommend

At Rock we run KPIs on the same workspace pattern as the rest of the strategy stack. Each team space holds a pinned KPI note with four to six metrics, each with target, threshold, owner, and review cadence. The owner posts a one-line update on Mondays for any KPI outside its band. Once a quarter, the full set gets a recalibration review where stale metrics get retired and new ones get added based on what the team is actually working on.

The reason for keeping KPIs in the same workspace as the work is the failure mode we see otherwise. KPI dashboards built in separate BI tools become wallpaper because no one opens them between board meetings. KPI notes pinned next to the team's daily chat and tasks stay visible, get debated, and actually drive action.

For function-specific KPI sets, see agency KPIs, marketing KPIs, and sales KPIs; for service businesses, billable hours form the operational input layer underneath all of them.

Pair this with the broader strategy stack and the KPI layer becomes the operational floor underneath the rest. SWOT covers situation. Strategic Choice Cascade covers integrated choice. PESTEL covers macro context. Porter's Five Forces covers industry structure. OKRs drive the change you are committing to this quarter. KPIs hold the line on the standards you are not willing to give up while you push for change.

Frequently Asked Questions

How many KPIs should we track?

Five to seven per team is the practical cap. Fewer than three and the picture is incomplete; more than seven and no one knows where to look first. The same applies at company level: a healthy executive dashboard tracks five strategic KPIs, not 30.

How often should KPIs be reviewed?

Match the cadence to how fast the metric moves. Weekly review for fast-moving metrics (response time, lead flow, ticket volume). Monthly for slower ones (margin, retention, NPS). Recalibrate the full set quarterly, dropping any KPI the team has stopped acting on.

Can a KPI be qualitative?

Only if the qualitative judgment is converted into a number. NPS scores, CSAT ratings, and quality grades all start as opinions but become KPIs because they are scored on a fixed scale. A pure feeling like "improved customer happiness" is not a KPI; "average CSAT above 4.5/5" is.

Should KPIs be financial or operational?

Most teams need a mix. Financial KPIs (margin, revenue, cost) report results but are typically lagging and measured monthly. Operational KPIs (response time, utilization, defect rate) are leading indicators measured daily or weekly. The operational ones are what the team can actually move; the financial ones tell you whether it worked.

How do I know if a KPI should be dropped?

Two signals. First, the team has not acted on a deviation in the last quarter; the metric has become wallpaper. Second, the underlying outcome the KPI was supposed to track is no longer a priority for the business. Either way, replace it instead of keeping it on the dashboard out of habit.

Do small teams or agencies need KPIs?

Yes, but a smaller set. A 10-person agency can run its operation on three or four KPIs (project gross margin, billable utilization, average response time, client NPS). The framework scales down; what does not scale is tracking 20 metrics with five people who are already running everything.

Track KPIs alongside the work that moves them. Rock combines chat, tasks, and notes in one workspace. One flat price, unlimited users. Get started for free.
