# Game Design KPI Coverage Audit
Check whether the evaluation framework is seeing the whole value of the feature, or only the parts that are easy to measure.
Use this skill when a feature is being judged mainly through directly attributable KPIs and you suspect the measurement logic is biasing the team toward flashy, self-contained systems while undervaluing connective tissue, UX, quality-of-life work, enabling systems, or long-term structural investment.
Read references/value-types.md when identifying what kind of value the feature creates.
Read references/blind-spot-patterns.md when diagnosing common KPI-coverage failures.
Read references/recommendation-patterns.md when deciding how to justify or evaluate hard-to-measure work.
## What to produce
Produce:
- Feature read - what the proposal is and what role it plays
- Current KPI framing - what the team is measuring or expecting to measure
- Coverage diagnosis - what value is covered versus ignored by those KPIs
- Blind spots - what may be neglected because it is hard to measure directly
- Risk of mis-prioritization - what bad decisions may result from the current framing
- Evaluation recommendation - how the feature should be justified, monitored, or compared more fairly
## Process
### 1. Identify the role of the feature
Clarify whether the proposal is mainly:
- a standalone engagement feature
- a monetization feature
- a progression layer
- a UX improvement
- connective tissue between systems
- quality-of-life work
- support infrastructure for future features
- a clarity, pacing, or usability improvement
### 2. Identify the current KPI story
Ask:
- what metric is the team using to justify this feature?
- is it tied to revenue, retention, engagement, conversion, economy balance, sentiment, or something else?
- is the KPI direct, indirect, speculative, or absent?
### 3. Audit KPI coverage
Check whether the metric framing captures the actual value of the work.
Look for value types such as:
- direct monetization
- direct engagement lift
- retention support
- reduced friction
- improved comprehension
- stronger connective tissue between systems
- future feature enablement
- long-term sustainability
- reduced support burden or balancing burden
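The coverage audit above can be sketched as a set difference over value types. This is an illustrative structure only, with hypothetical names (`coverage_audit`, the value-type labels); it is not part of the skill's required output:

```python
# Hypothetical sketch: KPI coverage as a set difference over value types.
VALUE_TYPES = {
    "direct_monetization",
    "direct_engagement_lift",
    "retention_support",
    "reduced_friction",
    "improved_comprehension",
    "connective_tissue",
    "future_feature_enablement",
    "long_term_sustainability",
    "reduced_support_burden",
}

def coverage_audit(value_created: set, value_measured: set) -> dict:
    """Compare the value a feature creates against what its KPIs measure."""
    created = value_created & VALUE_TYPES
    measured = value_measured & VALUE_TYPES
    return {
        "covered": sorted(created & measured),
        # value the feature creates that no KPI will see
        "blind_spots": sorted(created - measured),
        # KPIs tracking value the feature does not actually create
        "vanity_risk": sorted(measured - created),
    }

# Example: a UX improvement justified only by a monetization KPI.
audit = coverage_audit(
    value_created={"reduced_friction", "improved_comprehension", "retention_support"},
    value_measured={"direct_monetization"},
)
```

In this example every unit of real value lands in `blind_spots`, which is exactly the failure mode the audit is meant to surface.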
### 4. Identify blind spots
Common signs:
- the feature is dismissed because it cannot move one headline KPI on its own
- direct-revenue features are always favored over structural health
- UX work is treated as optional because it lacks clean attribution
- foundational work is postponed until crisis
- enabling systems are undervalued because they mostly improve the performance of other features
### 5. Judge prioritization risk
Ask:
- what happens if this feature is judged only by direct KPI lift?
- is the team likely to underinvest in maintenance, UX, clarity, infrastructure, or connective tissue?
- could the current framework systematically reward short-term visible wins over long-term health?
### 6. Recommend better evaluation
Possible moves:
- use a mixed scorecard instead of one KPI
- classify the feature as enabling or connective work rather than forcing a fake direct KPI
- evaluate via downstream support of other systems
- allocate protected capacity for high-value, hard-to-measure work
- compare opportunity cost honestly rather than pretending everything must tie to one direct metric
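The first move, a mixed scorecard, can be sketched as a weighted score across several value dimensions instead of one headline KPI. The dimension names, weights, and scores below are illustrative assumptions, not calibrated values:

```python
# Hypothetical sketch of a mixed scorecard: a feature is scored across
# several value dimensions, so enabling and connective value counts
# alongside direct KPI lift. Weights and scores are illustrative.
from dataclasses import dataclass

@dataclass
class ScorecardEntry:
    dimension: str   # e.g. "retention_support", "future_feature_enablement"
    weight: float    # relative importance agreed by the team in advance
    score: float     # 0-5 assessment: direct metric where one exists,
                     # structured judgment where it does not
    evidence: str    # KPI reading, playtest note, or expert estimate

def scorecard_total(entries: list) -> float:
    """Weighted average across all dimensions."""
    total_weight = sum(e.weight for e in entries)
    return sum(e.weight * e.score for e in entries) / total_weight

# Example: a feature with flat direct lift but strong structural value.
entries = [
    ScorecardEntry("direct_engagement_lift", weight=2.0, score=1.0,
                   evidence="A/B test: flat DAU"),
    ScorecardEntry("reduced_friction", weight=1.5, score=4.0,
                   evidence="playtest: fewer onboarding drop-offs"),
    ScorecardEntry("future_feature_enablement", weight=1.5, score=4.0,
                   evidence="unblocks two roadmap features"),
]
total = scorecard_total(entries)
```

Note the design choice: weights are fixed before scoring, so hard-to-measure dimensions cannot be quietly zeroed out after the fact.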
## Response structure
- Feature Read
- Current KPI Framing
- Coverage Diagnosis
- Blind Spots
- Risk of Mis-Prioritization
- Recommendation
## Fast mode
- What is this feature actually for?
- What KPI is being used to justify it?
- What important value is not being captured by that KPI?
- What bad prioritization decision could this cause?
- How should the team evaluate it more fairly?
## Style rules
- Do not dismiss KPIs; diagnose their limits.
- Do not invent fake measurable certainty for support work.
- Distinguish direct value from enabling value.
- Prefer fairer framing over anti-metrics rhetoric.
- Be specific about how blind spots distort roadmap decisions.
## Working principle
Teams often prioritize what they can measure cleanly, not what matters most.
Use this skill to expose where KPI logic is too narrow for the actual design value on the table.