Metrics Track

v1.1.1

Set up a PostHog metrics plan with an event funnel, KPI benchmarks, and kill/iterate/scale decision thresholds. Use when the user says "set up metrics", "track KPIs",...

by @fortunto2
Security Scan

  • VirusTotal: Benign
  • OpenClaw: Benign (high confidence)
Purpose & Capability
Name and description match the runtime instructions: the skill parses a project PRD/CLAUDE.md, defines a funnel, sets KPIs and thresholds, emits PostHog code snippets, and writes a metrics plan. It requests nothing (no env vars, no binaries) unrelated to creating a metrics plan.
Instruction Scope
Instructions explicitly tell the agent to read project documentation (PRD, CLAUDE.md) and to write docs/metrics-plan.md. Reading those files is coherent with the task, but it means the skill will access arbitrary project files; review what files are present before granting Read access, since they might contain secrets or unrelated sensitive data. The skill does not instruct sending data to external endpoints or require PostHog credentials; it only generates implementation snippets and docs.
Install Mechanism
No install spec and no code files: the skill is instruction-only. This is low risk because nothing is downloaded or written to disk by an installer; only the runtime Read/Write tools touch the filesystem (the skill writes the metrics plan).
Credentials
The skill declares no required environment variables, credentials, or config paths. That is proportionate: it produces guidance and code snippets but does not attempt to connect to PostHog or request API tokens.
Persistence & Privilege
The skill is not always-enabled (always: false) and is user-invocable. It is allowed to read and write project files (Read/Write tools), which fits its purpose of generating a metrics-plan. Autonomous invocation is allowed by platform default but does not combine here with broad credentials or other concerning privileges.
Assessment
This skill appears coherent for generating a PostHog metrics plan and code snippets. Before installing or running it, consider:

  • It will read project documentation files (PRD, CLAUDE.md); ensure those files do not contain secrets you don't want read.
  • It will write docs/metrics-plan.md in the repo; be prepared to review and approve changes.
  • It does not request PostHog API keys or perform network calls, so it won't configure your PostHog project automatically; you'll need to provide credentials and perform deployment yourself.
  • Review generated code snippets and thresholds before applying them in production; they are generic templates and may need adaptation.

If you are uncomfortable with an automated agent reading project files, restrict or audit the skill's Read permission or run it in a sandboxed environment.


License: MIT-0

/metrics-track

Set up a metrics tracking plan for a project. Defines PostHog event funnel, KPI benchmarks, and kill/iterate/scale decision thresholds based on lean startup principles.

MCP Tools (use if available)

  • kb_search(query) — find PostHog methodology, analytics patterns

If MCP tools are not available, fall back to Grep + Read.

Methodology Reference

This skill implements metrics tracking based on lean startup principles:

  • Relative metrics vs niche benchmarks — compare against your own trajectory, not vanity averages
  • Kill/iterate/scale decision rules — data-driven thresholds for product decisions (see step 7 below)

Steps

  1. Parse project from $ARGUMENTS.

    • Read PRD for features, ICP, monetization model.
    • Read CLAUDE.md for stack (iOS/Web/both).
    • If empty: ask via AskUserQuestion.
  2. Detect platform:

    • iOS app → PostHog iOS SDK events
    • Web app → PostHog JS SDK events
    • Both → cross-platform identity (shared user ID across platforms)
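
    Step 2's platform detection could be sketched as a small helper. This is an illustrative assumption about how detection might work (the skill only says it checks package manifests); `detectPlatform` and the exact manifest names are hypothetical:

    ```typescript
    // Hypothetical sketch: infer the target platform from manifest files in the repo.
    // The file names checked here are assumptions, not a spec from the skill.
    type Platform = 'ios' | 'web' | 'both' | 'unknown';

    function detectPlatform(files: string[]): Platform {
      // iOS indicators: an Xcode project or a Swift package manifest
      const hasIOS = files.some(f => f.endsWith('.xcodeproj') || f === 'Package.swift');
      // Web indicator: a Node package manifest
      const hasWeb = files.includes('package.json');
      if (hasIOS && hasWeb) return 'both'; // triggers cross-platform identity setup
      if (hasIOS) return 'ios';
      if (hasWeb) return 'web';
      return 'unknown'; // fall back to asking the user
    }
    ```

    If both manifests exist, the "both" result is what drives the shared-user-ID setup mentioned above.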
  3. Load PostHog methodology:

    • If MCP available: kb_search("PostHog analytics events funnel identity")
    • Otherwise: check project docs for existing analytics configuration
    • Extract: event naming conventions, identity resolution, funnel pattern
  4. Define event funnel based on PRD features:

    Standard funnel stages (adapt per product):

    Awareness → Acquisition → Activation → Revenue → Retention → Referral
    

    Map to concrete events:

    | Stage | Event Name | Trigger | Properties |
    |-------|------------|---------|------------|
    | Awareness | page_viewed | Landing page visit | source, utm_* |
    | Acquisition | app_installed or signed_up | First install/signup | platform, source |
    | Activation | core_action_completed | First key action | feature, duration_ms |
    | Revenue | purchase_completed | First payment | plan, amount, currency |
    | Retention | session_started | Return visit (D1/D7/D30) | session_number, days_since_install |
    | Referral | invite_sent | Shared or referred | channel, referral_code |
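
    The funnel mapping can also be kept as data so snippets and dashboards stay in sync. A minimal sketch; the `FunnelStage` type and `FUNNEL` constant are illustrative, not part of the skill:

    ```typescript
    // Illustrative sketch: the step-4 funnel table as a typed constant.
    interface FunnelStage {
      stage: string;
      event: string;        // PostHog event name
      properties: string[]; // expected event properties
    }

    const FUNNEL: FunnelStage[] = [
      { stage: 'Awareness',   event: 'page_viewed',           properties: ['source', 'utm_*'] },
      { stage: 'Acquisition', event: 'signed_up',             properties: ['platform', 'source'] },
      { stage: 'Activation',  event: 'core_action_completed', properties: ['feature', 'duration_ms'] },
      { stage: 'Revenue',     event: 'purchase_completed',    properties: ['plan', 'amount', 'currency'] },
      { stage: 'Retention',   event: 'session_started',       properties: ['session_number', 'days_since_install'] },
      { stage: 'Referral',    event: 'invite_sent',           properties: ['channel', 'referral_code'] },
    ];

    // e.g. the event names in stage order, for building the funnel insight
    const funnelEvents = FUNNEL.map(s => s.event);
    ```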
  5. Forced reasoning — metrics selection: Before defining KPIs, write out:

    • North Star Metric: The ONE number that matters most (e.g., "weekly active users who completed core action")
    • Leading indicators: What predicts the North Star? (e.g., "activation rate D1")
    • Lagging indicators: What confirms success? (e.g., "MRR", "retention D30")
    • Vanity metrics to AVOID: (e.g., total downloads without activation)
  6. Set KPI benchmarks per stage:

    | KPI | Target | Kill Threshold | Scale Threshold | Source |
    |-----|--------|----------------|-----------------|--------|
    | Landing → Signup | 3-5% | < 1% | > 8% | Industry avg |
    | Signup → Activation | 20-40% | < 10% | > 50% | Product benchmark |
    | D1 Retention | 25-40% | < 15% | > 50% | Mobile avg |
    | D7 Retention | 10-20% | < 5% | > 25% | Mobile avg |
    | D30 Retention | 5-10% | < 2% | > 15% | Mobile avg |
    | Trial → Paid | 2-5% | < 1% | > 8% | SaaS avg |

    Adjust based on product type (B2C vs B2B, free vs paid, mobile vs web).
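
    These thresholds, combined with the kill/iterate/scale rules in step 7, could be evaluated mechanically. A sketch under assumed names (`Thresholds`, `classify`); the skill itself only writes the rules as a checklist:

    ```typescript
    // Illustrative sketch: classify one KPI reading against its kill/scale thresholds.
    type Decision = 'kill' | 'iterate' | 'scale';

    interface Thresholds {
      kill: number;  // below this → kill signal
      scale: number; // above this → scale signal
    }

    function classify(value: number, t: Thresholds): Decision {
      if (value < t.kill) return 'kill';
      if (value > t.scale) return 'scale';
      return 'iterate'; // between the two thresholds
    }

    // Example: D7 retention of 12% against the mobile benchmarks (kill < 5%, scale > 25%)
    const d7Decision = classify(0.12, { kill: 0.05, scale: 0.25 }); // 'iterate'
    ```

    Note this classifies a single KPI; the step-7 framework requires two kill signals or all three scale signals before acting.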

  7. Define decision rules (lean startup kill/iterate/scale):

    ## Decision Framework
    
    **Review cadence:** Weekly (Fridays)
    
    ### KILL signals (any 2 = kill)
    - [ ] Activation rate < {kill_threshold} after 2 weeks
    - [ ] D7 retention < {kill_threshold} after 1 month
    - [ ] Zero organic signups after 2 weeks of distribution
    - [ ] CAC > 3x LTV estimate
    
    ### ITERATE signals
    - [ ] Metrics between kill and scale thresholds
    - [ ] Qualitative feedback suggests product-market fit issues
    - [ ] One stage of funnel is dramatically worse than others
    
    ### SCALE signals (all 3 = scale)
    - [ ] Activation rate > {scale_threshold}
    - [ ] D7 retention > {scale_threshold}
    - [ ] Organic growth > 10% week-over-week
    
  8. Generate PostHog implementation snippets:

    For iOS (Swift):

    // Event tracking examples
    PostHogSDK.shared.capture("core_action_completed", properties: [
        "feature": "scan_receipt",
        "duration_ms": elapsed
    ])
    

    For Web (TypeScript):

    // Event tracking examples
    posthog.capture('signed_up', {
        source: searchParams.get('utm_source') ?? 'direct',
        plan: 'free'
    })
    
  9. Write metrics plan to docs/metrics-plan.md:

    # Metrics Plan: {Project Name}
    
    **Generated:** {YYYY-MM-DD}
    **Platform:** {iOS / Web / Both}
    **North Star:** {north star metric}
    
    ## Event Funnel
    
    | Stage | Event | Properties |
    |-------|-------|------------|
    {event table from step 4}
    
    ## KPIs & Thresholds
    
    | KPI | Target | Kill | Scale |
    |-----|--------|------|-------|
    {benchmark table from step 6}
    
    ## Decision Rules
    
    {framework from step 7}
    
    ## Implementation
    
    ### PostHog Setup
    - Project: {project name} (EU region)
    - SDK: {posthog-ios / posthog-js}
    - Identity: {anonymous → identified on signup}
    
    ### Code Snippets
    {snippets from step 8}
    
    ## Dashboard Template
    - Funnel: {stage1} → {stage2} → ... → {stageN}
    - Retention: D1 / D7 / D30 cohort chart
    - Revenue: MRR trend + trial conversion
    
    ---
    *Generated by /metrics-track. Implement events, then review weekly.*
    
  10. Output summary — North Star metric, key thresholds, first event to implement.

Notes

  • PostHog EU hosting for privacy compliance
  • Use $set for user properties, capture for events
  • Identity: start anonymous, identify() on signup with user ID
  • Cross-platform: same PostHog project, same user ID → unified journey
  • Review dashboard weekly, make kill/iterate/scale decision monthly
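
The identity and $set notes above can be illustrated with the payload shapes involved. This is a sketch with invented IDs and a hypothetical CaptureEvent type, not the posthog-js API; with the real SDK you would call posthog.identify() and posthog.capture() instead of building these objects by hand:

```typescript
// Hypothetical shape of a captured event; real SDKs build this internally.
interface CaptureEvent {
  event: string;
  distinct_id: string;
  properties: Record<string, any>;
}

// Pre-signup: events are tied to an anonymous distinct ID (placeholder value).
const anonId = 'anon-7f3a';
const pageView: CaptureEvent = {
  event: 'page_viewed',
  distinct_id: anonId,
  properties: { source: 'landing' },
};

// On signup, identify() aliases the anonymous ID to the real user ID,
// so pre-signup history and later events belong to one unified person.
const userId = 'user_123';

// Person properties go under $set; event properties stay top-level.
const purchase: CaptureEvent = {
  event: 'purchase_completed',
  distinct_id: userId,
  properties: { plan: 'pro', $set: { plan: 'pro' } },
};
```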

Common Issues

Wrong platform detected

Cause: Project has both web and iOS indicators. Fix: Skill checks package manifests. If both exist, it generates cross-platform identity setup. Verify the detected platform in the output.

KPI thresholds too aggressive

Cause: Default thresholds are industry averages. Fix: Adjust thresholds in docs/metrics-plan.md based on your niche. B2B typically has lower volume but higher conversion.

PostHog SDK not in project

Cause: Metrics plan generated but SDK not installed. Fix: This skill generates the PLAN only. Install PostHog SDK separately: pnpm add posthog-js (web) or add posthog-ios via SPM (iOS).
