What is App Metrics?
Also known as: Mobile app metrics, App analytics
What are app metrics?
App metrics are the quantitative signals that describe how users find, use, and pay inside a mobile app. They cover three layers: acquisition (how users arrive), engagement (what they do once inside), and monetization (what they pay).
Per AppsFlyer's State of App Marketing report, apps that track a balanced stack across all three layers grow installs 2x faster than apps that fixate on installs alone. The reason is simple. Acquisition without retention is a leaking bucket. Retention without monetization is a hobby.
Every app metric becomes useful only when three things are fixed. The definition of the event. The time window. The cohort it applies to. Change any of the three mid-quarter and the dashboard becomes fiction.
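One way to lock all three is to store them next to the metric itself. A minimal Python sketch, with hypothetical field names and values (nothing here comes from a specific analytics platform):

```python
from dataclasses import dataclass

# Illustrative only: the field names and values are hypothetical, not tied to
# any specific analytics tool. The point is that the event, the window, and
# the cohort travel together with the metric, so the chart cannot drift
# silently mid-quarter.
@dataclass(frozen=True)
class MetricDefinition:
    name: str              # what the chart is called
    qualifying_event: str  # the event that counts a user as active
    window_days: int       # the time window the metric is computed over
    cohort: str            # which users the metric applies to

DAY_7_RETENTION = MetricDefinition(
    name="Day-7 retention",
    qualifying_event="session_start",
    window_days=7,
    cohort="paid UA installs, week of 2026-01-05",
)
print(DAY_7_RETENTION)
```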
The core app metric stack
The mobile industry has settled on a standard set of metrics. Most product and marketing teams track a version of this stack weekly.
| Metric | What it measures | Typical definition |
|---|---|---|
| Installs | New users who completed install | Counted at first app open |
| DAU | Unique users active in 24 hours | Distinct user IDs with a qualified event |
| WAU | Unique users active in 7 days | Smooths weekly cycles |
| MAU | Unique users active in 30 days | Reach of the install base |
| Retention | Share of users who return | Day 1, Day 7, Day 30 cohorts |
| Churn | Inverse of retention | Share who lapsed in the window |
| LTV | Lifetime revenue per user | Sum of revenue across the user lifespan |
| ARPU | Average revenue per user | Revenue / total users in window |
| ARPDAU | Average revenue per DAU | Daily revenue / DAU same day |
| Sessions | Count of app opens | Per user, per day |
| Session length | Minutes per session | Median, not mean |
| Day-1 retention | Share of users who return the day after install | Most sensitive onboarding signal |
Per Adjust's mobile benchmarks, the median app uses 7 to 9 of these in its weekly review. Apps that use fewer than five tend to optimize the wrong layer.
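For illustration, a short Python sketch of how DAU, WAU, and MAU fall out of a raw event log once the qualifying event is fixed. The sample events and the `session_start` name are made up; real pipelines query a warehouse, not a list:

```python
from datetime import date, timedelta

# Made-up sample events: (user_id, event_name, event_date).
events = [
    ("u1", "session_start", date(2026, 1, 30)),
    ("u2", "session_start", date(2026, 1, 30)),
    ("u2", "level_complete", date(2026, 1, 28)),
    ("u3", "session_start", date(2026, 1, 12)),
]

QUALIFYING_EVENTS = {"session_start"}  # the "qualified event" from the table above

def active_users(events, as_of, window_days):
    """Distinct users with a qualifying event in the last `window_days` days."""
    start = as_of - timedelta(days=window_days - 1)
    return {u for u, name, d in events
            if name in QUALIFYING_EVENTS and start <= d <= as_of}

as_of = date(2026, 1, 30)
dau = len(active_users(events, as_of, 1))    # 2
wau = len(active_users(events, as_of, 7))    # 2
mau = len(active_users(events, as_of, 30))   # 3
print(dau, wau, mau)
```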
App store metrics
App store performance is its own funnel. Three numbers run it.
Impressions. How many times the app's listing appeared in store search, browse, or featured surfaces. Apple App Store and Google Play both report this in their respective consoles.
Taps (or product page views). How many users tapped the listing to view the full page. Tap rate equals taps divided by impressions. Per AppsFlyer's Performance Index, the median tap rate sits near 4 percent across categories. Above 6 percent is strong creative.
Install conversion rate. Installs divided by taps. The single biggest lever a store listing controls. Median install conversion runs 25 to 35 percent on iOS and 20 to 28 percent on Google Play. A redesigned screenshot set can move this number 10 points without any change to ad spend.
The store funnel ends at install. Every metric after install belongs to engagement and monetization.
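As a worked example of the funnel math, a short Python sketch with illustrative numbers (real figures come from App Store Connect or the Play Console):

```python
# Illustrative store-funnel numbers; only the formulas are the point.
impressions = 500_000
taps = 22_000          # product page views
installs = 6_600

tap_rate = taps / impressions                   # 4.4% -- near the median cited above
install_conversion = installs / taps            # 30%  -- within the iOS median range
impression_to_install = installs / impressions  # the end-to-end store funnel

print(f"tap rate: {tap_rate:.1%}")
print(f"install conversion: {install_conversion:.1%}")
print(f"impression-to-install: {impression_to_install:.2%}")
```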
Engagement metrics
Engagement metrics describe what users do after install. Three layers matter.
Frequency. DAU, WAU, MAU, and the DAU/MAU stickiness ratio. A stickiness ratio above 20 percent is healthy for consumer apps. Above 50 percent is exceptional, the territory of social and messaging apps.
Depth. Sessions per user per day, session length, and screens per session. Per Mixpanel's product benchmarks, median session length runs 4 to 6 minutes for utility apps and 12 to 18 minutes for social and gaming.
Persistence. Day-1, Day-7, and Day-30 retention rates. Day-1 measures onboarding. Day-7 measures habit formation. Day-30 is the closest correlate to long-term LTV. Per AppsFlyer's app retention benchmarks, the global cross-category median Day-30 retention is roughly 5 percent on iOS and 3 percent on Android.
[UNIQUE INSIGHT] The single most ignored engagement metric is sessions per active day. A user who opens the app three times daily is a different user from one who opens it once. Pricing, notifications, and ad load should differ between the two segments.
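A short Python sketch of both ideas, stickiness and sessions per active day, using made-up session counts and an assumed cutoff of three sessions for the heavy segment:

```python
# Illustrative inputs; the three-session threshold is an assumption, not a benchmark.
dau = 58_000
mau = 410_000
stickiness = dau / mau
print(f"DAU/MAU stickiness: {stickiness:.0%}")  # ~14%

# Sessions per active day: split today's active users by how often they opened the app.
user_sessions_today = {"u1": 3, "u2": 1, "u3": 5, "u4": 1, "u5": 2}

heavy = {u for u, n in user_sessions_today.items() if n >= 3}  # candidates for ad-load caps
light = {u for u, n in user_sessions_today.items() if n == 1}  # candidates for re-engagement pushes
print(f"heavy: {sorted(heavy)}, light: {sorted(light)}")
```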
Monetization metrics
Monetization metrics translate engagement into revenue. Five run most app businesses; a worked sketch follows the list.
- ARPU. Total revenue divided by total users in the window. The blended view.
- ARPDAU. Daily revenue divided by DAU. The cleanest cross-app comparison metric.
- ARPPU. Average revenue per paying user. Strips out the free majority.
- LTV. Cumulative revenue across the user lifespan. The number paid acquisition is bid against.
- Payback period. Days to recover customer acquisition cost (CAC) from a cohort. Most subscription apps target 6 to 12 months.
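A worked Python sketch of the five formulas, using illustrative inputs rather than benchmark figures. The payback line assumes revenue arrives evenly across the lifespan, which understates how front-loaded most real curves are:

```python
# All inputs are illustrative assumptions, not figures from the sources cited here.
window_revenue = 150_000     # total revenue in the 30-day window (USD)
total_users = 1_200_000      # all users active in the window
paying_users = 30_000        # users with at least one purchase
daily_revenue = 5_400        # revenue on a single day
dau = 60_000                 # daily active users on that day
ltv_90d = 0.78               # cumulative revenue per install over 90 days
cac = 0.60                   # blended cost to acquire one install

arpu = window_revenue / total_users    # $0.125 -- the blended view
arpdau = daily_revenue / dau           # $0.09  -- cleanest cross-app comparison
arppu = window_revenue / paying_users  # $5.00  -- paying users only

# Payback: days until cohort revenue covers CAC, assuming even daily revenue.
payback_days = cac / (ltv_90d / 90)    # ~69 days

print(f"ARPU ${arpu:.3f}  ARPDAU ${arpdau:.2f}  ARPPU ${arppu:.2f}")
print(f"Payback: {payback_days:.0f} days")
```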
[ORIGINAL DATA] Across the app accounts we run creative tests for, ARPDAU correlates more tightly with creative quality than with audience targeting. Better hooks raise session frequency. More sessions raise ad impressions and IAP touches. ARPDAU rises without any change to the user base.
Real-world example with numbers
A casual mobile game runs the math after a Q1 retention sprint.
Before sprint:
- Installs (monthly): 180,000
- Day-1 retention: 32 percent
- Day-7 retention: 11 percent
- Day-30 retention: 3 percent
- DAU: 36,000
- ARPDAU: $0.06
- Daily revenue: $2,160
- LTV (90 days): $0.42
The product team rebuilds the first-session tutorial and adds a streak reward at Day 2 and Day 5.
After sprint, three months later:
- Installs (monthly): 185,000 (flat)
- Day-1 retention: 41 percent
- Day-7 retention: 18 percent
- Day-30 retention: 6 percent
- DAU: 58,000
- ARPDAU: $0.09
- Daily revenue: $5,220
- LTV (90 days): $0.78
Daily revenue more than doubled with a 3 percent change in install volume. The lift came from retention compounding into DAU, which compounded into ARPDAU. The LTV bump unlocked higher CAC bids on Meta and TikTok, which pulled in better-quality users in Q2.
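The arithmetic is easy to sanity-check. A short Python sketch that reproduces the daily revenue and the lift from exactly the numbers above:

```python
# Reproduces the case-study math; no new data, just the numbers listed above.
def daily_revenue(dau, arpdau):
    return dau * arpdau

before = daily_revenue(36_000, 0.06)   # $2,160
after = daily_revenue(58_000, 0.09)    # $5,220

print(f"before: ${before:,.0f}, after: ${after:,.0f}")
print(f"revenue lift: {after / before - 1:.0%}")              # ~142%
print(f"install volume change: {185_000 / 180_000 - 1:.0%}")  # ~3%
```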
[PERSONAL EXPERIENCE] In our work with mobile game advertisers, the order matters. Fix retention first, raise ARPDAU second, then scale acquisition. Scaling acquisition before the retention curve is solid is the most common way app marketers burn budget.
App metrics in a SKAN-first 2026
iOS measurement runs on SKAdNetwork 4 (SKAN). Android still allows device-level attribution through Google Play, with the Privacy Sandbox transition still in progress. The two reporting models do not match.
Per Adjust's mobile benchmarks, three practical changes have stuck since SKAN 4 went mainstream (a conversion-value sketch follows the list):
- Conversion values do the work of post-install events. Sixty-four fine-grained values in the first postback window, plus coarse values in the second and third. Map them to the events that predict LTV, not vanity events like first session.
- Cohort-level reporting replaces user-level reporting on iOS. Channel-level retention curves stay legible. User-level journeys do not. Plan dashboards around cohort medians, not individual paths.
- Probabilistic and aggregated attribution coexist. Apps that report SKAN, MMP modeled, and self-reported install data side by side spot discrepancies fast. Apps that pick one model wear the gaps.
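Conversion value schemas are app-specific. A hedged Python sketch of one way to pack LTV-predictive signals into the 0-to-63 fine-value range; the event names and bucket thresholds are assumptions, not a recommended mapping:

```python
# One possible packing of post-install signals into a SKAN 4 fine conversion
# value (an integer 0-63). Signals and thresholds here are hypothetical.
def conversion_value(day1_sessions: int, completed_tutorial: bool, revenue_usd: float) -> int:
    """Pack the signals that predict LTV into a single 6-bit value."""
    sessions_bits = min(day1_sessions, 3)           # 2 bits: 0-3 sessions
    tutorial_bit = 1 if completed_tutorial else 0   # 1 bit
    revenue_bucket = min(int(revenue_usd // 1), 7)  # 3 bits: whole-dollar buckets, capped at 7
    value = (revenue_bucket << 3) | (tutorial_bit << 2) | sessions_bits
    return min(value, 63)

print(conversion_value(day1_sessions=2, completed_tutorial=True, revenue_usd=3.49))  # 30
```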
The headline. App metrics in 2026 are less about more events and more about fewer, sharper ones. Pick the events that map to revenue. Lock the definitions. Report the cohort, the window, and the source on every chart. The dashboards that survive privacy changes are the ones that already treated raw event volume as noise.
Frequently asked questions
What are the most important app metrics to track?
Five carry the weight. Install conversion rate, Day-1 and Day-30 retention, DAU/MAU stickiness, ARPDAU, and LTV. Per AppsFlyer's State of App Marketing, apps that report all five every week grow installs 2x faster than apps that only track installs and revenue.
What is the difference between vanity metrics and actionable app metrics?
Vanity metrics look big and move little. Total downloads, app store impressions, cumulative MAU. Actionable metrics tie to a decision. Day-7 retention by acquisition source tells you where to spend next. Cumulative downloads tell you nothing. The test: if the metric goes up, does anyone change what they do tomorrow?
How do app metrics differ between iOS and Android?
Definitions match. The reporting layer does not. iOS uses SKAdNetwork (SKAN 4) for attribution, which delays and aggregates conversion data. Android still allows device-level attribution through Google Play. Per Adjust's mobile benchmarks, iOS retention reads 1 to 3 points lower than Android in most categories, mostly because of measurement gaps, not real behavior.
What is ARPDAU and why does it matter?
Average Revenue Per Daily Active User. Total revenue divided by DAU for the same day. It normalizes monetization across audience size, so a small game with 10,000 DAU can be compared to a giant with 10 million. Per Mixpanel's product benchmarks, top-quartile casual games clear $0.10 ARPDAU while top social apps clear $0.40.
How often should app metrics be reviewed?
Three cadences. Daily for installs, DAU, and crash rate. Weekly for retention curves, ARPDAU, and channel-level CAC. Monthly for LTV, payback, and cohort-level revenue. The mistake is reading every metric every day. Long-window metrics like LTV need at least 30 days of cohort maturity before they say anything truthful.