Why Your Install Numbers Don't Match (And That's Normal)

Attribution discrepancies between platforms are expected. Learn why Facebook, your MMP, and the App Store show different install counts.

Justin Sampson

You check Facebook: 1,000 installs.

You check your MMP: 850 installs.

You check the App Store: 1,200 downloads.

Which number is right?

All of them. And none of them.

Attribution discrepancies aren't errors—they're an inherent characteristic of how mobile measurement works. Understanding why numbers differ helps you make better decisions despite the inconsistencies.

The Core Issue: Different Systems Count Different Things

Each platform defines "install" slightly differently:

App Store / Google Play:

  • Counts every download
  • Includes re-downloads by the same user
  • Includes downloads that were never opened
  • No attribution to marketing source

Ad Networks (Facebook, TikTok, etc.):

  • Count installs they attribute to their ads
  • Use their own attribution logic
  • May count probabilistic matches
  • Report by click date, not install date

Your MMP:

  • Counts installs with successful attribution
  • Deduplicates across networks
  • Applies fraud filtering
  • Reports by install date
  • Only counts first installs per device

These aren't measuring the same thing, so identical numbers would actually be suspicious.
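To make the three definitions concrete, here is a minimal sketch that counts one hypothetical download log three ways. The field names and sample rows are invented for illustration, not any platform's actual schema:

```python
# One hypothetical download log, counted three different ways.
raw_downloads = [
    {"device": "a", "opened": True,  "network": "Facebook", "fraud": False},
    {"device": "a", "opened": True,  "network": None,       "fraud": False},  # re-download
    {"device": "b", "opened": False, "network": None,       "fraud": False},  # never opened
    {"device": "c", "opened": True,  "network": "Facebook", "fraud": True},   # flagged install
]

# App store: every download counts, opened or not.
store_count = len(raw_downloads)

# Ad network: every install it attributes to its own ads, fraud-filtered or not.
facebook_count = sum(1 for d in raw_downloads if d["network"] == "Facebook")

# MMP: first install per device, SDK actually fired (app opened), fraud filtered.
seen = set()
mmp_count = 0
for d in raw_downloads:
    if d["opened"] and not d["fraud"] and d["device"] not in seen:
        seen.add(d["device"])
        mmp_count += 1

print(store_count, facebook_count, mmp_count)  # 4 2 1 -> three "correct" answers
```

Three different numbers from the same four events, and each one is internally consistent.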

Reason 1: Attribution Methodology Differences

Different attribution models lead to different results.

Last-click attribution (most MMPs):

  • Credits the last ad clicked before install
  • If a user clicks Facebook, then TikTok, then installs: TikTok gets credit

Self-attribution (ad networks):

  • Networks credit themselves if their ad was clicked within the lookback window
  • If a user clicks Facebook, then TikTok, then installs: Both might claim credit

Facebook might claim the install. TikTok might claim it too. Your MMP counts it only once, crediting TikTok as the last click.

Result: Ad networks report 150 installs combined, MMP reports 100 installs total.
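The double-claiming above can be sketched in a few lines of Python. The click log and timestamps are hypothetical:

```python
from datetime import datetime

# Hypothetical click log for one user who saw ads on two networks.
clicks = [
    {"network": "Facebook", "time": datetime(2024, 1, 3, 10, 0)},
    {"network": "TikTok",   "time": datetime(2024, 1, 4, 15, 0)},
]
install_time = datetime(2024, 1, 5, 9, 0)

# Self-attribution: every network with a pre-install click claims the install.
self_attributed = [c["network"] for c in clicks if c["time"] <= install_time]

# Last-click (typical MMP logic): only the most recent click gets credit.
last_click = max(clicks, key=lambda c: c["time"])["network"]

print(self_attributed)  # ['Facebook', 'TikTok'] -> networks report 2 combined
print(last_click)       # 'TikTok' -> the MMP reports 1, credited to TikTok
```

Summed across many users, this is exactly how network dashboards end up reporting more installs combined than the MMP reports in total.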

Reason 2: Timing Differences

Platforms report based on different timestamps.

Ad networks:

  • Often report by click date or impression date
  • An install on January 5 from a January 3 click gets reported on January 3

MMPs:

  • Report by install date
  • The same install gets reported on January 5

This creates rolling discrepancies that shift daily but balance out over time.
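A small sketch shows why daily numbers disagree while period totals reconcile. The dates and counts are hypothetical:

```python
from collections import Counter
from datetime import date

# Hypothetical installs, each tagged with its originating click date.
installs = [
    {"click_date": date(2024, 1, 3), "install_date": date(2024, 1, 3)},
    {"click_date": date(2024, 1, 3), "install_date": date(2024, 1, 5)},
    {"click_date": date(2024, 1, 4), "install_date": date(2024, 1, 5)},
]

# Ad networks bucket installs by click date; MMPs bucket by install date.
network_daily = Counter(i["click_date"] for i in installs)
mmp_daily = Counter(i["install_date"] for i in installs)

print(network_daily != mmp_daily)                              # True: daily numbers disagree
print(sum(network_daily.values()) == sum(mmp_daily.values()))  # True: totals reconcile
```

This is why single-day comparisons between dashboards are misleading, while comparisons over a full, closed date range tend to converge.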

Reason 3: Lookback Windows

Attribution windows determine which interactions get credit.

Example scenario:

  • User clicks ad on Day 1
  • User installs on Day 8

Network with 7-day lookback: Doesn't count the install.

Network with 28-day lookback: Counts the install.

Your MMP with 7-day lookback: Doesn't count the install.

If lookback windows aren't aligned, you get systematic discrepancies.
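The Day 1 click / Day 8 install scenario reduces to a single comparison. Note that boundary handling varies by platform; this sketch treats an install exactly `window_days` after the click as outside the window, matching the example above:

```python
from datetime import date

def within_lookback(click_day: date, install_day: date, window_days: int) -> bool:
    # Boundary semantics differ across platforms; here, exactly
    # `window_days` after the click counts as outside the window.
    delta = (install_day - click_day).days
    return 0 <= delta < window_days

click, install = date(2024, 1, 1), date(2024, 1, 8)  # Day 1 click, Day 8 install
print(within_lookback(click, install, 7))   # False -> 7-day network ignores it
print(within_lookback(click, install, 28))  # True  -> 28-day network counts it
```

Two platforms running this same check with different `window_days` values will disagree on every install that lands between the two boundaries.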

Reason 4: Fraud Filtering

MMPs filter installs that look fraudulent. Ad networks may or may not.

Common fraud patterns MMPs block:

  • Install farms (mass installs from data centers)
  • Click flooding (fake clicks trying to steal attribution)
  • Emulator installs
  • Bot traffic

Facebook might report 100 installs. Your MMP filters 15 as fraud. You see 85 installs.

The discrepancy indicates your MMP is protecting you from paying for fake installs.
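A toy version of this filtering is easy to sketch. The IP list, thresholds, and sample installs are all hypothetical, loosely modeled on the fraud patterns listed above:

```python
# Hypothetical fraud heuristics, loosely modeled on common MMP filters.
DATACENTER_IPS = {"203.0.113.7"}       # known data-center IPs (install farms)
MIN_CLICK_TO_INSTALL_SECONDS = 10      # implausibly fast installs suggest injection

installs = [
    {"ip": "198.51.100.4", "click_to_install_s": 3600, "emulator": False},  # legitimate
    {"ip": "203.0.113.7",  "click_to_install_s": 45,   "emulator": False},  # install farm
    {"ip": "198.51.100.9", "click_to_install_s": 4,    "emulator": False},  # too fast
    {"ip": "198.51.100.2", "click_to_install_s": 900,  "emulator": True},   # emulator
]

def looks_fraudulent(i: dict) -> bool:
    return (i["ip"] in DATACENTER_IPS
            or i["click_to_install_s"] < MIN_CLICK_TO_INSTALL_SECONDS
            or i["emulator"])

clean = [i for i in installs if not looks_fraudulent(i)]
print(len(installs), len(clean))  # 4 1 -> the network reports 4, your MMP shows 1
```

Real MMP fraud stacks are far more sophisticated, but the effect on your reports is the same: the filtered count is lower by design.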

Reason 5: Re-installs and Re-downloads

App stores count re-downloads. MMPs typically don't.

If someone:

  1. Downloads your app
  2. Deletes it
  3. Re-downloads it a week later

App Store count: 2 downloads

MMP count: 1 install (only the first counts)

Because of re-downloads alone, the App Store number will always run higher than your MMP number.
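The dedup logic is essentially a set membership check on a device identifier (the IDs below are hypothetical):

```python
# Hypothetical download log for two devices, one of which re-downloads.
downloads = [
    {"device_id": "device-a", "day": 1},
    {"device_id": "device-b", "day": 2},
    {"device_id": "device-a", "day": 9},  # re-download after deleting the app
]

store_count = len(downloads)                          # every download counts
mmp_count = len({d["device_id"] for d in downloads})  # first install per device only

print(store_count, mmp_count)  # 3 2
```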

Reason 6: Organic vs. Paid Separation

Different systems handle organic installs differently.

MMP logic:

  • Paid install: Attributed to a specific ad click
  • Organic install: No ad interaction within lookback window

Ad network logic:

  • May claim attribution to view-through (user saw ad but didn't click)
  • May use longer attribution windows
  • May include assisted conversions

An install your MMP calls "organic" might appear in Facebook's paid report as a view-through conversion.
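Side by side, the two labeling rules look like this. This is a simplified sketch: the windows are the typical values mentioned later in this article, and the MMP here is assumed to use click-through attribution only (many MMPs can also be configured for view-through):

```python
CLICK_WINDOW_DAYS, VIEW_WINDOW_DAYS = 7, 1

def mmp_label(click_days_ago):
    # Simplified MMP logic: paid only if a click falls inside the window.
    if click_days_ago is not None and click_days_ago <= CLICK_WINDOW_DAYS:
        return "paid"
    return "organic"

def network_label(click_days_ago, view_days_ago):
    # Network logic: falls back to view-through when there is no click.
    if click_days_ago is not None and click_days_ago <= CLICK_WINDOW_DAYS:
        return "paid (click-through)"
    if view_days_ago is not None and view_days_ago <= VIEW_WINDOW_DAYS:
        return "paid (view-through)"
    return "organic"

# User saw a Facebook ad half a day ago but never clicked:
print(mmp_label(None))           # organic
print(network_label(None, 0.5))  # paid (view-through)
```

The same install, labeled differently by each system, with neither system being wrong by its own rules.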

Reason 7: SKAN Delays and Aggregation

iOS installs measured through SKAN introduce additional discrepancies.

SKAN characteristics:

  • 24-72 hour delay in reporting
  • Crowd anonymity can drop attribution data
  • Some installs never generate postbacks

SKAN installs appear in your MMP days after they show up in the App Store download count, creating temporary mismatches that resolve over time.

Reason 8: Technical Implementation Issues

Configuration problems create larger discrepancies:

  • MMP SDK not implemented correctly
  • Postback URLs configured wrong
  • Attribution links not set up properly
  • Events not firing correctly

These create systematic undercounting in one direction.

What's Normal vs. What's a Problem

Normal discrepancies (5-15%):

  • Different attribution windows
  • Timing differences (click date vs. install date)
  • Fraud filtering
  • Platform methodology differences

Problem discrepancies (20%+ or systematic):

  • SDK not firing correctly
  • Postback configuration errors
  • Significant fraud
  • Attribution link issues

Small discrepancies are expected. Large, consistent gaps indicate technical problems.
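As a quick monitoring check, the rule of thumb above can be expressed as a one-line rate plus a threshold. The 15% cutoff comes from the ranges listed here; tune it for your own stack:

```python
def discrepancy_rate(network_installs: int, mmp_installs: int) -> float:
    """Relative gap between the network's count and the MMP's count."""
    return abs(network_installs - mmp_installs) / max(network_installs, mmp_installs)

def classify(rate: float) -> str:
    # 15% threshold taken from the rule of thumb above; adjust as needed.
    return "normal" if rate <= 0.15 else "investigate"

rate = discrepancy_rate(1000, 850)  # the Facebook vs. MMP example from the intro
print(round(rate, 2), classify(rate))  # 0.15 normal
```

Running this daily per network turns "the numbers feel off" into a concrete alert condition.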

How to Diagnose Large Discrepancies

When numbers diverge significantly:

1. Check attribution window alignment

  • Verify your MMP and ad networks use the same lookback windows
  • Align click windows (typically 7 days) and view windows (typically 1 day)

2. Verify SDK implementation

  • Test install tracking in development
  • Confirm MMP SDK fires on app open
  • Check SDK version is current

3. Review postback configuration

  • Confirm postback URLs are correct for each network
  • Test postbacks using network validation tools
  • Check that all required parameters are mapped

4. Compare time frames

  • Check if the discrepancy disappears when comparing identical date ranges
  • Account for SKAN delays (compare iOS data 3+ days later)

5. Analyze fraud rates

  • Check your MMP's fraud dashboard
  • Look for patterns (geographic concentrations, suspicious IP ranges)
  • Review rejection reasons for blocked installs
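Steps 4 and 5 can be partly automated. The sketch below compares totals over an identical date range while excluding days too recent for delayed SKAN postbacks to have landed; the daily counts, dates, and 3-day lag are all hypothetical:

```python
from datetime import date, timedelta

def comparable_days(start: date, end: date, today: date, skan_lag_days: int = 3):
    """Days in [start, end] old enough for delayed SKAN postbacks to have arrived."""
    cutoff = today - timedelta(days=skan_lag_days)
    day, out = start, []
    while day <= min(end, cutoff):
        out.append(day)
        day += timedelta(days=1)
    return out

def range_discrepancy(network_by_day: dict, mmp_by_day: dict, days) -> float:
    net = sum(network_by_day.get(d, 0) for d in days)
    mmp = sum(mmp_by_day.get(d, 0) for d in days)
    return abs(net - mmp) / max(net, mmp, 1)

# Hypothetical daily counts: the network reports 100/day, the MMP reports 90/day.
network = {date(2024, 1, d): 100 for d in range(1, 11)}
mmp = {date(2024, 1, d): 90 for d in range(1, 11)}

days = comparable_days(date(2024, 1, 1), date(2024, 1, 10), today=date(2024, 1, 10))
print(len(days))                                        # 7 -> the freshest days are excluded
print(round(range_discrepancy(network, mmp, days), 2))  # 0.1 -> within the normal band
```

If a gap that looks alarming in a day-by-day view shrinks into the normal band once the ranges are aligned and the recent days are held back, the problem was timing, not tracking.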

Which Number to Use

Different numbers serve different purposes:

Use your MMP as source of truth for:

  • Budget allocation decisions
  • ROAS calculations
  • LTV analysis
  • Cross-channel comparison

Use ad network numbers for:

  • Campaign optimization within that platform
  • Understanding platform-specific performance
  • Verifying postbacks are arriving

Use App Store numbers for:

  • Total download trends
  • Organic vs. paid ratio estimation
  • Seasonal pattern analysis

Your MMP provides the most consistent, deduplicated view across all channels. Use it for decisions.

Reducing Discrepancies

You can't eliminate discrepancies, but you can minimize them:

  • Align attribution windows across platforms
  • Implement MMP SDK correctly
  • Test postback flows regularly
  • Use consistent conversion event definitions
  • Update SDKs when new versions release
  • Monitor fraud rates

Even with perfect configuration, expect 5-10% variance. That's not a bug—it's how mobile attribution works.

FAQs

Why do install numbers differ across platforms?

Install numbers differ because of different attribution methodologies, timing windows, fraud filtering, and what each platform counts as an install. This is normal and expected in mobile app marketing.

What's an acceptable discrepancy rate?

Discrepancies of 5-15% between ad networks and MMPs are normal. Larger gaps (20%+) indicate configuration issues, fraud problems, or attribution window mismatches.

Which install count should I trust?

Use your MMP as the source of truth for decision-making. It provides deduplicated data across all networks and applies consistent attribution logic. Ad network counts are useful for optimization but not for absolute measurement.

Why does the App Store show more installs than my MMP?

The App Store counts every download, including re-installs, installs that never opened, and organic installs. Your MMP only counts first installs that it successfully attributed. The App Store number will always be higher.

Should I optimize based on ad network data or MMP data?

Use MMP data for budget decisions and cross-channel optimization. Use ad network data for platform-specific optimization (creative testing, audience targeting) since it's what their algorithms use for bidding.


Attribution discrepancies are frustrating but inevitable. Focus on trends within a single source of truth (your MMP) rather than trying to reconcile numbers across platforms. Small variances don't prevent good decision-making.

Tags: attribution discrepancy, install mismatch, MMP, data accuracy, tracking
