How to Troubleshoot MMP Discrepancies (2025 Guide)

Fix attribution mismatches between your MMP and ad platforms. Learn why install numbers differ and how to reconcile data across systems.

Justin Sampson

You check your MMP dashboard and see 1,000 installs from Facebook.

You check Facebook Ads Manager and see 1,300 installs.

The numbers don't match. Is one platform wrong, or are they both right?

Data discrepancies between MMPs and ad platforms are normal. What's not normal is not understanding why they exist or which number to trust when making budget decisions.

Here's how to identify, troubleshoot, and resolve the most common attribution mismatches.

Why Discrepancies Happen

Attribution is not an exact science. Different platforms use different methodologies, time windows, and fraud filters.

The main sources of discrepancies:

  • Attribution windows: MMPs and ad platforms use different lookback periods for clicks and views
  • Fraud filtering: MMPs filter fraudulent installs that ad platforms count as valid
  • Conversion timing: When the install is recorded varies by platform
  • Attribution methodology: Last-click vs multi-touch vs probabilistic models
  • SKAdNetwork limitations: iOS privacy restrictions create inherent measurement gaps

A 10-20% discrepancy is typical. Anything above 30% suggests a configuration problem.
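To make that threshold concrete, here is a minimal sketch of the gap calculation, using the install numbers from the opening example (the function name and the convention of measuring the gap against the ad platform's count are my own assumptions):

```python
def discrepancy_pct(mmp_installs: int, platform_installs: int) -> float:
    """Gap between MMP and ad platform counts, as a percentage of the platform's count."""
    if platform_installs == 0:
        raise ValueError("ad platform reported zero installs")
    return abs(platform_installs - mmp_installs) / platform_installs * 100

# The opening example: 1,000 installs in the MMP vs 1,300 in Facebook Ads Manager.
gap = discrepancy_pct(1000, 1300)
print(f"{gap:.1f}%")  # roughly 23% -- above the typical 10-20%, so worth investigating
```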

Common Discrepancy Scenarios

Scenario 1: MMP Shows Fewer Installs Than Ad Platform

This is the most common scenario.

Likely causes:

  1. Fraud filtering: Your MMP is removing invalid installs
  2. Attribution window mismatch: Ad platform uses longer window than MMP
  3. SDK implementation issue: MMP SDK isn't firing on all installs
  4. Delayed postbacks: MMP hasn't received all conversion data yet

How to diagnose:

Check your MMP's rejected installs report. If you see significant fraud filtering (click injection, SDK spoofing), that explains the gap.

Compare attribution windows in both platforms. If Facebook uses 7-day click / 1-day view and your MMP uses 7-day click / 0-day view, view-through installs won't match.

Scenario 2: MMP Shows More Installs Than Ad Platform

Less common, but it happens.

Likely causes:

  1. Multi-attribution: MMP attributes install to Facebook, but user clicked multiple ads and ad platform credited a different source
  2. Reattribution: MMP is counting reinstalls or reattributions
  3. SDK duplication: MMP SDK is firing multiple times for the same install
  4. Time zone differences: Reporting periods don't align

How to diagnose:

Check if your MMP allows multi-attribution or gives credit to multiple touchpoints. If so, one install might be counted under multiple sources.

Review reattribution settings. Some MMPs count reinstalls as new conversions.

Scenario 3: Conversion Events Don't Match

Install counts are close, but post-install event numbers diverge significantly.

Likely causes:

  1. Event mapping errors: Event names or parameters don't match between platforms
  2. Conversion value schema mismatch: SKAdNetwork conversion value mapping is incorrect
  3. Event timing: MMP records events in real-time, ad platform may batch or delay reporting
  4. Revenue tracking differences: Different revenue attribution logic

How to diagnose:

Check that event names are identical across platforms. "purchase" and "Purchase" are different events.

Verify SKAdNetwork conversion value schema matches between MMP and ad platform. Mismatched schemas create incorrect event attribution.

Step-by-Step Troubleshooting Process

Step 1: Define the Discrepancy

Before you can fix it, quantify it:

  • What metric is mismatched? Installs, conversion events, revenue?
  • Which platforms? MMP vs Facebook, MMP vs Google, MMP vs TikTok?
  • How large is the gap? 10%, 30%, 2x?
  • Is it consistent? Does the discrepancy occur daily or intermittently?

Document the specific numbers and timeframes. "Facebook shows 1,300 installs for Jan 1-7, MMP shows 950 for the same period" is actionable. "The numbers don't match" is not.
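One lightweight way to document discrepancies in that actionable form is a structured record like the sketch below (the class and field names are illustrative, not part of any MMP's API):

```python
from dataclasses import dataclass

@dataclass
class DiscrepancyReport:
    """One observed mismatch, recorded with enough detail to act on."""
    metric: str           # e.g. "installs", "purchases", "revenue"
    source: str           # e.g. "facebook", "tiktok"
    period: str           # e.g. "2025-01-01..2025-01-07"
    mmp_value: float
    platform_value: float

    @property
    def gap_pct(self) -> float:
        # Gap measured against the ad platform's count.
        return abs(self.platform_value - self.mmp_value) / self.platform_value * 100

report = DiscrepancyReport("installs", "facebook", "2025-01-01..2025-01-07", 950, 1300)
print(f"{report.source} {report.metric} gap for {report.period}: {report.gap_pct:.0f}%")
```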

Step 2: Check Attribution Windows

Attribution windows are the most common source of discrepancies.

Ad platform default windows:

  • Facebook: 7-day click / 1-day view
  • TikTok: 7-day click / 1-day view
  • Google App Campaigns: 30-day click / 1-day view
  • Apple Search Ads: 30-day tap

MMP default windows (varies by platform and configuration):

  • Adjust: 7-day click / 1-day view (default, configurable)
  • AppsFlyer: 7-day click / 1-day view (default, configurable)
  • Branch: 7-day click / 0-day view (common configuration)

If your MMP uses different windows than your ad platforms, discrepancies are expected.

Fix: Align attribution windows. Set your MMP windows to match your primary ad platforms, or accept the discrepancy as normal variance.
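A simple configuration diff can catch window mismatches before they show up as discrepancies. The settings below are hypothetical values you would copy out of each dashboard, not defaults pulled from any API:

```python
# Hypothetical window settings (in days) transcribed from each dashboard.
AD_PLATFORM_WINDOWS = {
    "facebook": {"click": 7, "view": 1},
    "google_app_campaigns": {"click": 30, "view": 1},
}
MMP_WINDOWS = {
    "facebook": {"click": 7, "view": 0},          # view-through disabled in the MMP
    "google_app_campaigns": {"click": 7, "view": 1},
}

def window_mismatches(platform_cfg: dict, mmp_cfg: dict) -> list:
    """List every source/window pair where the MMP and ad platform disagree."""
    issues = []
    for source, windows in platform_cfg.items():
        for kind, days in windows.items():
            mmp_days = mmp_cfg.get(source, {}).get(kind)
            if mmp_days != days:
                issues.append(f"{source}: {kind} window {mmp_days}d (MMP) vs {days}d (platform)")
    return issues

for issue in window_mismatches(AD_PLATFORM_WINDOWS, MMP_WINDOWS):
    print(issue)
```

With the values above, both the disabled Facebook view window and the 7-day vs 30-day Google click window are flagged.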

Step 3: Review Fraud Filtering

MMPs filter fraudulent installs. Ad platforms generally don't (or use much less aggressive filtering).

Check your MMP's fraud rejection reports:

  • What percentage of installs are being rejected?
  • Which fraud types are being filtered? (click injection, click spam, SDK spoofing)
  • Which sources have the highest rejection rates?

If 20-30% of installs from a specific source are rejected for fraud, that explains why MMP numbers are lower.

Fix: This isn't a problem to fix; it's working as intended. Your MMP is protecting you from paying for fake installs. If rejection rates seem abnormally high (>50%), investigate whether your fraud filter settings are too aggressive.
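A per-source rejection-rate pass over the MMP's rejected-installs report makes both checks above mechanical. The counts and the 50% threshold here are illustrative:

```python
def rejection_rate(accepted: int, rejected: int) -> float:
    """Share of all reported installs that the MMP rejected as fraud, in percent."""
    total = accepted + rejected
    return rejected / total * 100 if total else 0.0

# Hypothetical per-source (accepted, rejected) counts from a rejected-installs report.
sources = {"network_a": (800, 250), "network_b": (950, 30)}

for name, (accepted, rejected) in sources.items():
    rate = rejection_rate(accepted, rejected)
    flag = " <- check if filter settings are too aggressive" if rate > 50 else ""
    print(f"{name}: {rate:.1f}% rejected{flag}")
```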

Step 4: Validate SDK Implementation

If your MMP consistently shows 50%+ fewer installs than ad platforms across all sources, you likely have an SDK implementation issue.

Common SDK issues:

  • SDK not initialized before attribution call
  • SDK initialization wrapped in conditional logic that doesn't always execute
  • App crashes before SDK can send install event
  • Network requests blocked by app transport security settings
  • SDK version mismatch between iOS and Android

Fix: Review your SDK integration. Use your MMP's testing tools to validate that install events fire correctly in development and staging environments.

Check your MMP's raw event logs. If you see install events in logs but they're not appearing in dashboards, the issue is post-processing, not SDK implementation.
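The raw-log check can be folded into a rough triage function. The thresholds below (50% of the platform count, 90% log-to-dashboard survival) are illustrative cutoffs, not values any MMP documents:

```python
def diagnose_install_gap(raw_log_installs: int,
                         dashboard_installs: int,
                         platform_installs: int) -> str:
    """Rough localisation of a large install gap; thresholds are illustrative."""
    if raw_log_installs < platform_installs * 0.5:
        return "SDK implementation: install events are not reaching the MMP"
    if dashboard_installs < raw_log_installs * 0.9:
        return "post-processing: events appear in raw logs but not in dashboards"
    return "no large gap at the SDK or processing layer"

# 400 installs in raw logs vs 1,300 on the ad platform points at the SDK.
print(diagnose_install_gap(raw_log_installs=400,
                           dashboard_installs=390,
                           platform_installs=1300))
```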

Step 5: Check Event Mapping

For conversion event discrepancies, event mapping is usually the culprit.

Verify:

  • Event names match exactly (case-sensitive)
  • Event parameters are mapped correctly
  • Revenue values are passed in the same currency and format
  • Events are sent at the same trigger point (e.g., both fire on purchase confirmation, not cart add)

Common mistakes:

  • MMP tracks "purchase_complete" but ad platform expects "purchase"
  • Revenue sent as cents in one platform, dollars in another
  • Event fired on button click in one platform, transaction completion in another

Fix: Standardize event schemas across all platforms. Create a source-of-truth event spec document and ensure every platform implementation matches it exactly.
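A source-of-truth spec is easiest to enforce when it's machine-checkable. This sketch validates one platform's event config against a hypothetical spec (the spec fields and event names are examples, not a standard):

```python
# A hypothetical source-of-truth event spec shared by every platform integration.
EVENT_SPEC = {
    "purchase": {"currency": "USD", "unit": "dollars", "trigger": "transaction_complete"},
    "signup": {"currency": None, "unit": None, "trigger": "account_created"},
}

def validate_event(name: str, config: dict) -> list:
    """Return the ways one platform's event config deviates from the spec."""
    if name not in EVENT_SPEC:   # case-sensitive: "Purchase" fails here, as it should
        return [f"unknown event name {name!r}"]
    spec = EVENT_SPEC[name]
    return [f"{key}: {config.get(key)!r} != {want!r}"
            for key, want in spec.items() if config.get(key) != want]

print(validate_event("Purchase", {}))  # name case mismatch
print(validate_event("purchase", {"currency": "USD", "unit": "cents",
                                  "trigger": "transaction_complete"}))  # cents vs dollars
```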

Step 6: Account for SKAdNetwork Limitations

On iOS 14.5+, SKAdNetwork introduces measurement gaps that create unavoidable discrepancies.

Why this happens:

  • SKAdNetwork uses conversion values (0-63) to encode events, creating lossy compression
  • Postbacks are delayed 24-48 hours, creating timing mismatches
  • Not all installs result in SKAdNetwork postbacks due to user opt-out or technical failures

Expected impact:

10-30% of iOS installs may not attribute correctly through SKAdNetwork. This creates discrepancies between what your MMP sees (SKAdNetwork + deterministic attribution) and what ad platforms see (their internal tracking).

Fix: You can't eliminate SKAdNetwork discrepancies, but you can minimize them by ensuring your conversion value schema is correctly configured and matches between your MMP and ad platforms.
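To see why the 0-63 conversion value is lossy, here is one way a schema might pack signals into those 6 bits. The bit layout is purely illustrative; real schemas are defined in your MMP's conversion value configuration:

```python
def conversion_value(purchased: bool, revenue_bucket: int, day: int) -> int:
    """Pack three signals into SKAdNetwork's 6-bit conversion value (0-63).

    Illustrative layout: bit 5 = purchase flag, bits 2-4 = revenue bucket (0-7),
    bits 0-1 = install-day cohort (0-3). Everything finer-grained is lost.
    """
    if not (0 <= revenue_bucket <= 7 and 0 <= day <= 3):
        raise ValueError("signal out of range for the 6-bit layout")
    return (int(purchased) << 5) | (revenue_bucket << 2) | day

cv = conversion_value(purchased=True, revenue_bucket=3, day=1)
print(cv)  # 45, i.e. 0b101101
```

If the MMP and the ad platform decode those 6 bits with different layouts, the same postback turns into different events on each side, which is exactly the schema mismatch described above.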

Step 7: Verify Time Zones and Reporting Periods

Simple but often overlooked: are you comparing the same time periods?

Check:

  • Time zone settings in MMP and ad platform dashboards
  • Reporting date range (Jan 1-7 vs Jan 2-8)
  • Whether platforms use install date or click date for reporting
  • UTC vs local time zone discrepancies

Fix: Standardize reporting on UTC or a single time zone. Always verify date ranges match exactly when comparing reports.
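A worked example of the time-zone trap, assuming (hypothetically) that the ad platform reports in US Pacific time while the MMP reports in UTC:

```python
from datetime import datetime, timezone, timedelta

# Hypothetical setup: ad platform dashboard in US Pacific, MMP dashboard in UTC.
pacific = timezone(timedelta(hours=-8))
platform_install = datetime(2025, 1, 7, 20, 30, tzinfo=pacific)  # late Jan 7 locally
mmp_install = platform_install.astimezone(timezone.utc)          # early Jan 8 in UTC

print(platform_install.date(), "vs", mmp_install.date())
# The same install lands on different reporting days, so a Jan 1-7 report
# on one platform and a Jan 1-7 report on the other cover different installs.
```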

Platform-Specific Discrepancy Notes

Facebook / Meta

Facebook typically shows 10-20% more installs than MMPs due to:

  • More permissive attribution logic
  • View-through attribution that MMPs may filter
  • Delayed conversion reporting (Facebook may backfill conversions, MMPs report in real-time)

If the discrepancy is >30%, check attribution windows and event mapping.

TikTok

TikTok discrepancies are often larger (20-35%) due to:

  • Aggressive view-through attribution
  • Less sophisticated fraud detection on TikTok's side
  • Event mapping issues with TikTok's SDK

Ensure your TikTok event names exactly match your MMP event names, including capitalization.

Google App Campaigns

Google uses 30-day attribution windows by default, much longer than most MMPs' 7-day windows.

This creates significant discrepancies for apps with long consideration cycles. A user who clicks a Google ad on Day 1 but installs on Day 15 will count in Google but not in an MMP using a 7-day window.

Fix: Either extend your MMP's attribution window to 30 days for Google campaigns or accept the discrepancy.

Apple Search Ads

Apple Search Ads uses a 30-day tap window and has its own attribution API that MMPs integrate with.

Discrepancies are usually small (<10%) if the integration is properly configured.

If you see large gaps, verify that your MMP's Apple Search Ads integration is active and that the attribution API is properly connected.

When Discrepancies Are Acceptable

Not every discrepancy needs fixing.

Acceptable ranges:

  • 10-15% variance: Normal due to attribution methodology and fraud filtering
  • 15-25% variance: Expected for platforms with aggressive view-through attribution
  • 20-30% variance on iOS: Expected due to SKAdNetwork limitations

When to investigate:

  • >30% variance consistently: Indicates configuration issue
  • Sudden changes: If discrepancy was 10% and jumps to 40%, something changed
  • Single-source outliers: If Facebook matches within 10% but TikTok is off by 50%, TikTok integration has an issue

Which Number Should You Trust?

When MMP and ad platform numbers differ, which one is "correct"?

Short answer: Trust your MMP for optimization decisions.

Why:

MMPs provide cross-platform attribution, allowing you to compare performance across Facebook, TikTok, Google, and other sources in a unified view.

MMPs filter fraud, giving you cleaner data for budget allocation.

MMPs use consistent methodology across all sources, making cross-channel comparisons valid.

When to use ad platform numbers:

  • Platform-level optimization (e.g., which Facebook ad sets to scale)
  • Reporting to ad platform account managers
  • Debugging delivery or bidding issues within a single platform

Ad platforms typically show higher install counts because they don't filter fraud and use more generous attribution logic. That's expected.

Use MMP data for strategic budget allocation. Use ad platform data for tactical campaign optimization within each channel.

Preventing Future Discrepancies

Once you've resolved current discrepancies, prevent new ones:

1. Standardize Configuration

Document and enforce:

  • Attribution window settings for each platform
  • Event naming conventions
  • Revenue tracking formats
  • Fraud filter settings

When launching new campaigns or platforms, reference this documentation to ensure consistency.

2. Implement Event Schema Validation

Use your MMP's event debugging tools to validate that events fire correctly before launching campaigns.

Create a testing checklist:

  • Install event fires on first app open
  • Purchase event fires with correct revenue value
  • Custom events match naming schema
  • SKAdNetwork conversion values map correctly

3. Monitor Discrepancy Trends

Set up weekly reporting that compares MMP vs ad platform numbers.

Track discrepancy percentage over time. If it suddenly changes, investigate immediately.
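The sudden-change check can be automated with a few lines. The jump threshold of 15 percentage points is an illustrative choice, not a standard:

```python
def flag_trend(weekly_gaps: list, jump_pp: float = 15.0) -> list:
    """Indices of weeks where the discrepancy moved more than `jump_pp` points."""
    return [i for i in range(1, len(weekly_gaps))
            if abs(weekly_gaps[i] - weekly_gaps[i - 1]) > jump_pp]

# Hypothetical weekly MMP-vs-ad-platform gap percentages.
gaps = [11.0, 12.5, 10.8, 41.0, 39.5]
print(flag_trend(gaps))  # flags the week the gap jumped from ~11% to 41%
```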

4. Document Known Gaps

Some discrepancies are unavoidable (SKAdNetwork, long attribution windows, view-through attribution).

Document these known gaps so stakeholders understand why numbers don't match perfectly.

FAQs

Why do my MMP and Facebook install numbers not match?

Attribution window differences are the most common cause. Facebook uses a 7-day click / 1-day view default, while many MMPs use different windows. Fraud filtering, conversion timing, and attribution methodology (last-click vs multi-touch) also create discrepancies of 10-30%.

What is an acceptable discrepancy range?

A 10-20% discrepancy between MMP and ad platform numbers is normal due to attribution window differences and fraud filtering. Discrepancies above 30% indicate configuration issues that need investigation.

Should I trust my MMP or ad platform numbers?

Trust your MMP for cross-platform optimization and fraud-filtered data. Ad platforms will always show higher numbers because they don't filter fraud and use more permissive attribution logic. Use MMP data for budget decisions.

How do I fix SKAdNetwork discrepancies?

You can't eliminate SKAdNetwork discrepancies entirely; they're inherent to iOS privacy restrictions. Minimize gaps by ensuring your conversion value schema is correctly configured and matches between your MMP and ad platforms.

What if my MMP shows more installs than my ad platform?

This suggests multi-attribution (MMP crediting one source while ad platform credits another), reattribution of existing users, or SDK duplication issues. Check reattribution settings and verify your SDK isn't firing multiple install events.


Data discrepancies are frustrating but manageable. By understanding why they occur and following a systematic troubleshooting process, you can identify legitimate issues and separate them from expected variance. Focus on consistency and use your MMP as the source of truth for strategic decisions.

Tags: MMP, attribution, data discrepancy, troubleshooting, mobile analytics
