
Discrepancies in data: Why don’t the numbers always match up?

Accuracy is essential in mobile attribution. When data comes in from multiple platforms, channels, and campaigns, however, the numbers don’t always add up. On one dashboard, you have 5,005 installs; on another, 7,246. Why are the numbers different? At Adjust, we know data accuracy is paramount for mobile marketers. Therefore, we’ll look at common discrepancies across platforms before zeroing in on issues only found with certain vendors, including Google, Apple, and Meta/Facebook.

What is data discrepancy?

Data discrepancy occurs when two or more comparable data sets don’t align. For example, app marketers may notice a numerical difference for a metric on one platform versus another. Different dashboards present data differently, and there are many possible reasons why the numbers don’t match.

How do you identify discrepancies in data?

Identifying discrepancies in data is simple. You compare two data sets for the same period of time and look for numbers that don’t match up. The real challenge is understanding what caused the discrepancies and how to reconcile them. It’s essential to do this quickly and confidently, as most marketers operate within thin margins and constantly need to make budget allocation decisions.
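
As a rough illustration, the comparison step itself can be automated. Below is a minimal Python sketch (using pandas) that merges two hypothetical daily install exports and flags days where the counts diverge by more than an arbitrary 5% threshold; the file names, column names, and threshold are assumptions for the example, not an Adjust export format.

    # Compare daily install counts from two exports and flag days that diverge.
    import pandas as pd

    adjust = pd.read_csv("adjust_installs.csv")      # assumed columns: date, installs
    network = pd.read_csv("network_installs.csv")    # assumed columns: date, installs

    merged = adjust.merge(network, on="date", suffixes=("_adjust", "_network"))
    merged["diff"] = merged["installs_adjust"] - merged["installs_network"]
    merged["diff_pct"] = merged["diff"] / merged["installs_network"] * 100

    # Flag days where the two sources disagree by more than 5% (arbitrary threshold).
    discrepancies = merged[merged["diff_pct"].abs() > 5]
    print(discrepancies[["date", "installs_adjust", "installs_network", "diff_pct"]])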

Reasons for data discrepancies

There are two main questions to address: how do you identify discrepancies in data, and how do you resolve them? Below, we answer both by walking through the common scenarios that lead to discrepancies. These are seen across all platforms and are mainly due to differences in how systems operate and report. We’ll also explain why you sometimes see discrepancies between Adjust data and the data from the platforms and networks where you’re running campaigns.

  1. Downloads vs. installs

    There’s a fundamental difference between a download and an install, but the two are often compared directly, which can lead some to think there are discrepancies between Adjust’s numbers and those of other platforms. So let’s look at what makes them incomparable.

    Downloads represent when a user downloads an app from a store. An install is an event after a download, constituting the first time a user opens an app.

    Adjust tracks installs, whereas store owners like Apple and Google track both downloads and installs. We only look at installs for two reasons. First, we don’t have access to Google’s and Apple’s download data, so installs are the information we can measure ourselves.

    Second, Adjust can only track conversions once our software development kit (SDK) is activated, which happens when the user opens the app for the first time. When the SDK fires for the first time, it lets us know that a new install has occurred.

    Discrepancies can occur when apps are downloaded but not opened. Reporting inconsistencies can also occur around the time of install: Adjust records the time of install as the moment the app is first opened, whereas Apple and Google record the time of download.

    With this issue, it’s a case of remembering that downloads on any platform do not equal installs on Adjust. So, in short, make sure you’re comparing installs to installs, not downloads to installs.

  2. User-based versus device-based installs

    Both Apple and Google count their installs based on the user's store account, whereas Adjust bases installs on the individual device-level advertising IDs.

    In this case, a discrepancy can occur when a user owns both an iPhone and an iPad and installs the same app on both devices. Here, Adjust would count two installs because we are receiving data from two advertising IDs. Apple would count one install, as the user has the same store account on each device.

    Additionally, when working with SKAdNetwork, we never receive the IDFA, so attribution is done at an aggregated level. For more information, check out SKAdNetwork 4.0 explained and Adjust’s privacy vision for 2022 and beyond.
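
    To make the counting difference concrete, here is a tiny Python sketch with invented records, going back to the iPhone/iPad example above: the same person installs the app on two devices under one store account, so a device-based count sees two installs while an account-based count sees one.

        # Invented data: one person, one store account, two devices.
        installs = [
            {"advertising_id": "IDFA-AAA", "store_account": "user@example.com"},  # iPhone
            {"advertising_id": "IDFA-BBB", "store_account": "user@example.com"},  # iPad
        ]

        device_based = len({i["advertising_id"] for i in installs})   # device-level count -> 2
        account_based = len({i["store_account"] for i in installs})   # store-account count -> 1
        print(device_based, account_based)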

  3. Time zones and geolocation

    The way Adjust works with a user's location (and, as such, their time zone) often differs from how several other platforms function.

    With regard to the location of users, Apple and Google base their data on the geolocation of the user's store account, while Adjust looks at the IP address of the user at the time of install.

    So, if a user has a UK app store account but is in Germany when they install an app, then Apple and Google record the user's downloads/installs to the UK, while Adjust will attribute the install to Germany. As such, you’ll sometimes see differences between the app stores and Adjust regarding time of install and location.

    For users’ geolocation, it comes down to what you wish to monitor: the user’s store account or the location of the install. When it comes to time zones, Adjust measures according to Coordinated Universal Time (UTC). Other platforms may report in different time zones; Google Ads, for instance, works on PST. You can either change the time zone within the Adjust dashboard or change the other platforms’ time zones to match UTC.
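
    To see how much a time zone offset alone can move installs between daily buckets, you can normalize timestamps before comparing. The sketch below is illustrative Python using the standard zoneinfo module; the Pacific-time assumption simply mirrors the Google Ads example above, and the timestamp is invented.

        # One install, reported on different calendar days depending on the time zone.
        from datetime import datetime
        from zoneinfo import ZoneInfo

        pacific_ts = datetime(2024, 3, 1, 23, 30, tzinfo=ZoneInfo("America/Los_Angeles"))
        utc_ts = pacific_ts.astimezone(ZoneInfo("UTC"))

        print(pacific_ts.date())  # 2024-03-01 in a Pacific-time report
        print(utc_ts.date())      # 2024-03-02 in a UTC report: same install, different day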

  4. App update effects

    This discrepancy often affects newer Adjust clients who have had apps available to install for some time.

    If an app did not initially launch in the stores with our SDK and only added it later, all "old" users who update the app will be tracked as new users by Adjust. Apple and Google will merely note an update.

    This leads to a spike in installs during the first couple of weeks after the update (potentially months, depending on the age and popularity of the app), followed by a sharp decline towards more realistic numbers. If your app had a large user base prior to using Adjust, it will likely take longer for the effect to balance out.

    If you have a dedicated account manager as part of your Adjust package, they will inform you of this when you first install the SDK, but it is something to keep in mind.

  5. Third-party store installs

    If you distribute your app (an APK with Adjust’s SDK integrated) beyond either the Google Play Store or App Store (for example, in a third-party store), Adjust will count those installs. Apple and Google will not.

    This often has a bigger effect on creating discrepancies for Android applications — where there are several competing stores beyond the Play Store. Ultimately, Adjust’s data creates a more holistic picture of all activity in this instance.

  6. Comparing events

    Google Ads, Facebook, and Adjust all use different attribution approaches for post-install events, which makes the numbers on each side not directly comparable:

  • Google Ads assigns events to the source of click and has a 30-day event attribution window by default.
  • Facebook assigns events to the source of click and has a 28-day event attribution window by default.
  • Adjust assigns events to the source of install or reattribution. We also don’t have an event attribution window. Instead, events are attributed to the user's source of install indefinitely (or until the point the user is reattributed). From the point of reattribution, any subsequent events triggered by the user will be assigned to the source of reattribution.

Let’s look at an example with Meta/Facebook: if a user installs via "Facebook Campaign A" and then clicks on an ad from "Facebook Campaign B" that triggers an event, on Adjust all events will be attributed to "Facebook Campaign A" because it’s the source of install.

However, on Facebook, because it saw a more recent click from that user on "Facebook Campaign B," it would assign the event to "Facebook Campaign B." This can create discrepancies between campaigns on each platform. Google Ads works in the same way.
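
To see in code why the same event lands in different campaigns, here is a small, purely illustrative Python sketch of the two attribution models described above. The campaign names, dates, and 28-day window come from the example; the functions are simplified stand-ins, not production attribution logic.

    # Two simplified attribution models applied to the same invented event.
    from datetime import datetime, timedelta

    install = {"source": "Facebook Campaign A", "time": datetime(2024, 5, 1)}
    clicks = [{"source": "Facebook Campaign B", "time": datetime(2024, 5, 10)}]
    event_time = datetime(2024, 5, 12)

    def attribute_by_install(install):
        # Adjust-style: events follow the source of install (or latest reattribution).
        return install["source"]

    def attribute_by_last_click(clicks, event_time, window_days=28):
        # Network-style: events follow the most recent click within the window, if any.
        recent = [c for c in clicks
                  if timedelta(0) <= event_time - c["time"] <= timedelta(days=window_days)]
        return max(recent, key=lambda c: c["time"])["source"] if recent else "unattributed"

    print(attribute_by_install(install))                # Facebook Campaign A
    print(attribute_by_last_click(clicks, event_time))  # Facebook Campaign B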

When fraud causes discrepancies: The main mobile fraud types

Mobile fraud is a significant cause of discrepancies in data sets. Below you’ll find a summary of the top mobile fraud types that ruin datasets and how Adjust handles them.

  1. SDK spoofing

    This type of mobile fraud generates legitimate-looking installs without genuine installs occurring. In SDK spoofing, fraudsters replicate how SDKs communicate in order to fabricate not only installs but also engagements and events.

    At Adjust, we offer an SDK Signature, providing multi-layered encryption that protects your data sets against SDK spoofing.

  2. Click injection

    Click injection occurs when fraudsters detect that an app is being downloaded and inject a click before the install completes, thereby taking credit for the install. With click injection fraud, you’ll end up paying for performance that isn’t there.

    Adjust combats click injection with our Click Injection Filter. This filter utilizes deterministic timestamps to prevent attribution to fake engagements (a simplified sketch of this kind of timestamp check follows after this list).

  3. Device farms

    Device farms and data centers use real or emulated devices to fake engagements and installs, obfuscating their locations or IP addresses to avoid detection. While device farms have been around for some time, they still exist and are, therefore, still a threat to your data sets.

    Adjust offers our Anonymous IP filter, which rejects any and all installs that are associated with compromised IPs.

  4. Click spamming

    Often, fraudsters use malware on user devices to send fake clicks to mobile measurement platforms. Credit for installs that were actually organic is then misattributed to publishers.

    At Adjust, we fight click spamming with our solution, Distribution Modeling. Our models are fed with years of research and data to help determine which clicks are real and which are fake.
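
As a simplified illustration of the timestamp logic behind a click injection check (mentioned under click injection above), the Python sketch below rejects clicks that are recorded only after the app download has already begun. The timestamps are invented, and this is not Adjust's production logic.

    def is_injected_click(click_ts, install_begin_ts):
        # A click that arrives after the download/install already started
        # cannot legitimately have driven that install.
        return click_ts >= install_begin_ts

    click_ts = 1_700_000_120          # e.g. a click timestamp in Unix seconds
    install_begin_ts = 1_700_000_100  # download began 20 seconds before the click

    print(is_injected_click(click_ts, install_begin_ts))  # True -> reject the claim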

How to resolve data discrepancies by platform

We’ve covered the discrepancies that affect everyone, so it’s time to look at more platform-specific issues. Meta/Facebook, Google, and Apple all work with us in different ways, so below, we’ll cover each platform and look at why the numbers sometimes appear unequal.

Meta/Facebook

The types of discrepancy that appear on Meta/Facebook can be broken into three sections: (1) when Facebook’s data shows more, (2) when it shows less, and (3) how re-engagement is calculated.

When Facebook shows higher numbers

By default, Facebook measures on a 28-day post-click, 24-hour post-view basis. These attribution settings can and should be modified in the Facebook interface to create a better basis for comparison between the platforms. Adjust’s default for Facebook, by contrast, is 7-day, last-click-only attribution, not 28-day. This means that Facebook’s numbers include an extra three weeks of data by comparison.

To help solve this discrepancy, take note of the attribution window you have set on Adjust; we provide the option to extend it to a maximum of 30 days. Then, in Facebook’s reporting, you can choose which window you want to view. The data should refresh according to that window and align more closely with the Adjust data.

When Facebook shows lower numbers

Just a quick point here: on Facebook, each ad account has a separate dashboard, while Adjust aggregates the data from all of your ad accounts.

If your numbers are lower on Facebook, make sure you check the reporting from every ad account and combine the figures into one to see whether the difference remains.

Facebook, Adjust, and re-engagement

Facebook approaches re-engagement differently, at least compared to Adjust.

If you’re using Adjust to track re-engagement, you’ll know we operate with a user-based model. That means one reattribution equals one user. We count one reattribution when an existing user engages with an ad and re-opens (or is deep linked into) the app after being inactive for a specific period of time. By default, this is a seven-day period.

So, if an existing user is inactive for seven days, but then clicks on a re-engagement ad in Facebook to re-open the app, Adjust will count one reattribution. If this same user later engages with the same ad again (and re-opens the app once more), this will not count as another reattribution, but instead as another session.

With Facebook, engagements are counted per event. For instance, if a user clicks on a re-engagement ad and engages in the app within 28 days (Facebook’s event attribution window), this is counted as an engagement for that ad. If the same user later clicks on the same ad and engages in the app multiple times, Facebook will count multiple engagements.
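
The two counting models can be contrasted with a small, invented Python sketch: user-based reattribution counts one re-engagement per user after an inactivity window, while event-based counting credits every qualifying ad interaction. The activity log, seven-day window, and event names are illustrative assumptions.

    from datetime import datetime, timedelta

    INACTIVITY = timedelta(days=7)

    # Invented activity log for one user; "ad_click_open" = re-opened the app via an ad.
    activity = [
        (datetime(2024, 6, 1), "session"),
        (datetime(2024, 6, 10), "ad_click_open"),   # re-opened via ad after 9 idle days
        (datetime(2024, 6, 11), "ad_click_open"),   # engaged with the same ad again
    ]

    reattributions = 0   # user-based model (Adjust-style)
    engagements = 0      # event-based model (Facebook-style)
    last_seen = None
    for ts, kind in activity:
        if kind == "ad_click_open":
            engagements += 1                                  # count every qualifying interaction
            if last_seen is not None and ts - last_seen >= INACTIVITY:
                reattributions += 1                           # only after the inactivity window
        last_seen = ts

    print(reattributions, engagements)  # 1 reattribution vs. 2 engagements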

Apple Search Ads

Apple’s visibility into last-click attribution

    Apple doesn't know whether a user clicked on an ad served by another partner in between a click on an Apple Search Ads ad and an install.

In addition, a user could click on an ad from Apple and install an app but not open it. Later, the user could click a different ad from a different source and open the app afterward. In this case, we'd attribute the install to the last click, whereas Apple would claim the install happened from their ad.

  • Last click attribution windows

    Adjust’s standard last-click attribution window is 7 days, whereas Apple Search Ads uses a fixed 30-day window.

  • Time zones

    Apple Search Ads reporting time zone is based on your account’s location, while Adjust reports in UTC.

  • Re-install

    If an existing user uninstalls an app but later clicks on an Apple Search Ads ad and re-installs the app, Apple will count this as a new install, while Adjust only counts it as a session.

    At Adjust, we use an internal Adjust ID to avoid counting a new install when a user who has LAT switched on deletes and re-installs an app; we count the re-install as a session instead. Keep this in mind when comparing Adjust data with a platform that doesn't have such a method of device recognition in place and therefore counts re-installs from users with LAT switched on as new installs, creating a discrepancy.

  • Limit ad tracking

    When a user has enabled Limit Ad Tracking (LAT) on their device, we do not receive a response from Apple’s Attribution API, so we’ll attribute the user as organic or to another source that has registered a click. However, Apple still claims these users, and its statistics will reflect these installs. Since the introduction of App Tracking Transparency (ATT), the number of users limiting tracking has increased dramatically, as only 4% of users enable tracking.

Google Play

If you’re looking at the "Installs" metric on the Google Play Store, remember that Google Play offers several variations of install metrics, none of which map directly to Adjust’s. If you contact us, we can work with you to find a way to compare the two directly, as this often needs a bit more support depending on your app business.

Google Ads discrepancies

  • Different attribution windows

    Google Ads works on a 30-day attribution window. Adjust’s attribution window is 7 days by default, though this can be changed.

  • Remarketing

    Comparing remarketing results of Adjust and Google Ads is difficult and something we don’t recommend.

Adjust counts a reattribution after a user who has come through a Google Ads remarketing campaign is brought back into the app via deep link reattribution. Google Ads, meanwhile, uses event-based reattribution, which is simply a different way of calculating than ours.

  • Events

    As with remarketing, we don’t recommend comparing events either.

    This is because Google Ads has a 30-day event attribution window. As such, if they see a click on an ad within 30 days prior to the event, they assign the event to the source of that click.

    Adjust, by contrast, assigns events to the source of the install (or reattribution) for the lifetime of the user.

    It’s important to remember that these are different methods of event attribution and will produce different results. Choose the event attribution setup that best fits how you want to report.

Final tip: Get all your data in one place

Mobile measurement for app marketing can seem overwhelming with multiple platforms, channels, ad networks, and campaigns to manage. In addition, comparing performance by switching in and out of dashboards is time-consuming and makes data discrepancies harder to spot and reconcile.

That’s why we recently launched our latest data analytics solution, Datascape. You can filter through and visualize an unlimited number of data sources in one screen, easily switching from actionable overviews to granular drill downs across time and apps. Our data accuracy and automation tools allow you to confidently and quickly make the best decisions to drive your ROI forward. If you’d like to see Datascape in action, request your demo.
