What's pertinent in publisher IDs?
It’s hard to keep chipping away at large amounts of data when your gut instinct tells you there’s nothing there. Publisher IDs tend to be exactly this sort of data: with persistence and smart analysis, surprising insights can pop out and provide invaluable targeting information for a specific app.
We recently published a case study on Gamasutra, where we took a closer look at Hardscore Games’ Star Admiral. By segmenting incoming in-app advertising traffic with app store genre data, we determined which genre of gaming apps Hardscore should advertise Star Admiral in to attract loyal users.
A key point is that data such as publisher IDs needs to be aggregated into fewer, but more meaningful, segments. In this case we hooked the list of app IDs up to our apptrace database and crunched out the gaming subgenre for each publishing app. The results of the case study are discussed in more detail over on Gamasutra, but I wanted to expand on the technicalities of the analysis itself here.
The first step in the process is to define a strong metric that suits both the type of analysis and the specific game. In the soft-launch stage, it’s crucial to attract strong early adopters. In the case of Star Admiral we segmented the samples rather narrowly, so we needed a metric voluminous enough to be statistically reliable: sessions. An adjust ‘session’ is a series of activities in which each activity is separated from the next by less than 30 minutes.
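The 30-minute session rule can be sketched in a few lines. This is a hypothetical illustration, not adjust’s actual implementation; the function name and the exact handling of the 30-minute boundary are assumptions:

```python
from datetime import datetime, timedelta

# Assumed rule: consecutive activities less than or equal to 30 minutes
# apart belong to the same session; a larger gap starts a new one.
SESSION_GAP = timedelta(minutes=30)

def count_sessions(timestamps):
    """Count sessions in a list of activity datetimes for one user."""
    ts = sorted(timestamps)
    if not ts:
        return 0
    sessions = 1
    for prev, curr in zip(ts, ts[1:]):
        if curr - prev > SESSION_GAP:
            sessions += 1
    return sessions

activity = [
    datetime(2014, 1, 1, 10, 0),
    datetime(2014, 1, 1, 10, 20),  # 20-minute gap: same session
    datetime(2014, 1, 1, 11, 5),   # 45-minute gap: new session
]
print(count_sessions(activity))  # → 2
```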
We then run the analysis as a cohort analysis, so that all activities are normalized to the number of days after install. This filters out interference from differing user lifespans: over time, the relative proportions of the acquired segments may change, and with the wrong technique this creates different, unrelated trends in the data we’re looking at.
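Normalizing to days after install means re-indexing every activity by its offset from the user’s install date, so users acquired at different times become comparable. A minimal sketch, with made-up users and dates:

```python
from collections import defaultdict
from datetime import date

# Hypothetical records: install date per user, and one row per activity day.
installs = {"u1": date(2014, 3, 1), "u2": date(2014, 3, 5)}
activities = [
    ("u1", date(2014, 3, 2)),
    ("u1", date(2014, 3, 8)),
    ("u2", date(2014, 3, 12)),
]

# Bucket activities by day-after-install instead of calendar date.
activity_by_day = defaultdict(int)
for user, day in activities:
    offset = (day - installs[user]).days
    activity_by_day[offset] += 1

# u1 on March 8 and u2 on March 12 both land in the day-7 bucket.
print(dict(sorted(activity_by_day.items())))  # → {1: 1, 7: 2}
```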
Finally, to correct for varying sample sizes in the different segments, divide the sessions by the number of users still active on that day after install (for example, day seven). This metric reflects engagement among surviving users and therefore does not encompass churn. If you want the metric to account for churn, use the number of installs, i.e. the total cohort size, as the base instead. What we’re doing here, though, is breaking down the different components within the given metric to focus on a single, simple truth.
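The division itself is simple; the choice of denominator is what changes the meaning. With illustrative numbers (all values here are invented), dividing by active users gives engagement among survivors, while dividing by installs would fold churn back in:

```python
# Hypothetical per-day aggregates for one segment:
# total sessions and distinct active users on each day after install.
sessions = {0: 500, 7: 120}
active_users = {0: 250, 7: 40}
installs = 250  # total cohort size

# Engagement of surviving users (churn excluded).
sessions_per_active_user = {d: sessions[d] / active_users[d] for d in sessions}
print(sessions_per_active_user[7])  # → 3.0

# Alternative base: total cohort size (churn included).
sessions_per_install = {d: sessions[d] / installs for d in sessions}
print(sessions_per_install[7])  # → 0.48
```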
With that in place, we calculate sessions per active user for each cohort week. This becomes our single metric, and it should solely reflect the trend in engagement for the different genres. As a final step we express the metric in relative terms, so that each week sums to one, for fast comparison.
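The relative-terms step is just normalization across genres within a week. With invented sessions-per-active-user values for one cohort week:

```python
# Hypothetical sessions-per-active-user by genre for a single cohort week.
week_metric = {"strategy": 2.0, "puzzle": 5.0, "quiz": 3.0}

# Scale so the week sums to one: each genre's share of total engagement.
total = sum(week_metric.values())
relative = {genre: value / total for genre, value in week_metric.items()}

print(relative["puzzle"])  # → 0.5
```

Because every week sums to one, weeks with very different absolute activity levels can be compared side by side.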
Put plainly, it’s surprisingly obvious that Star Admiral’s best audience was found in quiz or puzzle games, not other strategy games. As we discussed on Gamasutra, the audience that Star Admiral attracts and the niche in the market that this newly launched game occupies differ from those of other games that might appear similar at a superficial level.
Knowing that the underlying method is rigorous allows satisfactory, analysis-based conclusions to be drawn. In part, this requires tools that can perform the essential crunching, such as cohort analysis. The magic is that once you know which tools to use, the neatly organized data will reveal its true form.