
Demystifying Cohort KPIs, part III: Tracking custom user journeys with Event KPIs

If you’ve been following our Demystifying Cohort KPIs series, you’ve probably built a solid understanding of how to leverage adjust’s cohort analysis for your own benefit. Over the past few weeks, we’ve discussed the basic metrics for reviewing cohorts, as well as the Cohort KPIs that help you understand your revenue. In this post, we’ll be looking at Event KPIs, which give you valuable insight into the user behavior of your cohorts.

If you haven’t already, be sure to read Part I of the series to build a fundamental understanding of cohorts, and Part II for a more advanced look at reading and understanding cohort reports.

In-depth looks at user behavior with event cohorts and conversion KPIs

So, why should you care about Event Conversion KPIs? When you combine Event Conversion KPIs with a set of meaningful custom events inside your app, you are rewarded with metrics that will help you optimize the user journey. With adjust, you can build a cohort based on specific triggered events, resulting in cohorts made up solely of users who exhibited a specified behavior. This allows you to dissect the behavior of those users through close analysis. So what will these cohorts look like, and how can you make sense of them?

A quick refresher: the makeup of our model cohort

To keep things consistent, we’ll reuse the parameters from the last two installments of our series and observe the same users in most of this week’s calculations. Remember that our model is built around today being Monday, with three users having installed our app last week (ending with yesterday, Sunday). To make it easier, we even named them: Alice installed our app on Tuesday, Bob on Friday and Charley on Sunday (the last day).

Here’s the Cohort Size table for this cohort:

Segment \ Day After Install  0  1  2  3  4  5  6  7
Last week’s users            3  3  2  2  1  1  1  0

Cohort Size for our self-designed cohort

As we calculate the metrics for this post, we’ll gain some better insight into Alice, Bob and Charley’s behaviors. With this in mind, let’s start looking at our KPIs.

Event KPIs explained

So, here we are. All the KPIs for this week’s article are laid out below. Grab some coffee, pick up a pencil, and get ready – by the end of this list, you’ll be an expert.

For simplicity’s sake, we will deal with just one event in our examples throughout. In practice, you can use multiple events for each of the KPIs here.

Converted Users

Let’s say the event we want to count as a conversion is completed a level. Converted Users states the number of unique users in the cohort who completed a level for each day-after-install. Let’s say only Alice and Bob triggered this event:

DOW \ DAI   0   1   2   3   4   5   6   7
Mon         0   0   0   0   0   0   0   0
Tue         0   0   1   0   1   0   0   na
Wed         0   0   0   0   0   0   na  na
Thu         0   0   0   0   0   na  na  na
Fri         1   0   1   0   na  na  na  na
Sat         0   0   0   na  na  na  na  na
Sun         0   0   na  na  na  na  na  na

Converted Users per install date

As we see, Alice completed a level on Day 2 after-install and then again on Day 4 after-install. Bob completed one on his install day, and again on Day 2 after-install.

Tracker \ DAI    0  1  2  3  4  5  6  7
Converted Users  1  0  2  0  1  0  0  0

Converted Users, cohorted

Note that if a user triggers an event more than once in a single day, they still contribute only 1 to the Converted Users metric for that day. However, they are counted as a converted user on every day-after-install on which they triggered the event: if they triggered the event on two different days within their cohort period, they contribute to the Converted Users on each of those two days.
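
To see these mechanics in code, here’s a minimal sketch in plain Python (not the adjust API; the concrete dates are made up so that the weekdays match our running example) that derives Converted Users per day-after-install from a raw event log:

```python
from datetime import date

# Last week ran Mon 2016-05-02 through Sun 2016-05-08 in this sketch.
installs = {"Alice": date(2016, 5, 3),   # Tuesday
            "Bob": date(2016, 5, 6),     # Friday
            "Charley": date(2016, 5, 8)} # Sunday

# Raw "completed a level" events as (user, event_date) pairs.
events = [
    ("Bob",   date(2016, 5, 6)),  # Bob on his install day (Day 0)
    ("Alice", date(2016, 5, 5)),  # Alice on Day 2 after-install
    ("Alice", date(2016, 5, 5)),  # same user, same day: deduplicated below
    ("Bob",   date(2016, 5, 8)),  # Bob on Day 2 after-install
    ("Alice", date(2016, 5, 7)),  # Alice on Day 4 after-install
]

# Converted Users per day-after-install: count unique users per DAI,
# so repeated events on the same day contribute only once.
converted = {}  # DAI -> set of users who converted on that DAI
for user, day in events:
    dai = (day - installs[user]).days
    converted.setdefault(dai, set()).add(user)

converted_users = [len(converted.get(d, ())) for d in range(8)]
print(converted_users)  # [1, 0, 2, 0, 1, 0, 0, 0], as in the cohorted table
```

Note how Alice’s two events on Day 2 collapse into a single contribution, while her conversions on two different days each count.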

Converted Users Size

This metric states the total number of unique converted users, i.e. the count of users in the cohort who have triggered the event at least once. Each converted user contributes to the Converted Users Size on every day-after-install covered by their cohort period, starting from their install day.

Tracker \ DAI      0  1  2  3  4  5  6  7
Last week’s users  2  2  2  2  1  1  1  0

Converted Users Size per tracker

On Day 0 after-install, we already count 2 converted users: Bob, who converted on his install day, and Alice, who didn’t complete a level until Day 2 after-install but still counts from her install day onwards, because she converts at some point during her cohort period. Even on Day 1 after-install, when nobody triggered the event, both remain accounted for as converted users.

You’ll recall from Part I that cohort size means the size on a particular day-after-install; the same logic applies here. On Day 4 after-install, we only have 1 converted user, Alice. Bob can’t contribute to Day 4 after-install anymore, because he only installed 3 days ago.

Like Cohort Size, the Converted Users Size remains constant for complete cohorts for every day-after-install: even a user who first converts on Day 6 after-install contributes to the Converted Users Size beginning on the install date. In the case of incomplete cohorts, Converted Users Size again behaves like Cohort Size: it can never grow, but only shrink with days-after-install – as the table above demonstrates.
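
This cohort-size-like behavior can be sketched in a few lines of plain Python (again with made-up dates matching the running example, where "today" is Monday):

```python
from datetime import date

today = date(2016, 5, 9)  # the Monday of the running example
installs = {"Alice": date(2016, 5, 3), "Bob": date(2016, 5, 6),
            "Charley": date(2016, 5, 8)}
converted = {"Alice", "Bob"}  # users who triggered the event at least once

# A converted user contributes on every day-after-install their cohort
# window covers, starting at the install day, regardless of when they
# first converted.
def max_dai(user):
    return (today - installs[user]).days

converted_users_size = [sum(1 for u in converted if max_dai(u) >= d)
                        for d in range(8)]
print(converted_users_size)  # [2, 2, 2, 2, 1, 1, 1, 0], as in the table
```

Bob drops out after Day 3 because his cohort window is only that long, which is exactly why the metric can shrink, but never grow, with days-after-install.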

Conversion Distribution

The Conversion Distribution is derived by dividing converted_users by converted_users_size. With this KPI, you can estimate the conversion probabilities for each day-after-install. In simple terms, the Conversion Distribution tells us the following for each day-after-install: given that a user triggers event X at some point, what is the likelihood that they do it on this particular day-after-install?

Our running example is too small to produce a meaningful Conversion Distribution, so let’s look at a sample Conversion Distribution table.

Tracker \ DAI      0       1       2       3       4       5       6      7
Last week’s users  79.66%  26.03%  14.79%  16.27%  11.39%  18.33%  2.38%  0.00%

Conversion Distribution per day-after-install

This table probably raises some questions. Firstly, we say distribution, but why does the entire row sum to much more than 100%? Remember that the Converted Users metric can count the same user on multiple days-after-install, while each user contributes at most 1 to the Converted Users Size on any given day-after-install. To read the above table in a probabilistic sense, we’d say: given that a user completes a level at some point, the probability that they complete a level on the install day is 0.7966, on Day 1 after-install 0.2603, and so forth.

Let’s also note that this KPI doesn’t have a fixed shape. For example, it’s not guaranteed that the value on the install day will be 100%, or even close, because of the above-mentioned behavior of the Converted Users Size. In general, there are no expectations as to where the maximum or minimum probabilities will be found in the row: the Conversion Distribution depends crucially on the specific event and the concrete type of app. Level completion could peak on the install day, while other conversion events, such as the purchase of coins, will only be triggered by highly engaged users, leading to lower initial Conversion Distribution figures.
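
As a minimal sketch (plain Python, not the adjust API), here is the Conversion Distribution computed for our tiny running example; None stands in for na where the denominator is zero:

```python
def conversion_distribution(converted_users, converted_users_size):
    # converted_users / converted_users_size per day-after-install;
    # None ("na") when no converted user's cohort window covers that day.
    return [c / s if s else None
            for c, s in zip(converted_users, converted_users_size)]

# Inputs from the running example (too small to be meaningful, per the
# text, but they show the mechanics).
dist = conversion_distribution([1, 0, 2, 0, 1, 0, 0, 0],
                               [2, 2, 2, 2, 1, 1, 1, 0])
print(dist)  # [0.5, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, None]
```

The non-zero entries already sum to 250%, which illustrates why a Conversion Distribution row can exceed 100%.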

Conversion Per User

Remember Retention Rate? Conversion Per User is the event conversion rate for the cohort, derived as converted_users divided by cohort_size. Let’s say we’re looking at Conversion Per User for the event completed a level on Day 2 after-install. We can read the metric as the ratio between the number of users who completed a level on the second day after they installed the app and the total number of users who have had the app installed for at least 2 days.

Here’s the table for our running example:

Tracker \ DAI      0     1  2  3  4  5  6  7
Last week’s users  0.33  0  1  0  1  0  0  0

Conversion Per User per day-after-install

For the install day, we get a Conversion Per User of 0.33, because only one of the three users completed a level. On Day 1 after install, no user completed a level. On Day 2 after-install, both users in the cohort completed at least one level. The Conversion Per User then yields 1, or 100%: every user that had a chance to complete a level, by virtue of having the app for at least two days, did so.

Conversion Per Active User

This metric differs from Conversion Per User only in the denominator: Conversion Per Active User is derived by dividing converted_users by retained_users. At Day 2 after-install, you’d have the ratio between the number of users who triggered the event 2 days after they installed the app and the number of users who returned to the app 2 days after they installed it. For the current example, the table would change to include na values for after-install days on which no user returned to the app, such as Day 5 after-install.
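
Both ratios can be sketched with one small helper (plain Python; the retained-users figures come from the Events Per Active User table later in this post, and None stands in for na):

```python
def ratio(numerators, denominators):
    # Per-day-after-install ratio, with None ("na") wherever the
    # denominator is zero.
    return [n / d if d else None for n, d in zip(numerators, denominators)]

converted_users = [1, 0, 2, 0, 1, 0, 0, 0]
cohort_size     = [3, 3, 2, 2, 1, 1, 1, 0]
retained_users  = [3, 1, 2, 0, 1, 0, 0, 0]

conversion_per_user = ratio(converted_users, cohort_size)
conversion_per_active_user = ratio(converted_users, retained_users)

print(conversion_per_user)         # 0.33 on Day 0, 1.0 on Days 2 and 4
print(conversion_per_active_user)  # None ("na") on Days 3, 5, 6 and 7
```

Note how only the denominator changes between the two KPIs, and how the na values land exactly on the days-after-install with no active users.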

Events

Unlike conversions, which measure whether a user triggered an event on a given day or not, events can be triggered multiple times per day and per session. The events metric simply shows us the absolute number of events triggered in a given period. Using the adjust Pull API, you could base the analysis on a single event, a few select ones, or all of the events you have created for your app.

Just as with all other KPIs, you can choose to calculate the Events metrics with install-date segmentation, or with tracker segmentation.

Let’s work with the cohort parameters from the running example, but instead of only three users, let’s make it a bit more interesting by adding quite a few more.

DOW \ DAI    0    1    2    3    4    5    6    7
Mon        112   98  107   88   92   43   41   78
Tue        129  139  111  117   95   77   71   na
Wed        148   77   91  101   99   69   na   na
Thu         78   94   51   61   49   na   na   na
Fri         41   13   13   89   na   na   na   na
Sat         12   17   45   na   na   na   na   na
Sun         13   34   na   na   na   na   na   na

Events per day-after-install by install-date

Now, these numbers look a little more realistic.

This table depicts the total sum of events triggered by users in the cohort for each day-after-install. Let’s also assume these installs came from more than one tracker – then, we can look at the tracker segmentation for the Events KPI.

Tracker \ DAI    0    1    2    3    4    5    6    7
Network1       123  101   89   55   31   19    7    3
Network2       410  371  329  401  304  170  105   75

Events per day-after-install by tracker

Note that the totals per column in the above two tables will always be the same. This is because the two tables simply show different segmentations of the same data.
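
We can verify this with a quick sketch that sums each day-after-install column in both tables (data copied from the tables above, with na entries dropped):

```python
# Install-date segmentation of the Events KPI, rows Mon..Sun.
by_install_date = [
    [112, 98, 107, 88, 92, 43, 41, 78],
    [129, 139, 111, 117, 95, 77, 71],
    [148, 77, 91, 101, 99, 69],
    [78, 94, 51, 61, 49],
    [41, 13, 13, 89],
    [12, 17, 45],
    [13, 34],
]

# Tracker segmentation of the same events.
by_tracker = [
    [123, 101, 89, 55, 31, 19, 7, 3],        # Network1
    [410, 371, 329, 401, 304, 170, 105, 75],  # Network2
]

def column_totals(rows, n_cols=8):
    # Sum each day-after-install column, skipping rows that end early.
    return [sum(row[d] for row in rows if len(row) > d) for d in range(n_cols)]

print(column_totals(by_install_date))  # [533, 472, 418, 456, 335, 189, 112, 78]
print(column_totals(by_tracker))       # same totals: same data, sliced differently
```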

Events Per Active User

Events Per Active User is calculated by dividing events by retained_users. This metric tells us how many times the average user triggered the event, given that they opened the app on that day-after-install. Events Per Active User is useful for revealing the peaks and drop-offs in active users’ engagement, as measured by the event in question. Let’s make up some data for our model cohort and the event completed a level. The following table shows how many levels an active user completed, on average, on each day-after-install.

Tracker \ DAI            0     1    2    3   4   5   6   7
Events                  14     0   15    0   3   0   0   0
Retained Users           3     1    2    0   1   0   0   0
Events per Active User   4.67  0    7.5  na  3   na  na  na

Events Per Active User per day-after-install

As the table shows, our active users completed 4.67 levels on the day of install, on average, and the most levels are completed, on average, on Day 2 after-install. Note the na values for the days on which no users were active in the app. Finally, we see that by Day 4 after-install, users no longer complete as many levels as before. To retain our active users better, we might consider offering them a free pass on a level once they have played two levels, or increasing the game’s difficulty at a slower pace.
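
The computation behind the table is a simple element-wise division, which a short plain-Python sketch can reproduce (None stands in for na, and we round to two decimals as the table does):

```python
def events_per_active_user(events, retained_users):
    # events / retained_users per day-after-install; None ("na") where
    # nobody was active in the app.
    return [round(e / r, 2) if r else None
            for e, r in zip(events, retained_users)]

epau = events_per_active_user([14, 0, 15, 0, 3, 0, 0, 0],
                              [3, 1, 2, 0, 1, 0, 0, 0])
print(epau)  # [4.67, 0.0, 7.5, None, 3.0, None, None, None]
```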

Events Per Converted User

You can calculate Events Per Converted User by dividing events by converted_users. This is similar to the previous KPI in that it averages over a subset of our users’ behavior, but the condition for a user to be included in the analysis is not that they merely opened the app; rather, only users who converted on the specified event at least once are part of the group. Let’s consider our completed a level example from above. Level completion is now both the conversion and the event counted. Take a look at the Events Per Converted User table:

Tracker \ DAI               0   1    2   3   4   5   6   7
Events                     14   0   15   0   3   0   0   0
Converted Users             1   0    2   0   1   0   0   0
Events per Converted User  14  na  7.5  na   3  na  na  na

Events Per Converted User per day-after-install

As before, we see that the average number of completed levels dropped from 7.5 on Day 2 after-install down to only 3 on Day 4 after-install. But this table also shows that the 7.5 level completions on Day 2 do not stem from an increase in engagement compared to the install day: the one user who converted on the install day completed 14 levels, so the users who complete levels at all play heavily from the very start. This suggests that we really should test why this number is dropping by Day 4 after-install.

Events Per User

Finally, let’s look at Events Per User, which is derived by dividing events by cohort_size. This KPI is most useful if you want to find the average activity of all acquired users, for example if you can associate that activity directly with revenue.
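
Sticking with the made-up event counts from the running example, a quick sketch (None stands in for na once the cohort window ends):

```python
def events_per_user(events, cohort_size):
    # events / cohort_size per day-after-install; None ("na") when the
    # cohort window no longer covers that day.
    return [e / s if s else None for e, s in zip(events, cohort_size)]

epu = events_per_user([14, 0, 15, 0, 3, 0, 0, 0],
                      [3, 3, 2, 2, 1, 1, 1, 0])
print(epu)  # [4.666..., 0.0, 7.5, 0.0, 3.0, 0.0, 0.0, None]
```

Because the denominator is the whole cohort rather than only the active users, Days 1, 3, 5 and 6 yield 0 here instead of the 0 and na values seen in Events Per Active User.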

And voilà, you’re an expert!

You’ve made it through the first three articles on cohorted KPIs and now know all you need to know when working with event-based KPIs. You’re ready to query KPIs from us to analyze your app performance. You can leverage these KPIs to understand how your users advance through your app, and to find out where and why they’re dropping off.

Remember to keep exploring the possibilities offered by our Cohort reporting tool, and feel free to ping us with any questions.
