Mobile video advertising: How to leverage your user data with Vungle
Product Content Strategist
Posted May 30, 2017
Of all the hours that people spend on their mobile phones per day, over 90 percent of that time is spent in apps. For app marketers, this represents a massive increase in the amount of data available about user behavior. One of Adjust's integration engineers, Rina Matsumoto, recently sat down with Maurice Tasker, Global Head of Performance Optimization at Vungle, the leading in-app video platform for performance marketers, to talk about how to leverage all of that data. From calculating lifetime value (LTV) to the effects of eliminating fraud from your dataset, here are some of the highlights of their conversation to help you raise the bar for your mobile video ads. To watch their entire presentation, head over to our webinars page, where you can also see their accompanying slides.
Rina: Why is it important for Vungle to do LTV optimization?
Maurice: At Vungle, we like to be proactive, so we want developers to hit their goals, whether it's scale or retention or ROAS. We really want to make sure that they see return. Some buyers have very intelligent BI systems in addition to the Adjust reporting, but even then some of those tools can lack vital pieces of Vungle-specific information that really no one but Vungle has access to.
So for those of you that buy from Vungle, you'll know that we allocate views to developers or buyers based on a combination of the cost per install that the developer's paying, and also the conversion rate or average conversion rate of their campaigns. That's basically an effective CPM. Let's imagine a developer has identified a single publisher who provides quality installs. They could do that through the Adjust reporting dashboard.
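The "effective CPM" Maurice describes can be made concrete with a minimal sketch. This is a hypothetical illustration of the arithmetic, not Vungle's actual allocation code: revenue per view for the network is the CPI times the view-to-install conversion rate, scaled to a thousand views.

```python
def effective_cpm(cpi: float, conversion_rate: float) -> float:
    """Effective CPM from a developer's CPI and view-to-install rate.

    Each view earns cpi * conversion_rate on average (installs per view
    times price per install), so per 1,000 views that is the eCPM.
    """
    return cpi * conversion_rate * 1000

# Hypothetical campaign: $2.50 CPI with a 1.2% view-to-install rate
print(round(effective_cpm(2.50, 0.012), 2))  # 30.0, i.e. a $30 eCPM
```

Two campaigns with very different CPIs can therefore compete equally for views if their conversion rates offset the difference.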
You can imagine that you've identified this source, and you've set a static CPI based on the average quality, or average return, that you see. Now [pointing to the slide], the plot on the right basically shows why this might not be the best solution, though you wouldn't know it from the averages alone. It shows the average conversion rate (the gray line) and the average seven-day return on ad spend, the amount of revenue you get within seven days of install (the green bars), as a function of the daily view number within this publisher. The daily view number is how many views a user has seen within one day. You can essentially see that the fewer views a user has seen, the more likely they are to convert, the more likely they are to download the app that's being advertised, and also the more likely they are to provide a higher return.
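The cohorting behind a plot like that can be sketched as follows. This assumes simple per-view records of the daily view number, whether the view converted, and seven-day revenue; the data and field layout are hypothetical, not Vungle's schema.

```python
from collections import defaultdict

# Hypothetical view records: (daily_view_number, converted, d7_revenue)
views = [
    (1, True, 4.0), (1, False, 0.0), (1, True, 2.5),
    (2, True, 1.0), (2, False, 0.0),
    (3, False, 0.0), (3, False, 0.0),
]

# Bucket views by how many ads the user had already seen that day
buckets = defaultdict(lambda: {"views": 0, "installs": 0, "revenue": 0.0})
for view_number, converted, revenue in views:
    b = buckets[view_number]
    b["views"] += 1
    b["installs"] += int(converted)
    b["revenue"] += revenue

for n in sorted(buckets):
    b = buckets[n]
    cvr = b["installs"] / b["views"]
    print(f"view #{n}: CVR={cvr:.0%}, 7d revenue=${b['revenue']:.2f}")
```

With real data, both the conversion rate and the seven-day revenue would typically fall as the daily view number rises, which is exactly why a single static CPI across the whole publisher leaves value on the table.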
Rina: On Adjust we send a bunch of data to your servers. Could you talk a little bit more about what you do with those data points?
Maurice: The first thing we do is ingest that [the data sent by Adjust] and process that information. We link it back to the original install, and therefore the original view, the original session, which essentially layers a lot of information on top of that single event. The next thing we'll do is we'll want to understand what are the client's KPIs or goals. Maybe it's a five percent seven-day return on ad spend, maybe it's a 20 percent day three retention. It varies pretty wildly. But it allows us to know what we should be aiming for. Once we have the client goals or KPIs, we're free to try and make sure that they're hitting those goals and maximizing their install value.
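The two example goals Maurice mentions, a seven-day return on ad spend and a day-three retention rate, reduce to simple ratios. A minimal sketch with hypothetical cohort numbers:

```python
def d7_roas(d7_revenue: float, spend: float) -> float:
    """Seven-day ROAS: revenue earned within 7 days of install / ad spend."""
    return d7_revenue / spend

def d3_retention(active_on_day3: int, installs: int) -> float:
    """Share of an install cohort still active on day 3."""
    return active_on_day3 / installs

# Hypothetical cohort: $1,000 spend, $80 revenue in 7 days,
# 120 of 500 installed users active on day 3
print(f"{d7_roas(80, 1000):.0%}")       # 8%, vs. a 5% goal
print(f"{d3_retention(120, 500):.0%}")  # 24%, vs. a 20% goal
```

Once the goal is expressed this way, the bidder (or an optimization manager) can compare each publisher cohort against it and adjust bids accordingly.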
We can do that in one of two ways, or using both. Either we can use the automated bidder tool (a low-touch tool built by our data science team), which uses historical data to try to choose the best CPI for a certain cohort of users based on the KPI or goal that the client has given us. Alternatively, we can do a fully managed service. That's where my team comes in: a global team of experienced 'optimization managers' who understand how to change campaign structures, which publishers to move, and which bids to change in order to reach the same goal.
The next step is kind of separate to that, and it's creative optimization. We have an in-house creative team based in London. They create a lot of templates that we can use for all of our different advertisers. What we can use the post-install data for is to really understand that creative, not only on a conversion-rate level, but also on whether it pays back on a ROAS level or a retention level. Finally, we produce reporting. It's quite similar to Adjust's, really, but maybe a little more tailored.
Rina: So you mentioned earlier about how clients set different KPIs. Could you talk a little bit more about what different targets you typically see from advertisers?
Maurice: Sure thing. It really depends on how the developer monetizes. We work with a range of different developers, from pure brands to gaming-specific apps to performance brands, lots of different types of applications. But the goals depend very much on how the app monetizes.
The majority of developers are probably looking at a return on ad spend. To make that a sensible metric for performance, a large portion of your revenue stream has to come through some kind of revenue event, typically an in-app purchase. Alternatively, there are some developers who look more toward retention, and that might be because they have goals of building a community in their app, or alternatively they monetize a lot through advertising, so they need people in their application to feed their revenue stream. And then there's a different portion, which is more typical for the performance brands as we see them. That's where an in-app purchase might not best describe the way that they make revenue. Instead, a cost per action, cost per active user, or an event completion, something like that, might make more sense. A good example is an app that's a marketplace, so it allows buying and selling between individuals. They might monetize by taking a percentage of each buying and selling interaction. Therefore, looking at any single purchase doesn't really make sense, because purchases can vary wildly in price, whereas seeing whether someone makes a purchase at all could be a better signal.
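Maurice's marketplace example can be sketched with toy numbers: when the app takes a cut of transactions whose prices vary wildly, the average purchase value is dominated by outliers, while the share of users who transact at all is a steadier signal. All data and the take rate below are hypothetical.

```python
from statistics import mean

# Hypothetical marketplace: each user's list of sale prices
users = {
    "a": [5.0],           # one cheap sale
    "b": [1200.0],        # one expensive sale
    "c": [],              # never transacted
    "d": [30.0, 45.0],
}
take_rate = 0.10  # the app keeps 10% of each transaction

avg_purchase = mean(p for sales in users.values() for p in sales)
buyer_rate = sum(1 for sales in users.values() if sales) / len(users)

print(f"avg purchase: ${avg_purchase:.2f}")            # skewed by the $1,200 sale
print(f"share of users who transact: {buyer_rate:.0%}")
```

Optimizing toward the event-completion rate (did the user transact?) rather than toward purchase value sidesteps the outlier problem Maurice describes.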
Rina: Our next topic is view-through attribution. For those of you who aren't familiar with view-through attribution, I'll talk briefly about how it works. Traditionally, the ad that drove the last click for a user before they install the app is the ad that the install is attributed to. Here at Adjust, we also have an option for advertisers to attribute installs not only by last click, but also by last view.
Maurice, maybe we can talk about how you and Vungle view view-through attribution?
Maurice: Historically, as you've said, the discussions that I've had around view-through attribution are around whether a view signifies enough of an engagement to warrant an install being attributed to it. Now, for typical banner advertising, I think that's very questionable indeed, especially on a mobile device where the screen is a lot smaller. I can't see how a view would signify an engagement. Now, for mobile video advertising, I would argue it's slightly different, especially for rewarded video. I think it's much harder to say there was no engagement at all, there. However, I think it is something that buyers should be extremely aware of and treat with care. Actually, the more organics that you have, or the larger your presence in the ecosystem, the more susceptible you are to paying for organics if view-through is misused. Typically the view-through window is 24 hours. Any further than that, and we've examined this at Vungle, you are pretty much just paying for organic users. So as long as you use the attribution sensibly, and you trust the partner, the video partner in this case, and you know that the ad experience is an engaging one, then I think it can be a very powerful tool. But it is something I would encourage all developers to treat very, very carefully.
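The waterfall being discussed, last click wins, then last view within a 24-hour window, otherwise organic, can be sketched as follows. The click window length and the data shapes are assumptions for illustration, not Adjust's actual implementation.

```python
from datetime import datetime, timedelta

VIEW_THROUGH_WINDOW = timedelta(hours=24)  # the typical window mentioned above
CLICK_WINDOW = timedelta(days=7)           # assumed; click windows vary by setup

def attribute(install_time, touchpoints):
    """Last click wins; fall back to last view within 24h; else organic.

    touchpoints: list of (timestamp, kind, network) with kind "click" or "view".
    """
    clicks = [t for t in touchpoints
              if t[1] == "click"
              and timedelta(0) <= install_time - t[0] <= CLICK_WINDOW]
    if clicks:
        return max(clicks)[2]  # most recent qualifying click
    views = [t for t in touchpoints
             if t[1] == "view"
             and timedelta(0) <= install_time - t[0] <= VIEW_THROUGH_WINDOW]
    if views:
        return max(views)[2]   # most recent qualifying view
    return "organic"

install = datetime(2017, 5, 1, 12, 0)
print(attribute(install, [(datetime(2017, 4, 30, 20, 0), "view", "vungle")]))  # vungle
print(attribute(install, [(datetime(2017, 4, 29, 12, 0), "view", "vungle")]))  # organic
```

The second case illustrates Maurice's warning: a view older than the window no longer claims the install, so stretching the window mostly reclassifies organic users as paid.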
Rina: Maurice, it would be awesome if you could talk a little bit about your experience with Adjust's fraud prevention suite, and how that may have affected your relationship with advertisers.
Maurice: This is quite an exciting one for us. Given that we have direct SDK relationships with all of our publishers, we see relatively little fraud. However, even that very small amount can lead to a lot of communication with our developers. A lot of back and forth; it can affect everything up to invoicing, financing, cashflow, things like that. But the great thing about the Adjust tool is that it's changed very little in terms of the relationship that we have with our advertisers, because it just happens in the background. That's predominantly because we don't get install callbacks that are fraudulent. Anything deemed fraudulent, we don't receive as a callback, and therefore we don't mistakenly pay a fraudulent publisher for that install. Really, it cuts off fraud at the source. Actually, and this is my great hope, if every developer in the marketplace used this tool, then I think mobile ad fraud would pretty much be obliterated at the source. Because what these fraudulent publishers rely on is things slipping through the cracks and not being caught by either the platform or the attribution provider.
Ready to brush up on your fraud knowledge? Our mobile fraud guide walks you through the most common types of fraud and how you can fight them. You can get it by clicking here.