Our Head of Game Data, Chris, shares high-level ideas about how to structure game development around data.
This year, we are taking Idle Miner Tycoon to the next level and unlocking the game’s full potential with an initiative we call Data2020.
Want to know the full story behind the project? Check out this blog post by our CTO Oliver.
Data2020 is based on the idea that we need to become data-driven for further growth. Simply look around the industry and you’ll see proof of that idea everywhere – most F2P gaming companies are heavily data-driven, and so are many major AAA companies.
Data is the bread and butter of Games-as-a-Service (GaaS), and the simple fact of the matter is this: data allows you to focus on the changes that matter. As Andrew McAfee of MIT has argued, data and algorithms tend to outperform human intuition at scale.
At Kolibri Games, we have reached exactly that kind of scale, and to keep driving growth as a company we need to make decisions that work at that scale. Therefore, I’d like to discuss some high-level ideas about how to structure game development around data. Here is a five-step process for data-driven decision making that can be applied to most – if not all – decisions made on product teams.
I will provide mostly fictional examples, based on my experience and on our work on Idle Miner Tycoon.
Step 1: Pick the Right Metrics
Measure What Matters – the idea described in John Doerr’s book of the same name – is the foundation of being data-driven. For each action you or your team takes, you need to pick the right metrics to measure success by. This comes with a secondary condition: always measure your actions and always set goals!
For example:
- We want to make a code change to speed up loading times. Therefore, we have to measure loading times! They should go down!
- We want to change the prices of offers to encourage repeat purchases – we need to measure the frequency of second, third and fourth purchases.
This is also the point to decide whether you need an “analysis” or an “experiment”. Analysis provides data on what is already happening in the business, whereas experiments actively test out different approaches with different consumer or employee segments and measure the difference in response.
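To make the repeat-purchase metric concrete, here is a minimal sketch of how it could be computed from a purchase log. The file, the column names and the pandas-based approach are illustrative assumptions for this post, not a description of our actual pipeline.

```python
import pandas as pd

# Hypothetical purchase log: one row per IAP with the buyer and a timestamp.
# The file and column names are made up for this example.
purchases = pd.read_csv("purchases.csv")  # columns: player_id, purchased_at

# Rank each player's purchases chronologically (1st, 2nd, 3rd, ...).
purchases = purchases.sort_values("purchased_at")
purchases["purchase_rank"] = purchases.groupby("player_id").cumcount() + 1

# Metric: share of paying players who reach a 2nd, 3rd and 4th purchase.
paying_players = purchases["player_id"].nunique()
for rank in (2, 3, 4):
    reached = purchases.loc[purchases["purchase_rank"] >= rank, "player_id"].nunique()
    print(f"Players with a purchase #{rank}: {reached / paying_players:.1%}")
```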
Step 2: Set a Realistic Goal
Now that you know what you’re measuring, you should set your goal. This goal can be set with intuition, but it is often better designed with data. The goal should be written in terms of the metric you picked in step 1!
For example:
- We want to make a code change to speed up loading times – our goal: loading times should be reduced by 15%.
- We want to change the prices of offers to encourage repeat purchases – our goal: double the number of second purchases and increase third purchases by 50% (see the sketch below).
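As a minimal sketch, the goals above boil down to simple arithmetic on a measured baseline. The baseline numbers here are invented purely for illustration.

```python
# Turn a measured baseline into a concrete, checkable goal.
# All baseline values below are made up for illustration.

baseline_load_time_s = 6.0                                 # median loading time from step 1
target_load_time_s = baseline_load_time_s * (1 - 0.15)     # "reduce by 15%"

baseline_second_purchase_rate = 0.04                       # share of payers who buy a second time
target_second_purchase_rate = baseline_second_purchase_rate * 2  # "double it"

print(f"Loading time goal: <= {target_load_time_s:.2f}s (baseline {baseline_load_time_s:.2f}s)")
print(f"Second purchase goal: >= {target_second_purchase_rate:.1%} "
      f"(baseline {baseline_second_purchase_rate:.1%})")
```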
Step 3: Execute the Change
Now that you’ve set your goal, you can start designing your strategy to accomplish it. Data can help here but it can only tell you two things: What is happening and where. It cannot tell you how or why (it will just point you in the right direction). So what is important here is simple logic – design a feature with good reasoning on how it impacts the “what” and the “where” with a good “how”.
For example:
- We see our loading times spike when players enter mines, and this correlates with texture size (see the sketch after this list). We therefore introduce texture compression, which should result in faster load times, a smaller memory footprint, and dramatically improved rendering performance.
- We often push a discounted offer designed to introduce players to IAPs and the value they can bring, but not enough people buy anything afterwards. This is usually because the price and the content are too generous (it needs to be a taster – just enough for players to get a taste, but not enough to leave them fully satisfied), so we need to reduce the content of the offer relative to its price. In this case, I would lower both the price and the content (a 99c “sampling” offer).
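Here is a minimal sketch of the kind of analysis that points us at the “what” and the “where” in the loading-time example. The telemetry file and column names are assumptions made for illustration.

```python
import pandas as pd

# Hypothetical per-load telemetry: one row per mine load, with the measured
# load time and the total texture size of that mine. Names are illustrative.
loads = pd.read_csv("mine_loads.csv")  # columns: mine_id, load_time_s, texture_mb

# "What and where": which mines load slowly, and does texture size track with it?
per_mine = loads.groupby("mine_id")[["load_time_s", "texture_mb"]].mean()
print(per_mine.sort_values("load_time_s", ascending=False).head())

# A simple correlation is enough to point us toward texture compression;
# it does not prove causation, which is why step 4 measures the actual impact.
print("Correlation:", round(loads["load_time_s"].corr(loads["texture_mb"]), 2))
```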
Step 4: Measure the Impact
Once you have implemented the change, your success criteria are already in place, because you set the goal earlier. If you decided that analysis is enough, you can simply look at the metric you chose and make a decision: did the change work or not? If you decided you needed an “experiment”, this is where you analyze it, typically through A/B testing or simple group testing (a minimal sketch follows the examples below).
You can also do a deeper dive into the data to understand the true impact. Once you have determined whether the change worked, this is less important – you can usually move straight on to step 5 – but it becomes very useful in the cases where it DID NOT work. Keep in mind, though, that data can only tell you what happened and when it happened, not why, so dive only as deep as you need to understand those two things.
For example:
- Introducing compression reduced loading times by 35% – a great success compared to our 15% goal!
- We created a new replacement “conversion offer” that was both lower in value and cheaper (down to 99c), but we still didn’t increase the rate of second purchases enough! Since Apple doesn’t allow prices lower than 99c and the value is already quite low, we need to look elsewhere – the problem is probably not the offer construction but something in the economy after players buy the offer.
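For the experiment route, the evaluation can be as simple as comparing the metric between control and test groups and checking whether the difference is statistically meaningful. The sketch below uses a two-proportion z-test via statsmodels on invented numbers for the conversion-offer example; the group sizes, counts and choice of test are assumptions, and in practice you would use whatever your A/B framework provides.

```python
from statsmodels.stats.proportion import proportions_ztest

# Invented experiment results for the "conversion offer" example:
# how many players in each group made a second purchase.
second_purchases = [420, 465]   # control, test
players = [10_000, 10_000]      # players assigned to each group

stat, p_value = proportions_ztest(count=second_purchases, nobs=players)
control_rate, test_rate = (c / n for c, n in zip(second_purchases, players))

print(f"Control: {control_rate:.2%}, Test: {test_rate:.2%}, p-value: {p_value:.3f}")
# The goal from step 2 was to *double* the second-purchase rate; even a
# statistically significant uplift this small would miss that goal, which is
# exactly the kind of conclusion this step should produce.
```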
Step 5: Learn
Taking a page right out of Agile methodology: it’s important not only to measure the impact but also to look back at it retrospectively and summarize what you learned.
In literal terms: do a quick retro! Did it work? Did it not work? Why did it turn out that way? What could be done better? Since you have evidence of whether it worked, plus related data to help you understand why, you will learn with greater certainty and speed in the long run.
For example:
- Texture compression worked really well! We should probably always do this for every game we make!
- Players are really good at gaming economies. We can probably start with a fairly harsh economy and use data to make it more generous later on – doing it the other way around is often much harder.
And that’s it: Choosing metrics, setting goals, executing change, measuring impact and learning from it – these are the steps we take to make data an integral part of our decision-making process, turning a game for millions into a game for billions.
Does this way of working speak to you? Check out our open Data Team positions below!