The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.
We recently released a Halloween update for Kickerinho. It included seven themed costumes: a zombie, a mummy, a skeleton, and so on. Not a very big update, but it added some fresh content. You can check out the promo trailer here to get a better idea.
We were quite surprised to see that the update had no positive effect on the download rate. Quite the opposite, actually: we noted a substantial decline.
You need to know that we have a tradition of changing the game icon with each content update. It's a strategy many publishers use: it helps our users notice that something new has arrived, and that impulse could potentially win back churned players. Seems reasonable, right? What could go wrong?
We did some investigating and came to the conclusion that the icon we chose for the Halloween update was to blame. Why? Take a look at the downloads graph:
During the Halloween update we had 42% fewer downloads (compared to the corresponding period). Once Halloween had ended, we reverted to the previous icon (marked on the graph). As you can see, shortly after that the download rate recovered to almost the same level as before.
These are the two icons that made the difference:
Lesson learned? A/B test your icon. Always. No excuses.
Happily, it's very easy to do thanks to a feature of the Google Play Developer Console: experiments. Google lets you show a test icon to a portion of the traffic visiting your store page and then compares the conversion rates. You can also test the description, screenshots, and promo graphics. Pretty cool, no?
We used it to test our winter icon. These were the icons we wanted to test:
And the results:
Our "gut feeling" favored version A, but as it turned out, all of them performed worse than the "default" one. It was also clear that icons with sunglasses did better than those without. So we ran another test.
D was similar to B, the winner of the previous test, but with a different scarf color. E, on the other hand, had the scarf removed, because the then-current version was also scarfless and it had performed better than any of the three test icons.
Version E turned out to be the winner. Do note that we used a smaller sample this time (hence the confidence ranges are wider than in the previous test), but despite that it was clear which version should be chosen.
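To illustrate why a smaller sample widens the reported ranges, here is a minimal sketch computing 95% Wilson score intervals for a conversion rate. The visitor and conversion counts are invented for illustration, not our actual experiment data, and the Wilson interval is just one common way to express such ranges (not necessarily the exact method Google uses):

```python
import math

def wilson_interval(conversions, visitors, z=1.96):
    """95% Wilson score confidence interval for a conversion rate."""
    if visitors == 0:
        return (0.0, 0.0)
    p = conversions / visitors
    denom = 1 + z**2 / visitors
    center = (p + z**2 / (2 * visitors)) / denom
    margin = (z / denom) * math.sqrt(
        p * (1 - p) / visitors + z**2 / (4 * visitors**2)
    )
    return (center - margin, center + margin)

# Hypothetical variants with the same 30% conversion rate
# but very different sample sizes:
large = wilson_interval(3000, 10000)  # 10,000 store visitors
small = wilson_interval(150, 500)     # only 500 store visitors

print(f"large sample: {large[0]:.3f} - {large[1]:.3f}")
print(f"small sample: {small[0]:.3f} - {small[1]:.3f}")
# The smaller sample yields a noticeably wider interval,
# so variants need a bigger true difference to separate clearly.
```

This is also why, with a small sample, two variants can look like a tie: their intervals overlap even when one is genuinely better.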
I encourage you to A/B test your icons. You can never be sure what your users will like, and gut feelings can point you in the wrong direction. Skip testing, like we did with the Halloween update, and you could face some severe consequences. Since it's so easy to A/B test with the experiments feature, there is no excuse not to do so!