How to Use A/B Testing to Increase Tap-to-Install Rate
Not so long ago, a few well-chosen keywords and an overall quality user experience could get your application to the top. Alas, this approach will do you no good today. The saturation of the app market pushes mobile publishers all over the world to find new methods and tactics to strengthen their app store optimization strategy.
Unfortunately, lots of mobile marketers still believe that app store optimization (ASO) is only about keyword refinement. Indeed, a smart choice of keywords will improve your app’s discoverability, but it doesn’t guarantee installs.
Once users get to your product page in an app store, elements such as screenshots, video previews, and the icon start to play a key role in their decision-making. If you want to increase the tap-to-install rate of your store page, A/B testing can speed up and simplify the process.
A/B testing basics
A/B testing (or split testing) is a method of checking hypotheses by running experiments in which two versions of the same product page are compared. Such tests help determine the best-performing variation.
If you want to run impactful A/B experiments, make sure to include the following steps in your split-testing strategy:
- Analysis and brainstorming: there is no use running an A/B experiment unless you have a solid hypothesis worth proving. Market analysis can help you to come up with ideas for testing.
- Design of variations: the layouts of your product page elements should reflect the hypotheses mentioned above.
- The A/B experiment itself: you should ensure that traffic is divided equally between variations. That’s why it’s recommended to use Google Play Experiments or automated platforms like SplitMetrics.
- Results evaluation: at this step, you find out whether your original assumptions were right or wrong.
- Implementation of the results and follow-up experiments: A/B testing can make a real difference only if you turn it into an ongoing process. Remember that there’s always room for improvement.
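The traffic-splitting step above can be sketched in a few lines. This is a minimal illustration under my own assumptions (the function and hashing scheme are hypothetical, not how Google Play Experiments or SplitMetrics actually assign traffic): hashing a stable user ID gives each visitor the same variation on every visit and a roughly even split between pages.

```python
import hashlib

def assign_variation(user_id: str, variations=("A", "B")) -> str:
    """Deterministically assign a visitor to a page variation.

    Hashing the user ID (instead of picking at random on each
    visit) keeps the assignment stable: the same user always
    sees the same variant, and the split stays close to 50/50
    without storing any state.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variations)
    return variations[bucket]
```

Any real experiment platform layers analytics and exposure logging on top, but the core idea of an even, deterministic split is the same.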
Now let’s look at how thoughtful A/B testing can take your app’s performance to a whole new level.
Trusting your instincts may be beneficial in some fields, but nothing beats statistically significant results when it comes to app store optimization. You may love your redesigned set of app store screenshots, but this sentiment guarantees neither great performance in the app stores nor an increase in the tap-to-install rate.
The app Hobnob experienced a serious drop in conversion after rebranding its icon. The Hobnob team had to take action to remedy the situation, and A/B experiments came to their aid. A series of split-tests identified an icon that performed 64% better than the rebranded one. Once the new optimized icon was uploaded to the App Store, conversion grew considerably.
That’s why it’s highly recommended to launch split-tests before changing any element of your store page. A/B tests not only help eliminate bias and human error, they also give trustworthy results, as they rest on mathematical and statistical principles.
The Head of Mobile Marketing at Wargaming admitted that his team managed to cut out unnecessary office debates thanks to split-testing. So, once you start A/B testing, leave the ‘whose opinion is more important’ dilemma behind.
App store product page empowerment
Our latest study of user behaviour in the iOS 11 App Store showed that the average page scroll depth is only 30%. This means app store visitors decide whether or not to install your app while inspecting your screenshots. Even if a user starts reading your app’s short description, the chances that they’ll tap the ‘Read more’ button are fairly low. In fact, only 2% of app store visitors read the full description.
That’s why you should focus on perfecting your icon, subtitle, and screenshots if you want to maximize the conversion of your product page. A/B testing is the easiest and most efficient way of polishing these page elements.
It’s vital to understand that even the smallest details may turn out to be game changers, but everything needs testing. The developers at Empire City Casino ran a series of split-experiments testing small changes in screenshot backgrounds and captions. As a result, the app got a powerful new screenshot set that generated 31% more installs.
If you want to reap the maximum benefits from A/B testing your product page, you must follow these golden rules:
- Your variations should look like real store pages to ensure the natural behaviour of users.
- Only one hypothesis should be tested within one experiment.
- Statistically significant results can be attained only if you drive enough traffic to the test (the amount varies with the app’s baseline conversion and traffic quality). The principle is quite straightforward: the better the conversion, the fewer users you need for your experiment.
- It’s a good idea to run an A/B experiment for a full week to capture user behaviour on every day of the week.
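The statistical-significance rule above can be made concrete with a standard two-proportion z-test, which is one common way to compare two conversion rates. The sketch below is illustrative (the function name is my own, not part of any testing platform): it takes installs and visitors for two variations and returns a z-score and a two-sided p-value, with values below 0.05 being the conventional bar for significance.

```python
import math

def z_test_conversions(installs_a, visitors_a, installs_b, visitors_b):
    """Two-proportion z-test for tap-to-install rates.

    Tests the null hypothesis that both variations convert
    at the same underlying rate.
    """
    p_a = installs_a / visitors_a
    p_b = installs_b / visitors_b
    # Pooled conversion rate under the null hypothesis
    p_pool = (installs_a + installs_b) / (visitors_a + visitors_b)
    # Standard error of the difference between the two rates
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

For example, 100 installs from 1,000 visitors versus 130 installs from 1,000 visitors yields a p-value of roughly 0.035, i.e. a statistically significant difference. The formula also hints at why a page that already converts well needs fewer visitors: the same relative lift produces a larger absolute gap between the two rates relative to the noise.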
Assessment of ad channels and target audience
In general, A/B experiments have the potential to fortify your marketing efforts. Above all, such tests help publishers identify the most performant marketing channels. Lots of mobile marketers use A/B testing to qualify various ad channels like news, Facebook, cross promo, and different advertising networks.
Analysing such experimental ad campaigns helps you determine the highest-performing marketing channels and identify where your most relevant and loyal customers come from. You can then take advantage of these insights in your future marketing campaigns.
You can pinpoint your target audience using the same strategy. All it takes is split-testing your store product pages on various age groups, genders, locations, etc. For instance, one photo book creation app killed two birds with one stone by running A/B tests on its screenshots: the team not only increased the app’s conversion rate but also perfected its targeting.
They learned that their application should be targeted at people aged 25+ who are interested in digital photography, photo sharing, photo albums, VSCO, and wedding and travel photography. The team also found out that interests such as ‘canvas print’, ‘photo book’, and ‘canvas printers online’ were not that effective.
It goes without saying that a clear portrait of your ideal customers is essential for the effectiveness of any ASO activity. Let’s take app localization. It’s nearly impossible to adapt your product page for representatives of other cultures without careful A/B testing.
For example, FiftyThree took the time to localize the store page of their app “Paper” for a Chinese-speaking audience. A series of split-tests revealed that the brown background they used on their product pages didn’t trigger any engagement from the Chinese user base, whereas a set of multicolored screenshots resulted in a 33% increase in their tap-to-install rate.
Putting it all together
All things considered, A/B testing is a powerful method that solves lots of problems for mobile publishers: from boosting conversion and developing the right positioning to competitive research and gaining valuable insights into user behavior in the app stores.
There is no denying that A/B experiments require precision and dedication, but accurate results, saved time, and a maximized tap-to-install rate are definitely worth the effort.
Guest Post Author: Liza Knotko
Liza Knotko is the Content Marketing Manager at SplitMetrics, an app A/B testing tool trusted by Rovio, MSQRD, Prisma and Zeptolab. SplitMetrics helps mobile app developers hack app store optimization by running A/B experiments and analyzing results.