A/B Testing in ASO: A Guide for Beginners
App marketers often wonder how to drive more users to download their apps. This is where A/B testing comes in. As a marketing staple, A/B testing can potentially lead to double-digit uplifts in your conversion rate (impressions to downloads) when applied to ASO. In this blog, we’ll delve into what A/B testing is for ASO, how to set up your first A/B tests, and some common mistakes to avoid along the way.
What is A/B testing for ASO?
A/B testing for ASO means testing two (or more) variations of an element on your app’s page (e.g. alternate versions of your screenshots) to determine which one appeals most to store visitors. You can set the percentage of store traffic that will see a variant rather than the original element, but you can’t choose which user profiles fall into that percentage. By comparing the results of the test, you can then determine which variant is most likely to increase your app installs.
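To make “comparing the results” concrete, here is a minimal sketch in Python of the kind of comparison the store consoles run for you behind the scenes: a standard two-proportion z-test on install and impression counts. All numbers below are hypothetical, chosen purely for illustration.

```python
from math import erf, sqrt

def two_proportion_z_test(installs_a, imps_a, installs_b, imps_b):
    """Two-sided z-test: does variant B's conversion rate differ from A's?"""
    p_a = installs_a / imps_a
    p_b = installs_b / imps_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (installs_a + installs_b) / (imps_a + imps_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal approximation
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical week of data: original page (A) vs. new screenshots (B)
p_a, p_b, p_value = two_proportion_z_test(1_200, 40_000, 1_340, 40_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  p-value: {p_value:.3f}")
```

With these made-up numbers, variant B converts at 3.35% versus 3.00% for the original, and the low p-value suggests the difference is unlikely to be noise. In practice the consoles report confidence for you; the sketch just shows what that judgment rests on.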
The store traffic that can see your A/B tests includes:
- Visitors who find your app in the Browse/Explore tabs.
- Visitors who see your app when searching for a keyword your app ranks for.
- More broadly, any visitor who lands on your app page one way or another.
Why does A/B testing matter in ASO?
A/B testing allows you to make data-driven decisions about which ASO strategies will increase your app’s conversion rate.
Moreover, A/B testing is an opportunity to observe how the traffic you receive behaves. It helps you better understand the expectations of your target market so you can adjust your app page accordingly. A variant that performed worse than the original version can tell you what doesn’t appeal to your audience. This is a valuable insight that can be leveraged in any of your future marketing efforts.
Yelp could run A/B tests to better understand which screenshots lead to a better conversion rate.
Preparing for your first A/B test
Conducting your first A/B test can be a daunting task. The key to success is good preparation; it’s essential to establish a solid hypothesis before designing your A/B tests.
There are four elements to a solid hypothesis for your A/B test:
- First, define the problem. The basis for an A/B test should be that you suspect an issue with your current app page (highlighted by data or users’ reviews) but are uncertain how to fix it. If you’re already certain how to solve the problem, you can skip A/B testing and implement the fix directly.
- Second, determine what element you want to change on your app page. Which element do you think is at the root of the problem and could improve your app’s conversion rate if changed?
- Third, decide how you want to implement this change. Determine the exact change you want to make in the problematic element – for instance, by adding or removing an item within it.
- Finally, consider whether the change is visible enough to also change the behavior of store visitors. If you change something in a section of your app page that is seen by less than 5% of your store traffic, it’s very unlikely to have a big impact on your conversion rate.
Try to stick to one hypothesis and one variable (one element changed) per A/B test; this way, you can more easily measure the results of your test. If 5 variables are changed in one test, knowing which one caused a result – positive or negative – will be complicated.
Publishing your first A/B test (App Store and Google Play)
Once you have established your hypothesis, follow these simple steps:
- Head to your store console and find its A/B testing tab or page (the “Store Listing Experiments” page for Google Play or the “Product Page Optimization” tab for the App Store).
- Click on “Create a test/an experiment” and follow the instructions.
Source: App Store Connect
You’ll be asked to set up a few parameters before you can publish the test. Here are the most important ones:
- Choosing the traffic proportion (%) that will see one of your variants instead of the original page. At AppTweak, we recommend splitting traffic equally between the original page and the different variants to get the most accurate results.
- Estimating your test duration. This setting is indicative only: it lets you record when you expect your A/B test to deliver conclusive results, so you can later check whether your expectations were realistic. Reaching the end of the estimated duration won’t end the A/B test. (For a rough way to ballpark a duration, see the sketch after this list.)
- Selecting the assets (elements) you want to test. As previously explained, it’s better to focus on one element at a time to be able to better measure the impact of the test.
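As a rough aid for the duration estimate above, here is a back-of-the-envelope sketch using the standard two-proportion sample-size formula at 95% confidence and 80% power. The baseline conversion rate, target uplift, and daily traffic are all hypothetical inputs you would replace with your own numbers.

```python
from math import ceil, sqrt

def sample_size_per_variant(baseline: float, uplift: float,
                            z_alpha: float = 1.96,  # 95% confidence, two-sided
                            z_beta: float = 0.84) -> int:
    """Visitors needed per variant to detect a relative uplift in conversion rate."""
    p1 = baseline
    p2 = baseline * (1 + uplift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Hypothetical inputs: 3% baseline conversion, aiming to detect a 10% relative
# uplift, with 4,000 daily visitors split 50/50 between original and one variant.
n = sample_size_per_variant(baseline=0.03, uplift=0.10)
days = ceil(n / 2_000)        # 2,000 visitors per variant per day
days = ceil(days / 7) * 7     # round up to full weeks so weekday/weekend traffic balances out
print(f"~{n:,} visitors per variant, so plan for roughly {days} days")
```

With these assumptions the sketch suggests roughly 53,000 visitors per variant, or about four weeks of traffic. The takeaway: the smaller the uplift you want to detect, the longer your test needs to run.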
Depending on the A/B testing tool you use, you’ll have more or fewer options to set up your test. Here, we’ll focus on native A/B testing tools on both stores – store listing experiments for Google Play and product page optimization for the App Store.
Store listing experiments on Google Play
The Google Play Experiments engine is one of the best and most common A/B testing tools for ASO. Store listing experiments allow you to directly test different assets of your store listing with live store traffic. Google allows you to test most assets on your store listing – creatives (icon, promotional video, feature graphic, screenshots) – as well as your short and long descriptions. Only the title cannot be tested.
Up to 3 variants can be tested against the original version. You can run only one default store listing experiment per app at a time, plus up to five localized experiments if you’ve added localized graphic assets for specific languages. Store listing experiments can run indefinitely.
A/B testing on Google Play allows you to:
- Identify the most impactful elements on your app page.
- Learn what resonates (or not) with your target market, based on their language and locality.
- Potentially increase your app’s conversion rate thanks to the insights gained.
- Identify seasonality trends and takeaways that can be applied to your overall app page.
Source: Google Play Console
App Store product page optimization
Product page optimization (PPO) is a helpful tool for ASO practitioners to understand the impact of different page elements on iOS conversion rates. Apple rolled out PPO with iOS 15 and, as such, PPO variants are only shown to App Store users with iOS 15 or later.
Apple only allows you to test creative assets (icon, preview video, and screenshots) for up to 90 days with PPO. Up to 3 variants can be tested against the original version. You can only run one test per app at a time, but you can run localized tests for all the languages your app supports.
Learn how you can use product page optimization to optimize your App Store assets
Although store listing experiments and product page optimization are two of the most well-known options, third-party A/B testing tools also exist, including Geeklab, Storemaven, SplitMetrics, and Upptic. These often provide additional control and features for your tests.
Before starting your A/B test, think through what you want to test and why. For some, revamping screenshots will be the main priority. For others, whether to add an app preview video is the principal concern. Assess which elements of your brand or product matter most to your store traffic and how you can make your app stand out accordingly.
Mistakes to avoid when A/B testing for ASO
Avoid the following common mistakes to obtain conclusive results from your A/B tests.
Not running your A/B tests long enough
Many marketers end their A/B tests too early. Closing a test prematurely means less data is collected, which can lead to erroneous conclusions. Furthermore, store visitors behave differently on weekdays than on weekends, which significantly changes the data collected.
To avoid such problems, run your A/B test for long enough: 7 days at a minimum, so that both weekdays and weekends are included. If possible, run it for longer. At AppTweak, we recommend running tests in multiples of 7 days.
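If it helps to make this rule of thumb operational, here is a minimal guard you could apply before closing a test. It assumes you precomputed a required sample size per variant (e.g. with the earlier sketch); the dates and counts are hypothetical.

```python
from datetime import date

def safe_to_stop(start: date, today: date, visitors_per_variant: int,
                 required_per_variant: int) -> bool:
    """Stop only after full weeks have elapsed AND the planned sample is reached."""
    days = (today - start).days
    return days >= 7 and days % 7 == 0 and visitors_per_variant >= required_per_variant

# Hypothetical test: started March 1, checked on March 29 (exactly 4 weeks in),
# with 55,000 visitors per variant against a planned 53,000.
print(safe_to_stop(date(2024, 3, 1), date(2024, 3, 29), 55_000, 53_000))  # True
```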
Expecting identical results for the same test on both stores
App pages look different on Google Play than on the App Store, and store traffic behaves differently on each store, too. It’s therefore a mistake to assume that findings from one store apply to the other without considering differences in UI or traffic. To get reliable results, test your assets on both the App Store and Google Play unless you have a valid reason to believe the results carry over.
Not being aware of other marketing efforts & their impact
Some of your marketing actions – for example, paid campaigns – can change the behavior of your app store visitors. These visitors might find certain assets more (or less) appealing than your organic traffic, thus distorting your test results. When interpreting your results, take into account any campaigns that ran during the test period.
Spy on your competitors’ A/B tests with AppTweak
At AppTweak, we’ve developed a feature that allows you to spy on A/B tests performed by your competitors (available for Google Play). This can provide valuable information, such as how often your competitors run A/B tests and for how long, the elements of their app page they test the most often, and what changes they introduce in their tests.
Find out which metadata elements your competitors are A/B testing with AppTweak’s ASO Timeline
On AppTweak, you can track the A/B tests made by your competitors and whether the variant they tested was implemented or not.
Conclusion
A/B testing is a powerful marketing tool that can help increase your app’s conversion rate. However, A/B tests require good preparation, including a well-thought-out hypothesis. If properly designed, an A/B test can provide valuable information about the behavior of your store traffic, no matter the outcome. This knowledge can then help you better direct your marketing efforts.
Start using AppTweak today to spy on your competitors’ A/B tests.