How to do Mobile App A/B Testing for Your App Store Listing
Have you ever heard the saying "App Store Optimization is an ongoing process"? It’s true. After optimizing your app store listing and getting your app into the top-ranking positions, your work isn’t done. And one of your best friends in all the ASO work you do is mobile app A/B testing.
What is Mobile App A/B Testing?
A/B testing is the process of creating a variation of an asset in your app store listing, splitting your audience into two groups, and seeing which version brings you more users, visitors, or whichever KPI you’re tracking. Bear in mind that mobile app A/B testing means experimenting with two variations, while split testing means testing more than two.
Mobile A/B testing is a simple concept that involves a lot of work, but trust me, it’s worth it.
Let’s start from the beginning. After creating your first app store listing, you need to keep improving and optimizing it. It’s difficult to predict whether changing your app icon or adding a video or more screenshots will increase or decrease your downloads. Mobile app A/B testing lets you find out without risking the progress you’ve made so far.
When you run a mobile A/B test, you are running an experiment. You keep your current version of the app store listing and create another variation to test one specific aspect (we will get to the aspects later). Then, your existing product page is shown to one group of people and the new variation to another. Remember that both groups have to come from the same target audience; otherwise, the results won’t be accurate. After running the experiment for a certain period of time, you will get a clear answer about the changes: do they perform better, or should you stick to what you have and try another idea?
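If you prefer to see the mechanics in code, here is a minimal sketch of the idea in Python. It is purely illustrative: the store’s own experiment tooling handles the split for you, and the function and variable names below are invented for this example.

```python
import random

# Illustrative sketch of an A/B test: each visitor is randomly assigned
# to one of two listing variations, and installs are counted per variation.
results = {
    "current_listing": {"visitors": 0, "installs": 0},
    "new_variation": {"visitors": 0, "installs": 0},
}

def assign_variation() -> str:
    """Split incoming visitors 50/50 between the two listings."""
    return "current_listing" if random.random() < 0.5 else "new_variation"

def record_visit(variation: str, installed: bool) -> None:
    """Count the product page visit and whether it converted to an install."""
    results[variation]["visitors"] += 1
    if installed:
        results[variation]["installs"] += 1

def conversion_rate(variation: str) -> float:
    """Installs divided by visitors for one variation."""
    data = results[variation]
    return data["installs"] / data["visitors"] if data["visitors"] else 0.0
```

At the end of the experiment, you would compare conversion_rate("current_listing") with conversion_rate("new_variation") and check how confident you can be in the difference – more on that in Part III below.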
In a nutshell, that’s how mobile A/B testing works. Now, let’s understand more about the different phases of your experiment and the different types of mobile app A/B testing.
Mobile A/B Testing: Step-by-Step
A mobile app A/B testing experiment has several phases that you must go through, so take your time with each one of them. Let’s take a look at them and find out how they work.
Part I: Research
This is the part where you find out what is missing from your app store listing or what could be improved. If you have spent a long time building a great product page, it will be hard to spot what can still be improved on your own; that’s why the research phase is important.
So, where should you look for inspiration?
- Keep an eye on your competitors. Spying on your competitors is a helpful source of inspiration. Look at your most successful competitors: what is great about their store listings? What would you like to implement in your own? Try to understand how often they make changes to their product pages and what kind of changes they make. With App Radar, you can track your competitors’ rankings and follow their app store listing updates with the App Timeline. You can create a free account here and try this feature in our 7-day free trial.
- Have an overview of your category. Try to broaden your research and check the apps in your category. By doing this, you will probably find ideas that make your app stand out from the competition while still following your category’s standards.
- Think outside the box. Don’t focus only on the app stores. Remember to research other types of media that your target audience consumes. This is a great way to find inspiration that speaks directly to your potential users. For example, the Space Sheep Games team was looking for new screenshots for their game and got inspired by the MTV reality show "Ex on The Beach." When they used it as inspiration for new screenshots, the experiment showed that they could increase conversions by 45% to 73%.
Tip! In this phase, you may also use mobile A/B testing to figure out which ad channels bring you the best traffic. You can create campaigns on different networks with the same target audience. This way, you can identify the ad channels that bring you the best-converting visitors and spend your ad budget wisely.
Part II: Hypothesis
Now it’s time to create your hypothesis, which will be the basis of your experiment. Answer these two questions:
- What do you want to change in your store listing?
- What do you want to achieve with these changes?
If we use the Space Sheep Games example, the hypothesis could be: if we redesign our screenshots with "Ex on the Beach"-inspired visuals, more product page visitors will convert into installs.
When talking about creatives, you can focus on two types of experiments:
- App Store Listing. The main purpose of this experiment is to convert your product page visitors. It helps you get your store listing converting well, which is excellent for your app store advertising – your ads will send potential users to a high-converting product page.
- Search results. The main purpose of this experiment is to stand out from the competition, meaning turning impressions in the search results into product page visits.
Remember! 1) Don’t test more than one hypothesis at a time; 2) test variations that differ significantly; 3) changes that are too small may not give you a precise result; 4) it isn’t only about adding or editing elements – sometimes it’s also about removing them!
Part III: Traffic
When running a mobile app A/B test or a split test, you need enough traffic for a reliable result. This means every variation needs to receive considerable traffic before you can judge which one is the most successful.
When you get enough traffic, you reach a realistic "Confidence Level." The Confidence Level is calculated by the A/B testing tool and shows you, based on the traffic sample, how trustworthy the result of the experiment is. For example, if variation A showed a 25% to 45% increase in conversions with a confidence level of 95%, you can expect that result to hold 95 times out of 100.
If your experiment results show only a slight difference between the variations, you need more traffic before you can make a decision.
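To make the relationship between traffic volume and the Confidence Level more concrete, here is a rough sketch of the kind of calculation an A/B testing tool performs behind the scenes. It uses a simple two-proportion z-test; real tools may use more sophisticated statistics, and the numbers below are made up for illustration.

```python
from statistics import NormalDist

def confidence_level(visitors_a: int, installs_a: int,
                     visitors_b: int, installs_b: int) -> float:
    """Approximate confidence (0-1) that the difference in conversion rate
    between two variations is not just random noise (two-proportion z-test)."""
    p_a = installs_a / visitors_a
    p_b = installs_b / visitors_b
    p_pooled = (installs_a + installs_b) / (visitors_a + visitors_b)
    std_err = (p_pooled * (1 - p_pooled)
               * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    if std_err == 0:
        return 0.0
    z = abs(p_a - p_b) / std_err
    return 2 * NormalDist().cdf(z) - 1  # two-sided confidence

# The same 20% vs. 23% conversion rates are far less conclusive on a
# small sample than on a large one (illustrative numbers):
print(confidence_level(500, 100, 500, 115))      # roughly 0.75
print(confidence_level(5000, 1000, 5000, 1150))  # roughly 0.9997
```

This is exactly why a small visible difference needs more traffic: the same gap between variations becomes far more convincing as the sample grows.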
Part IV: Creating your Variation
Now it’s time to create the variations based on your hypothesis. When creating your variations, always bear in mind that you need to stand out from your competitors. Going back to the Space Sheep example: they would now design their screenshots using "Ex on the Beach"-inspired elements.
If you don’t have in-house designers and your focus is on screenshots, you can find a list of tools for designing app screenshots here.
Part V: Running the Experiments
One rule of mobile A/B testing must be respected: never end your experiment before 7 days. This is important because your experiment should cover both weekdays and the weekend. Ideally, mobile A/B tests run for 7 to 14 days.
Tip! Even if you think you already have enough data to make a decision, leave the experiment running for at least 7 days.
Part VI: Analysis and Implementation
Now it’s time to collect data and make a decision. In addition to the Confidence Level, you need to focus on some KPIs to evaluate the results.
- Traffic Volume per Variation: if the variations don’t receive a similar number of visitors, or the numbers are tiny, your result won’t be accurate enough.
- Conversion Number Per Variation: how many visitors actually tapped on the download button for each variation?
- Time to Click to Install: for each variation, how long did it take for the visitor to tap on the install button?
- Engagement Rate in the Store Listing: how many visitors scrolled through your screenshots, watched your video, read your description, and tapped on the install button?
- Improvement Metric: this metric shows you if the variation increased or decreased your conversion rate.
- Retention Rate: this metric reflects the quality of the traffic you attracted with the chosen variation. It’s important to observe whether you are attracting people who install and use your app or people who install and churn right after. Is the new variation bringing the same quality of traffic to your mobile app?
If the result shows that your new variation wins by less than 2%, think twice before updating your store listing. To see a real impact, the winning variation needs an advantage of at least 3%.
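As a small illustration of that decision rule, here is one way it could be encoded. The thresholds come from the guidance above; whether the "advantage" is measured as relative lift or percentage points depends on how your testing tool reports improvement – this sketch uses relative lift, and all numbers are invented.

```python
def should_adopt_variation(rate_current: float, rate_variation: float,
                           confidence: float,
                           min_confidence: float = 0.95,
                           min_lift: float = 0.03) -> bool:
    """Only adopt the new variation if the reported confidence level is high
    enough AND the relative improvement in conversion rate is at least 3%."""
    if confidence < min_confidence:
        return False
    lift = (rate_variation - rate_current) / rate_current
    return lift >= min_lift

# A 4.0% -> 4.1% result (2.5% relative lift) would not clear the bar,
# while 4.0% -> 4.2% (5% relative lift) would, given enough confidence.
print(should_adopt_variation(0.040, 0.041, 0.95))  # False
print(should_adopt_variation(0.040, 0.042, 0.95))  # True
```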
Remember! The result may show that your hypothesis was wrong, and that’s okay; that’s why you are experimenting. The point of an A/B test is to avoid mistakes that would bring your conversions down!
A/B Testing for Different Locales
Your testing results for Spain won’t be the same as for the United Kingdom. Consider running A/B tests per locale in order to optimize and increase conversions for each market.
iOS A/B Testing and Google Play A/B Testing
Likewise, don’t apply your mobile A/B testing results from the Apple App Store to the Google Play Store. User behavior differs between the stores, and so does the layout. To get accurate results, test each platform separately.
You can test store listing elements with Google Play’s Store Listing Experiments feature: in the Google Play Console, choose your app, open the left-side menu, click on Grow, and then on "Store Listing Experiments." It’s now also possible to run A/B tests in the Apple App Store, which is a brand new feature.
Now that you have a winner, it’s time to implement your results and track your performance. Just like with App Store Optimization, you won’t see the effects overnight. Wait one to two months to see the full improvement.
Part VII: Follow Up
As we mentioned at the beginning of this article, App Store Optimization is an ongoing process, and so is A/B testing. Start preparing for your next experiment! There are always things to test and improve because the environment constantly changes: the app stores update their rules, thousands of new apps are added daily, and some competitors fade away. You need to stay up to date to avoid losing market share, so make mobile app A/B testing part of your App Store Optimization strategy.
Which Elements to A/B Test?
As you can imagine, the most influential elements on an app store listing are the app icon, screenshots, and video. SplitMetrics states that only 2% of app store visitors read the full app description. Hence, if you want to work on text, we recommend focusing on your short description, which can actually impact conversion rates.
A/B Testing App Icon
This element helps most when you run A/B tests on search results. The app icon plays a big role in a potential user’s decision to visit your app store listing. It’s the first impression of your app, which is why it’s such an essential element to test.
Be sure to create an icon that stands out. You can read more about how to create a converting app icon here. To start your app icon experiment, research what your competitors are doing to understand best practices. For example, many mobile games have app icons featuring characters with angry faces or their mouths open. You can test the same approach.
Here are some things that you can A/B test in your app icon:
- A female or a male character;
- Background color;
- Logo or no logo;
- The character’s facial expression;
- Elements that represent your USP.
A/B Testing App Screenshots
Screenshots on the Apple App Store and Google Play Store give you a lot of room to experiment. You can use them to test either search results or your app store listing. Here are some examples of what you can test with screenshots:
- Orientation: Experiment with which works better for your app: landscape or portrait. SplitMetrics research showed that with landscape screenshots, 15% of visitors look at all 5 screenshots and 18% of downloads happen after checking out the fifth; with portrait screenshots, only 11% of visitors look at all 5 and only 13% of installs are made after checking out the fifth screenshot. You can find out whether the same holds for your app.
- Colors and background: Test various colors but always stay true to your branding!
- Caption: You can try a long caption, a keyword, or no caption at all. Research from Incipia showed that the kind of caption didn’t make a big difference, but having one is better than having none. Does that also hold for you? Do an A/B test to find out!
- Shuffle: Which screenshot should come first? A/B test to find out.
- Social Proof: Showing testimonials or awards your app has received on your screenshots is a nice way to test whether this brings more downloads.
A/B Testing App Video
App videos can improve conversions, but not for every app and not with just any type of video. That makes video a great mobile app A/B test idea: should you have one on your product page or not?
When testing a video, bear in mind that it shouldn’t repeat what is already in your screenshots, and it shouldn’t show all your features either. People won’t watch more than 20 seconds of your video, so use it to show your USP. You can A/B test by changing:
- Features – which feature should you show in your video?
- Colors
- Duration
- Having one video or more (only possible in the Apple App Store)
Then you can compare the number of app downloads that happen after the video is watched. Remember that in the Apple App Store, videos start playing automatically, so you shouldn’t fully trust this metric.
Is it only possible to test the app store product page?
The answer is no! Check out these two other types of testing:
Testing in Pre-Launch Phase
If your app isn’t live yet, there are tools that let you test your product page even before release. This means you can already test your product page against several hypotheses and see if it converts.
This is a great idea for:
- Finding your USP: If you don’t have a clear idea of which app feature is your unique selling point, you can use the pre-launch phase to figure that out. You can also run a short survey when people land on your store listing to understand which features interest them the most.
- Finding your target audience: See who shows interest in your app.
- Testing your app store listing: Is it converting?
- Lead generation: You can build a mailing list: people who land on your pre-launch store listing can leave their email address to get notified when your app is released.
- Localization: Find out where in the world there is the most interest in your app.
Testing In-App Experiments
In-app experiments are another important aspect of mobile A/B testing. They let you test how to provide the best app experience to users, so they come back and use your app frequently.
According to BuildFire, people use on average 10 apps per day, and you want to show them why yours should be one of them. For in-app experiments, it’s good to have clear goals. These could be:
- Boost user retention: learn whether the number of engaged users changes when you add push notifications or change the onboarding.
- Gather data from usage: learn how a person uses your app and what CTAs they tap on.
- Build new features: learn which features your users really want or the ones that you shouldn’t bother building.
And if you’re wondering what exactly you can test with in-app experiments, here are some ideas, followed by a small sketch of how users are bucketed into variants:
- Onboarding: What’s the best way to welcome new users?
- UX: Is the user’s behavior meeting your expectations?
- CTAs: Are the users tapping on your CTAs?
- Subscriptions/In-App Purchases: How can you present monetization in a way that gets users to pay?
- Feature Discovery: Where should you place features in order to make them more easily accessible?
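As promised above, here is a minimal sketch of the bucketing idea behind in-app experiments. It is only illustrative: in practice an A/B testing SDK (for example, Firebase A/B Testing) assigns variants for you, and the function and experiment names here are made up.

```python
import hashlib

def in_app_variant(user_id: str, experiment: str,
                   variants: tuple = ("A", "B")) -> str:
    """Deterministically bucket a user into a variant by hashing the user ID
    together with the experiment name, so the same user always sees the same
    version of, say, the onboarding flow."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same onboarding variant,
# even across app sessions.
print(in_app_variant("user-42", "onboarding_flow_v2"))  # "A" or "B", but stable
```

Keeping the assignment deterministic matters for in-app tests: a user who sees a different onboarding flow on every launch would muddy your retention and engagement numbers.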
Learnings about Mobile App A/B Testing
To have a successful App Store Optimization strategy, you should definitely include app A/B testing in your workflow. Here are your main takeaways on this topic:
- Mobile A/B testing shows you which type of variation converts best.
- Start your test by researching and learning about your app category and competitors. This way, you’ll have a clear understanding of best practices and where you can improve your store listing.
- There is no mobile app A/B testing without any hypothesis.
- The amount of traffic in your experiment is crucial to the reliability of your results.
- Don’t experiment with more than one aspect per test.
- Run your experiment for at least 7 days. Usually, a good mobile A/B test stays live for 7 to 14 days.
- Mind the Confidence Level to understand how accurate your results are, and don’t implement changes if your new variation wins with an advantage of less than 2%.
- Test as much as you can. There is always room for improvement.
- The key elements to test are: the app icon, screenshots, and app video.
- Consider testing before launching your app in order to validate your idea, and release your app with a well-performing product page from day one.
- Use in-app testing to boost retention and improve the user experience in your app.