How App A/B Testing Works

Posted in June 2017 by Alexandra Lamachenka


App developers agree that App Store Optimization (ASO) has changed dramatically over the past few years. It is no longer enough to fill the iTunes keyword field and use all five slots reserved for screenshots. Today, ASO means thoughtful analysis of every page element, app reputation management, and an ongoing cycle of testing and optimization.

App developers are following the new rules, and no wonder: both young startups and flourishing enterprises want to make the most of their marketing budgets by optimizing product page elements.

Boosting conversion without extra traffic expenses, in a world of ever-increasing user acquisition costs, sounds like a cherished dream, doesn't it? Smart ASO can help make it come true.

Keyword optimization is usually the first thing that comes to marketers' minds when ASO is mentioned. Unfortunately, they tend to neglect equally essential product page elements: screenshots, icon, description, name, video preview, and so on.

Yet the app store page is the arena of final decision-making: it is where users ultimately tap "Install". To maximize the converting power of every store page element, you need to master the art of mobile A/B testing.

Principles behind A/B testing

A/B testing, or split testing, is a method of comparing two or more hypotheses to identify which option performs better.

The basic idea is simple: you distribute an audience equally among two or more variations of a product page element (screenshots, title, etc.). Audience members behave naturally, unaware of the test. The variation with the best conversion rate wins.
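The key mechanical requirement, an equal and stable split of the audience, can be sketched in a few lines. The snippet below is a minimal illustration (names and the hashing approach are my own, not any particular platform's API): hashing each user's ID gives a roughly uniform 50/50 split and guarantees a returning user always sees the same variation.

```python
import hashlib

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    """Deterministically map a user to one variant.

    Hashing the ID yields a stable, roughly uniform split,
    so the same user always lands on the same variation.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always gets the same variant...
assert assign_variant("user-42") == assign_variant("user-42")

# ...and a large audience splits roughly in half.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"user-{i}")] += 1
print(counts)  # roughly 5000 / 5000
```

A deterministic hash (rather than a coin flip on every visit) matters because a user who sees variation A today and variation B tomorrow would contaminate both samples.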

The range of A/B testing objectives

The paramount goal of A/B testing is improving an app's conversion rate, but its benefits go far beyond conversion optimization. It yields valuable insights for strengthening marketing activity, helps evaluate audience segments, and can measure the efficiency of new traffic channels.

Marketers can also use pre-launch experiments to vet new ideas and identify those that strike a chord with their target audience. Publishers, in turn, can find out whether a change in product positioning is needed and whether the audience understands and likes the new message.

Planning is an integral part of any activity, and A/B testing is no exception.

Let's walk through each step of a basic A/B testing strategy using the example of Prisma (2016's App of the Year).

App A/B testing workflow

Prisma is an application that applies art-inspired filters to photos. The innovative technology behind the app drew media attention and fueled its viral growth.

When traffic arrives, whether 10 or 10,000 users per day, an app developer has to be sure that each and every element on the user's path to install works toward increasing conversions. Ultimately, an app's conversion rate influences not only the number of users but also the app's store rankings.

That became the primary reason for optimizing Prisma's app store page.

Any A/B test starts with a thorough analysis of the audience, traffic sources and market

To reveal room for optimization, Prisma analyzed its competitors and identified five best practices:

  1. Competitors used bolder fonts for captions.
  2. The most popular features were highlighted.
  3. Competitors used bullet lists.
  4. Catchy press reviews were added to descriptions.
  5. Screenshots featured users’ comments and likes.

Given that the majority of Prisma's users landed directly on the app page from media reviews and ads, the biggest challenge was optimizing this exact step.

Although every page element influences the conversion rate, screenshots are the most impactful, both when users find an app through search and when they decide whether to install it.

Hypothesis brainstorming follows analysis

No A/B test succeeds without a solid hypothesis behind it. An ASO checklist can help you establish goals and specify the type of experiment you want to run.

In Prisma's screenshot experiments, several assumptions were tested:

  • Whether caption text in a bolder font, placed at the top of the images, performs better.
  • Whether comments and likes make the images more dynamic.
  • Whether it’s better to use all five screenshots.

Once hypotheses are developed, it’s time to create variations

At this step you prepare the variation designs. Each layout should be based on the hypothesis under test.

Here is Prisma’s control…

App A/B Testing Variant A

… and optimized variations.

App A/B Testing Variant B

Only when the first three steps are completed do you run a split test

Two quick preparatory steps remain before running an app A/B experiment: define a preferred traffic source (Facebook ads, Google AdWords, banners, etc.) and pick a target audience. You will drive these users to your product page variations and analyze their behavior.

Mind that the audience must be distributed equally. Prisma used the SplitMetrics A/B testing platform, which automatically split potential users and landed them on the two variations.

For the traffic source, Facebook ads were chosen. First, the best-converting banner was identified by running a set of images against each other within a Facebook ads campaign. This test not only determined the best-performing banner for the upcoming split test; it also vetted ideas for the screenshot layout.

The coolest part is analyzing app A/B testing results

Results are considered trustworthy only once a statistically significant number of users has been reached.

This number depends on your conversion rate: the higher the conversion, the fewer users you need. Upon reaching the necessary confidence level, Prisma's redesigned screenshots showed a 12.3% conversion uplift.
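Whether a result has reached significance can be checked with a standard two-proportion z-test. The sketch below uses illustrative numbers, not Prisma's actual data, to compare the install rates of two variations:

```python
from math import sqrt, erf

def z_test_two_proportions(installs_a, visitors_a, installs_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a = installs_a / visitors_a
    p_b = installs_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (installs_a + installs_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Standard normal CDF expressed via the error function
    phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - phi(abs(z)))
    return p_b - p_a, z, p_value

# Illustrative numbers: variant A converts at 10%, variant B at 12.3%
uplift, z, p = z_test_two_proportions(100, 1000, 123, 1000)
print(f"uplift: {uplift:.1%}, z = {z:.2f}, p-value = {p:.3f}")
```

With these numbers the p-value stays above the conventional 0.05 threshold, which is exactly why the sample size matters: the lower your baseline conversion rate and the smaller the uplift, the more visitors each variation needs before the difference can be trusted.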

App A/B Testing Experiments Can Lift Conversion

The initial hypotheses were confirmed. You can implement the results at once, provided the winner is as apparent as in Prisma's case.

If you want to scale results, run follow-up experiments

A/B testing should become an ongoing process rather than a one-off exercise. The truth is that every app store is a continually changing environment, and even the most brilliant results may become dated in no time.

For example, Prisma continued testing, experimenting with descriptions and launching further screenshot experiments to prepare for an important update (the introduction of video filters).

As a result, the app saw an additional 19.7% increase in conversion rate.

Tools for efficient A/B testing

Product page A/B testing may seem a monumental challenge, as you have to keep dozens of details in mind to make the grade. Fortunately, the process can be automated: there are capable tools that make running experiments easier.

SplitMetrics is one such tool. It facilitates A/B testing of all page elements in both the App Store and Google Play, and the generated variation landing pages look like actual store pages. Moreover, you can run pre-launch experiments there, polishing a new app and securing the highest possible conversion right from the start.

Google Play Experiments is another decent option if you only need to test Android apps. Keep in mind, though, that the results of such experiments shouldn't be applied to the iOS App Store, as user behavior differs drastically between the two.

Facebook ads can also help examine the performance of screenshot and icon layouts: ad campaign statistics give insights into the converting power of your designs.

App A/B testing is a game changer

Some marketers underestimate the importance of A/B testing page elements, forgetting that their team's opinion on design is biased. Only actual users can confirm or dismiss the impact of any idea.

Plenty of case studies attest to the indispensability of split testing. Just for the record:

  • Rovio got 2.5M extra installs in just one week after the Angry Birds release thanks to screenshot optimization;
  • MSQRD’s viral success was reinforced with smart App Store Optimization: 16M installs within 3 months on a zero marketing budget;
  • Prisma earned the #1 spot in the App Store after its page elements were perfected via A/B testing; as a result, its product page attracted organic users.

Split testing doesn't just boost an app's performance on both paid and organic traffic; it also becomes a source of essential metrics and analytics that should underpin every audacious, high-flying ASO strategy.

Guest Author
Alexandra Lamachenka
Alexandra Lamachenka is the Head of Marketing at SplitMetrics, an app A/B testing tool trusted by Rovio, MSQRD, Prisma and Zeptolab. She shows mobile app development companies how to hack app store optimization by running A/B experiments and analyzing results.
