Rovio, MSQRD and Others Use A/B Testing For Pre-Launch App Marketing. Here is Why

The following is a guest-contributed post from Liza Knotko of SplitMetrics.

App Store Optimization (ASO) has undergone substantial transformations. Several years ago, choosing the right keywords and putting all five screenshots to work was enough for decent optimization. Today, that approach will get you nowhere.

To survive the fierce competition among mobile apps, developers have to stay on top of reputation management, thorough market analysis, and continuous testing. Thoughtful A/B testing comes into sharp focus in the context of pre-launch app marketing.

Nevertheless, marketers usually face an avalanche of tasks before a new app release and put split-testing and most ASO activities on the back burner. Many publishers fail to realize that intelligent A/B testing can facilitate most pre-launch tasks: from validating product ideas to securing a stellar conversion rate before the app is even live.

Indeed, there are dozens of ways split-testing can be put to good use before an app reaches the store. Here are the most important pre-release A/B testing use cases that can take your pre-launch game to a whole new level.

App pre-launch marketing starts with validating product ideas

A/B testing comes in handy when it comes to qualifying ideas for an app. Developers can track users’ reactions to various concepts, characters, features, messages, marketing banners, etc., and maximize their conversion rate (CVR).

This evaluation of ideas saves time and budget, giving publishers a chance to drop doomed undertakings or change the direction of their app’s marketing and development early on.

Perfecting an app’s store product page

Many publishers tend to underestimate the impact that product page elements such as screenshots, icons, and descriptions have on conversion.

It’s a funny thing: when it comes to keyword optimization during pre-launch marketing, developers hardly ever rely on instinct alone. Instead, they run thorough market research and audience analysis to identify the keywords that represent an app in the best possible way. For instance, they find out that ‘pizza delivery’ performs better than just ‘pizza’, or that ‘run tracker’ beats ‘running app’.

However, this diligence vanishes when the very same publishers get down to screenshots, icons, and other product page elements. Final designs and copy usually reflect the highly subjective opinion of the team promoting the app and are almost never backed by studies or specific numbers. This failure to convey the app’s core ideas through its product page drains thousands of organic installs.

A/B testing gives publishers a chance to identify which icon or set of screenshots makes their app page more attractive to the target audience. After all, users’ opinion is what really matters.

Marketers can either test the performance of screenshot and icon designs via Facebook ad banners or opt for platforms such as SplitMetrics, which emulate the app store and help maximize the converting power of the whole product page even during the pre-launch phase.
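To make that comparison concrete, here is a minimal sketch (in Python, with purely hypothetical numbers) of how two creatives could be compared once impression and install counts come in. A simple two-proportion z-test shows whether the difference in conversion is likely real or just noise; this illustrates the general idea only and does not describe any particular platform’s methodology.

```python
from statistics import NormalDist

def conversion_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided p-value
    return p_a, p_b, p_value

# Hypothetical figures: installs out of impressions for two icon variants
p_a, p_b, p_value = conversion_test(conv_a=320, n_a=10_000, conv_b=410, n_b=10_000)
print(f"Icon A: {p_a:.1%}, Icon B: {p_b:.1%}, p-value: {p_value:.4f}")
```

If the p-value is small (say, below 0.05), the better-converting variant is a safe pick; otherwise, keep the test running or call it a tie.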

For instance, Rovio took advantage of the latter method and experimented with screenshots, testing portrait against landscape versions. A/B testing showed that the vertical ones performed far better, which contradicted the game industry trend. The truth is that ‘Angry Birds’ players are not hardcore gamers and are not used to landscape screenshots in the store.

By preventing the costly mistake of shipping the originally planned landscape screenshots for its major release, ‘Angry Birds 2’ got 2.5 million additional installs in just a week after launch. Impressive, isn’t it? This case proves that product page elements are definitely worth testing.

Developing the right positioning

Apps usually have quite a few features, so it’s hard to know which one will resonate most with your target audience. A/B testing helps leave all doubts behind by letting you experiment with various messages highlighting different features.

The photo and video filter app MSQRD ran a series of A/B tests to identify the most appealing mask from the range the app planned to offer. The monkey mask won: New Year’s Eve 2016 was approaching, and 2016 is the year of the monkey according to the Chinese zodiac. The team placed this filter first in their screenshot set, ensuring it would be the first thing potential customers saw.

The right positioning from an app’s early days means not only loyal customers but also impressive performance on both paid and organic channels.

Choosing the most efficient ad channels

A/B testing is essential for determining the best-performing marketing channels. We’ve seen companies use split-tests to qualify different ad channels such as news placements, Facebook, cross-promotion, and various ad networks. By comparing conversion rates, publishers can pick the best one for promoting their upcoming apps.

Split-testing analytics also helps polish the marketing strategy, identifying not only which sources convert better but also which ones bring more loyal users. Thus, marketers can leave ad channel experiments behind and run high-powered ad campaigns by the time their app is live.
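As a rough illustration of that comparison (with hypothetical figures, not data from any of the companies mentioned), a short Python sketch that ranks channels by conversion rate and uses a retention figure as a loyalty proxy:

```python
# Hypothetical per-channel results from pre-launch ad experiments:
# impressions, installs (e.g. pre-orders or landing-page sign-ups),
# and day-7 retention among those installs as a loyalty proxy.
channels = {
    "facebook":     {"impressions": 50_000, "installs": 1_450, "retained_d7": 510},
    "cross_promo":  {"impressions": 20_000, "installs": 820,   "retained_d7": 390},
    "ad_network_x": {"impressions": 80_000, "installs": 1_600, "retained_d7": 420},
}

for name, c in channels.items():
    cvr = c["installs"] / c["impressions"]
    retention = c["retained_d7"] / c["installs"]
    print(f"{name:12s}  CVR: {cvr:6.2%}  D7 retention: {retention:6.2%}")

# Rank by conversion first, with retention as a quality tiebreaker
best = max(channels, key=lambda n: (channels[n]["installs"] / channels[n]["impressions"],
                                    channels[n]["retained_d7"] / channels[n]["installs"]))
print("Best channel by CVR:", best)
```

A channel with a slightly lower conversion rate but much better retention may still be the smarter buy, which is why both numbers are worth tracking before launch.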

Identifying your ideal audience

When it comes to determining the target audience, pre-launch A/B testing of a store product page is always a good idea. Experiments across different age groups, genders, locations, etc. help find out which types of users are most likely to download the app. This data can and should be used in further marketing campaigns once your app is available in the store.
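A sketch of how such segment data might be summarized, assuming you have impressions and installs per targeted ad set; the segments and figures below are made up for illustration:

```python
import pandas as pd

# Hypothetical results from showing the same pre-launch product page
# to several audience segments via targeted ad sets.
df = pd.DataFrame([
    {"age_group": "18-24", "country": "US", "impressions": 12_000, "installs": 540},
    {"age_group": "25-34", "country": "US", "impressions": 15_000, "installs": 480},
    {"age_group": "18-24", "country": "DE", "impressions": 9_000,  "installs": 410},
    {"age_group": "25-34", "country": "DE", "impressions": 11_000, "installs": 300},
])

segments = (
    df.groupby(["age_group", "country"], as_index=False)[["impressions", "installs"]]
      .sum()
      .assign(cvr=lambda d: d["installs"] / d["impressions"])
      .sort_values("cvr", ascending=False)
)
print(segments)  # the top rows are the segments worth prioritizing at launch
```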

Such pre-launch A/B testing can also help collect the email addresses of potential users and build a primary base of early adopters. These contacts can be used for surveys, and these people should definitely be the first to know about the release of your new app.

Analyzing user behaviour

Using the scroll and heat maps offered by app A/B testing platforms, publishers gain valuable insights into how users interact with the app’s product page, clarifying what can be changed for the sake of better performance.

Moreover, such pre-launch A/B testing analytics helps uncover product page bottlenecks. Developers can spot potential problems with their app and identify points of growth in a short space of time.

For instance, Darby (an app with DIY videos) discovered that its icon misled customers: a “play” sign in the icon made the audience believe Darby was a video editing app. A/B tests helped nip this problem in the bud, and the icon was completely redesigned.

Pricing policy

This point is extremely important for aspiring paid apps. It’s vital to identify a price that won’t scare off potential users yet will earn more in the long run. A/B testing different prices before the app’s launch is one of the best ways to do so. It may even prompt a change of policy: abandoning the paid model in favour of a free one with in-app purchases.
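Since conversion normally drops as the price goes up, the number worth comparing across price variants is expected revenue per store visitor (price multiplied by conversion rate). A small illustrative sketch with hypothetical figures:

```python
# Hypothetical price test: each variant shows the same product page
# at a different price point; conversion falls as the price rises,
# so compare expected revenue per store visitor.
variants = [
    {"price": 0.99, "visitors": 8_000, "purchases": 400},   # 5.0% CVR
    {"price": 2.99, "visitors": 8_000, "purchases": 216},   # 2.7% CVR
    {"price": 4.99, "visitors": 8_000, "purchases": 96},    # 1.2% CVR
]

for v in variants:
    cvr = v["purchases"] / v["visitors"]
    revenue_per_visitor = v["price"] * cvr
    print(f"${v['price']:.2f}: CVR {cvr:.1%}, revenue per visitor ${revenue_per_visitor:.3f}")

best = max(variants, key=lambda v: v["price"] * v["purchases"] / v["visitors"])
print(f"Best price by revenue per visitor: ${best['price']:.2f}")
```

If even the lowest price point converts poorly, that is a signal to consider the free model with in-app purchases mentioned above.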

A/B testing should become a faithful companion of every publisher during the app’s pre-launch phase.

By applying it, app marketers can prepare an app’s product page for release in the best possible manner, maximizing its converting power. Split-testing not only gives trustworthy results, it also makes decision-making more transparent and effective, sparing teams the pain of arguments and conflicts.