Conversion Optimization: Eight considerations for A/B testing on mobile

By Daniel Burstein

I’m writing this article on a laptop computer at my desk. And in your marketing department or agency, you likely do most of your work on a computer as well.

This can cause a serious disconnect with your customers as you design A/B tests.

Because more than half (52.4% according to Statista) of global internet traffic comes from a mobile device.

So, I interviewed Rebecca Strally, Director of Optimization and Design, and Todd Barrow, Director of Application Development, for tips on what you should consider for mobile devices when you’re planning and rolling out your tests. Rebecca and Todd are my colleagues here at MECLABS Institute (parent research organization of MarketingExperiments).

Consideration #1: Amount of mobile traffic and conversions

Just because half of global traffic is from mobile devices doesn’t mean half of your site’s traffic is from mobile devices. It could be considerably less. Or more.

Not to mention, traffic is far from the only consideration. “You might get only 30% of traffic from mobile but 60% of conversions, for example. Don’t just look at traffic. Understand the true impact of mobile on your KPIs,” Rebecca said.
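The exact report depends on your analytics platform, but the arithmetic behind Rebecca’s point is simple. Here is a minimal sketch in TypeScript, assuming you can export sessions and conversions per device segment; the field names and numbers are illustrative, not from MECLABS research.

```typescript
// Hypothetical per-device totals exported from an analytics tool.
// The numbers are illustrative only.
interface DeviceSegment {
  device: "mobile" | "desktop" | "tablet";
  sessions: number;
  conversions: number;
}

const segments: DeviceSegment[] = [
  { device: "mobile", sessions: 30_000, conversions: 1_800 },
  { device: "desktop", sessions: 65_000, conversions: 1_100 },
  { device: "tablet", sessions: 5_000, conversions: 100 },
];

const totalSessions = segments.reduce((sum, s) => sum + s.sessions, 0);
const totalConversions = segments.reduce((sum, s) => sum + s.conversions, 0);

// Compare each device's share of traffic with its share of conversions.
for (const s of segments) {
  const trafficShare = (100 * s.sessions) / totalSessions;
  const conversionShare = (100 * s.conversions) / totalConversions;
  console.log(
    `${s.device}: ${trafficShare.toFixed(1)}% of traffic, ${conversionShare.toFixed(1)}% of conversions`
  );
}
```

In this made-up data, mobile is 30% of traffic but 60% of conversions, which is exactly the kind of gap Rebecca is warning about.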

Consideration #2: Mobile first when designing responsive tests

Even if mobile is a minority of your traffic and/or conversions, Rebecca recommends you think mobile first. For two reasons.

First, many companies measure KPIs (key performance indicators) in the aggregate, so underperformance on mobile could torpedo your whole test if you’re not careful. Not because the hypothesis didn’t work, but because you didn’t translate it well for mobile.

Second, it’s easier to go from simpler to more complex with your treatments. And mobile’s smaller form factor necessitates simplicity.

“Desktop is wide and shallow. Mobile is tall and thin. For some treatments, that can really affect how value is communicated,” she said.

Rebecca gave an example of a test for a travel website that was planned desktop-first. There were three boxes with value claims and a wizard below them. On desktop, visitors could quickly see and use the wizard, and the boxes offered supporting value.

But on mobile, the responsive design stacked the boxes, pushing the wizard far down the page. “We had to go back to the drawing board. We didn’t have to change the hypothesis, but we had to change how it was executed on mobile,” Rebecca said.
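How you fix that kind of stacking problem depends on your site, but client-side testing tools commonly reorder elements per breakpoint. Here is a minimal sketch of the idea in TypeScript, assuming hypothetical element IDs (#wizard and #value-boxes) that share one container; in practice this is often handled in the responsive CSS itself.

```typescript
// Minimal sketch: on narrow viewports, move the booking wizard above the
// stacked value boxes so it isn't pushed far down the page.
// The element IDs are hypothetical and assumed to share one parent container.
const mobileQuery = window.matchMedia("(max-width: 767px)");

function placeWizard(isMobile: boolean): void {
  const wizard = document.getElementById("wizard");
  const valueBoxes = document.getElementById("value-boxes");
  const container = valueBoxes?.parentElement;
  if (!wizard || !valueBoxes || !container) return;

  if (isMobile) {
    // Mobile: wizard first, value claims below as supporting content.
    container.insertBefore(wizard, valueBoxes);
  } else {
    // Desktop: restore the original order, value boxes above the wizard.
    container.insertBefore(valueBoxes, wizard);
  }
}

placeWizard(mobileQuery.matches);
mobileQuery.addEventListener("change", (e) => placeWizard(e.matches));
```

The hypothesis stays the same on both form factors; only the execution order changes per breakpoint.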

Consideration #3: Unique impacts of mobile on what you’re testing

A smartphone isn’t just a smaller computer. It’s an entirely different device with different functionality. So, it’s important to consider how that functionality might affect conversions and to keep it in mind when designing tests that customers will experience on both desktop and mobile.

Some examples include:

  • With the prevalence of digital wallets like Apple Pay and Google Pay, forms and credit card info are more likely to prefill. This could reduce friction in a mobile experience and make the checkout …
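One way to account for that in a treatment is simple feature detection before deciding which checkout experience a visitor sees. The sketch below only checks whether the browser exposes the Payment Request API; the element IDs and the idea of an “express checkout” variant are hypothetical, not from the article.

```typescript
// Minimal sketch: show a shorter checkout path when the browser exposes
// the Payment Request API (often backed by a digital wallet on mobile).
// The element IDs are hypothetical.
function chooseCheckoutPath(): void {
  const expressButton = document.getElementById("express-checkout");
  const cardForm = document.getElementById("card-form");
  if (!expressButton || !cardForm) return;

  const hasPaymentRequest = "PaymentRequest" in window;

  expressButton.hidden = !hasPaymentRequest;
  cardForm.hidden = hasPaymentRequest;
}

chooseCheckoutPath();
```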

