Your Full Guide to Mobile Website Conversion Optimization

PUBLISHED

24 July, 2019

Disha Sharma

Most optimizers believe that things that work on desktop work on mobile too.

But at Convert.com, where we power thousands of mobile website A/B tests, we know this doesn’t hold for most websites. In fact, we regularly see checkout pages with double-digit conversion rates on desktop that hardly convert on mobile. We also see many successful lead-generating desktop pages reporting alarming dropoff rates on mobile devices.

Yet a majority of optimizers don’t challenge the status quo when it comes to optimizing for conversions separately for desktop and mobile, and, as a result, they miss out on a lot of mobile conversions.

If you analyze your Google Analytics data and compare the behavior of your desktop users with your mobile ones, you’ll realize this too. Just see how your desktop conversions stack up against your mobile conversions for the same pages.

If your data tells you anything, it’s this: optimizing a mobile website for conversions — in most cases — needs a mobile-first conversion optimization strategy. And that’s what we’re sharing with you in this quick guide to mobile website conversion optimization.

Let’s dive in!

Step #1: Conducting Research

As with conversion optimization in general, mobile website conversion optimization begins with research.

So in your first step toward optimizing your mobile website for conversions, you need to perform some in-depth research and gather data about your mobile users. This data will help you spot the optimization opportunities you can tap into to get more mobile conversions.

And because good data has both qualitative and quantitative inputs, you need both.

Quantitative data — or simply numeric data — pinpoints the weak patches in your mobile website’s conversion funnel.

Your best sources for such quantitative data are analytics solutions like Google Analytics. These solutions give you metrics that you can use to form insights for your hypothesis (which we’ll get to in the next step!). But for now, keep these few telling metrics in mind:

  • Dropoff rates on the important pages of your mobile website

  • Conversion rates on landing pages of your mobile traffic segment

  • User actions like clicks (reported via features like events) across your mobile website

You can also use such tools to compare your desktop conversion rate with your mobile conversion rate.
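If you export such a device-segmented report from your analytics tool, a few lines of code can surface the pages where mobile lags desktop the most. Below is a minimal sketch in Python with pandas; the sessions.csv file and its column names (page, device_category, sessions, conversions) are hypothetical, so map them to whatever your export actually contains.

```python
# A minimal sketch: compare desktop vs. mobile conversion rates per page.
# Assumes a hypothetical analytics export "sessions.csv" with the columns
# page, device_category, sessions, conversions - rename to match your report.
import pandas as pd

df = pd.read_csv("sessions.csv")

# Aggregate sessions and conversions per page and device category.
summary = (
    df.groupby(["page", "device_category"], as_index=False)[["sessions", "conversions"]]
      .sum()
)
summary["conversion_rate"] = summary["conversions"] / summary["sessions"]

# Pivot so each page shows desktop vs. mobile side by side.
rates = summary.pivot(index="page", columns="device_category", values="conversion_rate")
rates["mobile_gap"] = rates["desktop"] - rates["mobile"]

# Pages where mobile trails desktop the most are your first optimization candidates.
print(rates.sort_values("mobile_gap", ascending=False).head(5))
```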

After some thorough quantitative data-mining, you should be able to spot three to five pages or areas of your mobile website that could be optimized for more conversions.

Once you’ve identified them, it’s time to collect some supportive qualitative data.

Qualitative data — data drawn from your users’ actual behavior and feedback — shines a spotlight on what’s going on behind the quantitative data.

To gather qualitative data, you need to engage your users and interview them about using your mobile website. Experts note that interviewing just three to five users can uncover most of the glaring issues your users experience. In addition to interviewing your users, you can also try user testing solutions like UserTesting that connect you with test users who match your target demographics and give you feedback on your mobile website experience.

You can also use surveys and feedback forms (on-page, email, and exit) to ask your users open-ended questions and collect qualitative data from them. Polls are helpful for such learnings too.

To choose the right tool for your qualitative learnings, you can refer to CXL’s quick comparison of qualitative research tools.

And finally, you have tools like heatmaps, scrollmaps, session recordings, and more that let you “see” how your mobile traffic engages with your website. Solutions like Hotjar show you exactly where your users click, get stuck, scroll impatiently, and so on… on your mobile website. If you’re looking to optimize your mobile app experience, you have dedicated mobile app analytics solutions like UXCam that capture every such user interaction in your app.

These qualitative data sources, tools, and solutions help you identify your users’ top friction points that can be addressed in your experiments to boost your mobile website’s conversions.

And unlike what you might think, you don’t need tens of research and analytics tools to gather such data. In fact, at Convert.com, most of our customers run tens of high-impact A/B tests using insights from just free solutions like Google Analytics and UXCam. You, too, can use a lean stack and still get high returns from your experiments. It’s okay to start nimble.

Step #2: Forming a Hypothesis

Once you’ve spotted the top page(s) to optimize for conversions for your mobile website, it’s time to write a hypothesis.

A good hypothesis is always data-backed and uses a set of data points to recommend a change that will improve conversions. So, for instance, if a B2B website’s mobile lead generation page shows a poor conversion rate and a high dropoff in Google Analytics, and also gets bad reviews in mobile user experience surveys, then this data would make a good basis for a hypothesis.

Building on such data, you could suggest running an A/B test that pits a completely revamped page against the original one, with the expectation of more conversions.

Top conversion optimizer Craig Sullivan offers an excellent template for writing a data-backed hypothesis. He recommends writing a 3-point hypothesis that 1) explains the data, 2) lists the possible impact, and 3) specifies the primary metric that will decide whether the change is successful and makes the expected impact.


So, for instance, a filled-out hypothesis for a mobile website conversion optimization experiment for an online store’s checkout page could look like:

1. Because we saw multiple mobile user sessions showing unusual levels of scrolling, only sometimes leading to a tap on the purchase button

2. We expect that simplifying the cart page layout and making the buy button more prominent will cause more people to tap the purchase button and proceed in their checkout journey

3. We’ll measure this using the completed purchases/sales metric.
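If your team runs many experiments, it can also help to capture the three parts of each hypothesis in a structured record that travels with the experiment. Here’s a small illustrative sketch in Python; the Hypothesis class and its field names are our own invention, not part of Sullivan’s template.

```python
# An illustrative sketch: the 3-point hypothesis captured as a structured record
# so it can be stored alongside the experiment. The class and field names are
# hypothetical - adapt them to however your team documents experiments.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    observation: str      # 1) what the data showed
    expected_change: str  # 2) the change and its expected impact
    primary_metric: str   # 3) the metric that decides success

cart_page_hypothesis = Hypothesis(
    observation=(
        "Mobile session recordings show unusual levels of scrolling on the cart "
        "page, only sometimes ending in a tap on the purchase button."
    ),
    expected_change=(
        "Simplifying the cart page layout and making the buy button more prominent "
        "will lead more people to tap the purchase button and continue checkout."
    ),
    primary_metric="completed purchases (sales)",
)
```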

Sullivan also offers a more detailed version of the hypothesis (the “Advanced Kit”) that digs deeper into his simpler 3-point template. In this version, he focuses on using balanced research data that includes both qualitative and quantitative inputs. He stresses diving deeper into the users’ mindset and identifying the target segments expected to be impacted by the experiment. Finally, in his advanced template, he also recommends estimating the duration (x business cycles) over which the data metrics are expected to change.

Depending on where you are in your conversion optimization journey, you can pick either of the two versions. For those just starting out with mobile-first experiments, the first version would be more suitable.

You could also try this cool hypothesis generator from the Conversionista! team.

As long as you base your hypothesis on data, suggest changes that address the issues your data points to, and choose a meaningful metric that improves your business’s bottom line, you should be on your way to a good optimization experiment.

Step #3: Creating a Treatment

Once your hypothesis is ready, it’s time to create your treatment — or the version(s) you’ll be testing against the original one.

With an A/B testing tool like Convert Experiences from Convert.com, you can easily create different mobile versions to test based on your hypothesis. Our easy-to-use visual builder doesn’t just help you create (design, develop, and deploy) your challengers but also shows you exactly how they’ll look on the mobile and tablet devices you’re targeting.

Besides Convert Experiences, you have many options like VWO, Optimizely, and AB Tasty. But these tools come with (annual) five-figure contracts and don’t allow monthly billing, so they’re mostly suitable for enterprise-level businesses. That said, you also have a few good free options like Google Optimize that let you do such testing.

When creating your treatment, most optimization tools will ask you to set your experiment’s goals. The main goal or metric of your experiment will be the one you identified in your hypothesis — the one that matters to your business. But in addition to that, you can set additional metrics as well.

At Convert.com, our customers, on average, set about four goals for every experiment.

So, for instance, if you’re optimizing a product page on your online store for a better mobile conversion rate, you’d set your main metric or goal to be the number of sales.

But in addition to tracking completed orders, you can also track whether there’s any change to your add-to-cart metric, or to your add-to-wishlist metric. Quite often, tracking multiple metrics like this helps you understand how your experiment influenced your buyers’ behavior across their full buying journey.
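However your testing tool reports them, these secondary goals ultimately boil down to simple per-variant rates computed from your event data. Here’s an illustrative sketch in Python with pandas; the experiment_events.csv log and its column names (variant, user_id, event) are hypothetical, so adapt them to your own tracking setup.

```python
# An illustrative sketch: looking at several goals for the same experiment,
# not just the primary sales metric. The event log and its column names
# (variant, user_id, event) are hypothetical - map them to your own setup.
import pandas as pd

events = pd.read_csv("experiment_events.csv")  # one row per user event

goals = ["purchase", "add_to_cart", "add_to_wishlist"]
visitors = events.groupby("variant")["user_id"].nunique()

for goal in goals:
    # Unique users per variant who triggered this goal at least once.
    converted = (
        events[events["event"] == goal]
        .groupby("variant")["user_id"]
        .nunique()
    )
    rates = (converted / visitors).fillna(0)
    print(f"{goal}: " + ", ".join(f"{v}={r:.2%}" for v, r in rates.items()))
```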

Step #4: Running the Experiment

Once you’ve created your treatment, you need to decide on your sample size and test duration. There are many CRO tools that let you estimate these. You can use our free sample size calculator and experiment duration calculator too.
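If you’d like to sanity-check those calculators, the standard two-proportion sample size estimate is simple to compute yourself. Below is a minimal sketch in Python; the baseline conversion rate, the minimum uplift you want to detect, and the daily traffic figure are all hypothetical inputs.

```python
# A minimal sketch of the standard sample size estimate for comparing two
# conversion rates (two-sided test). All inputs below are hypothetical -
# plug in your own baseline rate, minimum detectable effect, and traffic.
from scipy.stats import norm

def sample_size_per_variant(baseline_rate, minimum_uplift, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect a relative uplift over the baseline."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + minimum_uplift)
    z_alpha = norm.ppf(1 - alpha / 2)   # ~1.96 for 95% confidence
    z_beta = norm.ppf(power)            # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2

n = sample_size_per_variant(baseline_rate=0.03, minimum_uplift=0.20)
daily_mobile_visitors_per_variant = 400  # hypothetical traffic split
print(f"~{n:.0f} visitors per variant, roughly "
      f"{n / daily_mobile_visitors_per_variant:.0f} days at "
      f"{daily_mobile_visitors_per_variant} mobile visitors per variant per day")
```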

One thing worth noting about experiment duration: while most optimizers try to run experiments until they reach (at least) 95% statistical significance, most experiments (around 80%) never get there. Usually they are stopped when optimizers see “clear winners”; the rest simply never reach statistical significance.

So while you might calculate that your test should run for two weeks, you might very well end up stopping it much sooner.

Step #5: Analyzing Results

Once an experiment ends — which, 80% of the time, happens when optimizers see a clear winner — it’s time to understand its impact on the primary metric.

This primary metric (e.g., revenue for online stores or form submissions for business websites) helps you unmistakably identify the winner among the different versions tested.
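Your testing tool reports significance for you, but if you want to double-check a result against the raw numbers, a simple test of the two conversion rates is enough. Here’s a hedged sketch in Python using SciPy’s chi-square test; the visitor and conversion counts below are made-up examples.

```python
# A quick sanity check on a finished test: a chi-square test on the raw
# conversion counts for control vs. challenger. The numbers below are
# made-up examples - substitute your own experiment's totals.
from scipy.stats import chi2_contingency

control_visitors, control_conversions = 14_000, 420        # 3.0% conversion rate
challenger_visitors, challenger_conversions = 14_000, 505  # ~3.6% conversion rate

table = [
    [control_conversions, control_visitors - control_conversions],
    [challenger_conversions, challenger_visitors - challenger_conversions],
]
chi2, p_value, _, _ = chi2_contingency(table)

print(f"Control: {control_conversions / control_visitors:.2%}, "
      f"Challenger: {challenger_conversions / challenger_visitors:.2%}, "
      f"p-value: {p_value:.3f}")
# A p-value below 0.05 corresponds to the ~95% significance level mentioned above.
```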

But while this reporting is enough to spot a winner, you can do more.

You could revisit every data point that contributed to your hypothesis and gauge the experiment’s impact on it as well. For example, if at the hypothesis stage you used your mobile users’ session recordings to identify frustrations such as repetitive taps, you should analyze user sessions during the experiment for both the control and the challenger(s). The winning version should show fewer instances of such user frustrations.

Likewise, if you used your mobile traffic’s page dropoff rate for your hypothesis, then an improved mobile version should ideally improve that dropoff rate too.

Post-experiment analysis for the cart page A/B test above didn’t just show higher conversions for the winning version (the second one); it also showed visibly lower user frustration, hinted at by smoother scrolling patterns.

You get the idea, right?

So That’s a Wrap!

Mobile websites are no longer lite versions of their desktop counterparts. Users expect them to be as functional as the full-scale sites they access from their 13.3-inch laptops. Whether it’s making a purchase or filling out a form, users want to be able to “do it” on their mobiles.

Any business — even yours — that gets mobile traffic and experiences high dropoffs in it, especially on key pages (such as the cart page for an eCommerce store or the landing page for a B2B business), must work on optimizing its mobile website for conversions.

Failing to do so doesn’t just result in a bad website experience for your mobile users, but also in less business for you, as a well-optimized mobile website is a fast-growing competitive advantage.

So tell us – have you ever run any mobile website conversion optimization experiments? And what are your thoughts on mobile-first optimization?

AUTHOR

Disha Sharma

Disha is the marketing writer at Writerzone. She spends her time focused on traffic-driving topics and learns all about optimization from the billions of experiences.
