I’m sure you’ll agree:
There’s no better way to increase landing page conversions and ROI than by split testing different page elements.
And yet, despite running dozens of tests, you never seem to get results even remotely close to what others are reporting.
Adding another image didn’t convince more visitors to buy.
And despite moving the form to the right, the number of signups hasn't increased one bit.
So what’s wrong?
The answer is actually quite simple – you’re testing the wrong things.
And in this week’s post I’m going to show you what not to split test on your landing pages.
Ready? Let’s roll then…
It’s only natural that since the headline is the most important element of a landing page, you may want to test it to find the version that converts best.
The problem with that approach is that by testing different headlines, you create a disparity between your landing page and the ad.
I’ve already explained the reasoning behind this here, so just to reiterate:
The first question anyone landing on your page from an ad is going to ask is, “Am I in the right place?”
And then, the person will most likely begin looking for cues that the page can actually deliver what the ad has promised.
Using different headlines will most likely suggest that the two are in fact disconnected and will make the person bounce off.
Because, as Dr. Ed Chi, a researcher at Xerox PARC, suggested, we humans follow information much as animals follow a scent. This behavior is called “hub and spoke” surfing, and here’s how one article explains it:
“They [humans] begin at the center, and they follow a trail based on its information scent…. If the scent is sufficiently strong, the surfer will continue to go on that trail. But if the trail is weak, they go back to the hub.”
According to Dr. Ed Chi, people follow this process until their needs are fully satisfied.
By testing different headlines, you break the scent trail and create a sense of irrelevancy, suggesting that your visitor might have landed in the wrong place after all.
That’s why in this test, a headline matching the ad outperformed a generic one by 115%.
The above however doesn’t mean that you can’t split test your headlines.
When you do, make sure you use the same headline variations in both your ads and their corresponding landing pages.
For some headline A/B test ideas, check out my last week’s post here on AdEspresso.
In online advertising, an ad’s image is an extension of its core message.
Just as with a headline, visitors use images to determine the landing page’s relevancy to the ad.
Seeing the same image they saw in the ad will convince them that the two are connected.
But a different image might create a sense of disconnection between the two and push a user off the page.
At the same time, images can affect conversions. And for that reason alone it’s important that you test different ones:
- Images featuring your product,
- Less cluttered images with fewer elements,
- Close-ups showing people’s emotions,
- Images where the person is looking toward the call to action,
- Product close-ups, and many more.
However, every time you test a different image, make sure you use the same one in the ad as well.
Testing different color variations is in fact the most common split test.
It seems that the first thing every A/B testing newcomer tries is whether a call-to-action button will perform better if it’s red (or any other color, for that matter).
It comes as no surprise – after all, the Web is full of examples of similar experiments.
But they’re all based on a flawed premise.
It’s true that colors affect our emotions:
- Brown suggests warmth and clarity,
- Red, excitement and action,
- Blue means trust,
- Green is typically associated with health or growth, and
- Silver suggests calm balance.
But when it comes to your call to action (and other elements on the page, too), color by itself doesn’t matter.
What matters, however, is whether the button stands out on the page so that visitors notice it right away.
Color can certainly help achieve that, but ultimately, which color you choose matters less than how much contrast it provides against the other elements on the page.
The same goes for the button’s copy.
It doesn’t matter what color it’s set in as long as:
- The button is clearly readable and,
- It entices a person to take action.
So instead of testing colors, split test the button’s visibility. If that means changing its color, do so. It might also involve testing different sizes or copy.
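If you want a quick, objective check of how strongly a button color stands out against its background before putting a variant into a test, the WCAG contrast-ratio formula is a handy yardstick. Here’s a minimal sketch (the formula is from the WCAG spec; the example colors are purely illustrative):

```python
# Sketch: WCAG 2.x contrast ratio between two sRGB colors,
# e.g. a CTA button color vs. the page background behind it.

def _linear(c):
    # Linearize an sRGB channel given as 0-255.
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    # Relative luminance per the WCAG definition.
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    # Ratio ranges from 1:1 (identical) to 21:1 (black on white).
    lighter, darker = sorted((luminance(color_a), luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# White button text on a black background hits the maximum 21:1.
print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))  # 21.0
```

A higher ratio between button and background is a reasonable proxy for “stands out,” which makes it easy to compare candidate variants on paper before spending test traffic on them.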
It’s a common cliche:
The position of elements on a page affects their performance.
Perhaps there is some truth in it.
But there’s something far more important than placement: usability.
In other words, swapping elements from left to right might make a difference, but making them easier to use will make a far bigger one.
Take an email form, for example.
Make it too long and, regardless of where you place it on the page, users still won’t complete it.
The same goes for a checkout:
According to Statista, for instance, 8 of the 14 most common reasons for shopping cart abandonment are related to a poor checkout experience.
Therefore, instead of shuffling forms and other elements from one side of the screen to the other, consider running tests that improve their usability:
- Test a shorter version of the form,
- Try different form headlines to see which one convinces users to act,
- Do the same with the form’s Call to Action button,
- If you have to use a longer form, see if splitting it into a number of sections or pages improves conversions,
- Split form fields into two columns, and so on.
Here are a few examples of well-designed forms (which could have been placed anywhere else on the page):
I’m sure you’ll agree:
Writing ad copy can be challenging.
For one, character limits often make it impossible to include all the information you’d want, let alone leave space for the language your audience resonates with.
But those limitations don’t exist on landing pages.
So it might be tempting to test language variations. This often includes testing different headlines, but also:
- A different discount or other incentive. Your ad might promise a 15% discount, but faced with low uptake, you might decide to test whether changing it to 20% makes a difference.
- Different social proof. For instance, featuring a testimonial in the ad but customer logos on the landing page.
- Urgency on the landing page (but not in the ad). For instance, your ad might offer a “free trial,” but you might decide to test a different call to action (e.g., “sign up for a trial now”) on the landing page.
The problem with these tests is that, once again, they create a discrepancy between the language of the ad and the wording on the landing page.
For instance, “free trial” suggests a quick way to test the app at no cost. But “sign up for a trial now” might tell users they may have to commit to the product long-term.
Therefore, even if you decide to test the wording on the page, keep the message consistent between the ad and the landing page.