A/B testing your Facebook ads can be super fun.
There are so many different ad elements and variations you could test (and play with): The ad copy, images, audiences, placements…
But where should you start? And which testable ad elements help to get the fastest improvement in results?
After helping several startups with their Facebook ad campaigns, I’ve learned that you can’t always A/B test everything.
You need a fairly large ad budget to get statistically significant results, which means a single test may have to run for a couple of weeks.
That’s a lot of waiting…
However, you can cheer yourself up knowing that the results you get will likely be highly valuable.
So what are the ad elements that should be given the priority in testing over others?
In my experience, the two most important things to get right are your target audience and your ad visuals. Then, you can refine your campaigns even further by experimenting with other elements.
To inspire you to get started with Facebook A/B testing, we listed 16 experiment ideas that will keep you busy (and your ad results improving) for months to come!
1. A/B test your Facebook Saved Audiences
Facebook Saved audiences are based on people’s location, demographics, interests, etc.
As it’s the easiest Facebook audience to create, that’s what most advertisers start with.
But you shouldn’t limit yourself to a single target audience. You can easily deliver your ads to people with different interests, genders, or locations – and see which audience converts at the lowest cost.
Facebook split testing tip: When testing different campaign elements, place every variation in a separate ad set, so that Facebook won’t start auto-optimizing before you have your experiment results.
2. Test Saved vs. Lookalike vs. Custom Audiences
Your Facebook campaign’s target audience can make a huge difference in results, especially if you’re testing Saved vs. Lookalike vs. Custom Audiences.
The key difference between these audiences is how relevant your ads are to them. For example, a Custom Audience of past website visitors is already familiar with your product and offers, which also makes them more likely to convert.
When split testing Saved vs. Lookalike vs. Custom Audiences with SaaS startup Scoro, we discovered that the remarketing audiences had more than 200% higher ROI compared to other Facebook audiences we were targeting.
Facebook split testing tip: Create 2-5 different target audiences with little or no audience overlap, and use the EXCLUDE feature to keep your Custom Audiences out of the other audiences.
3. A/B test your ad type
Once you’ve discovered the perfect Facebook audience to target, I recommend that you proceed to testing your ad layout and images.
Facebook has more than 10 different ad types and each has its best practices and use cases. Depending on the product you’re selling, you could test Dynamic ads, Lead ads, Canvas ads, etc.
For example, Scoro tested a Carousel ad vs. regular image ads.
Guess which variation returned better results…
Against all expectations, it was the regular image ad that had the higher CTR and a 300% lower cost-per-conversion.
Facebook split testing tip: Sometimes, the best tests are the ones you think you already know the answer to. Try experimenting with counterintuitive hypotheses to learn what really works.
4. Split test your ad image colors
A study by Consumer Acquisition found that ad images have a huge effect on your ad results — they’re responsible for 75%-90% of ad performance.
That’s exactly why testing your ad image’s background color can help improve your campaign ROI big time.
For example, Infusionsoft is testing two ad images with completely different layouts and color schemes.
Facebook split testing tip: Let your experiments run until you’ve got at least 100 clicks on each ad – 300 is even better. This way, you’ll ensure that your test results are statistically valid and a variation didn’t win by chance.
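If you’d rather not eyeball significance, a quick two-proportion z-test tells you whether the CTR gap between two ads is likely real or just noise. Here’s a minimal sketch using only Python’s standard library – the click and impression counts are hypothetical examples, not real campaign data:

```python
from math import sqrt, erf

def ctr_significance(clicks_a, impressions_a, clicks_b, impressions_b):
    """Two-proportion z-test: is the CTR difference between two ads
    likely real, or could it have happened by chance?"""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    # Pooled CTR under the "no real difference" hypothesis
    p_pool = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: ad A got 120 clicks, ad B got 90, on 4,000 impressions each
z, p = ctr_significance(120, 4000, 90, 4000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 suggests a real difference
```

A p-value under 0.05 is the conventional threshold; with fewer clicks than the 100-300 suggested above, even large CTR gaps often fail this test.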
5. Split test a stock photo vs. an illustration
Testing a stock photo vs. a custom illustration is a test worth repeating every 3-6 months. That’s because the winning option from the last experiment may have grown stale, and people are tired of seeing it.
For example, if you’ve discovered that a custom illustration has a 40% higher conversion rate, it can lose its advantage over time when it’s not new and interesting to your audience anymore.
Eventbrite, for instance, is split testing a colorful stock photo against a highly simplistic ad with a custom background.
AdEspresso makes split testing images within your campaign extremely easy. Simply upload your stock photo and your illustration into the campaign builder, and a split test of the two images will be created.
Once the split test is created, you can use the ads reporting tool to analyze the performance of each image in your campaign to see which ones are yielding the best results for you.
6. A/B test in-image text vs. no text
Your ad image is the first thing people notice when coming across the offer in their news feeds. That’s why it could be a smart idea to place your key value offer right in the ad image.
However, it could also work against you, as you might overcrowd your ad.
So what should you do?
Run an A/B test, of course!
For example, Autopilot is split testing two ad variations, one of which lists their product’s top features in the ad image.
Facebook split testing tip: As you may have noticed, many of the examples in this article test multiple ad elements at once. However, I recommend that you experiment with a single ad element per test. Otherwise, you won’t be able to tell which ad element contributed to a specific variation’s success.
Especially if you’re split testing Facebook ads on a low budget, you shouldn’t create more than 3-5 variations per ad campaign.
7. Split test images vs. videos
As we already explained in our article about 18 Facebook ad hacks (you should check it out!), video ads are one of the most overlooked opportunities in Facebook advertising.
A report by Kinetic Social showed that video ads have the lowest cost-per-click compared to other ad types, with an average CPC of $0.18.
Here’s a video ad example by AdEspresso, explaining their product’s benefits. You don’t need to create a complex (and expensive) video, a simple one could work as well.
Facebook split testing tip: Give your A/B tests enough time and budget so that your results are reliable. Avoid concluding experiments too early.
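To get a feel for how much data “enough” actually is, the standard two-proportion sample-size formula gives a rough per-variation estimate. The baseline conversion rate and target lift below are hypothetical, and the formula is a planning aid, not a guarantee:

```python
from math import ceil

def sample_size_per_variation(base_cr, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Rough sample size per variation to detect a relative lift in
    conversion rate at ~95% confidence and ~80% power."""
    p1 = base_cr
    p2 = base_cr * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2) * variance / (p1 - p2) ** 2
    return ceil(n)

# Hypothetical scenario: 5% baseline conversion rate, hoping to detect a 30% lift
print(sample_size_per_variation(0.05, 0.30))
```

Note how the required sample shrinks as the expected lift grows – small improvements take far longer to prove than dramatic ones, which is one more reason to test bold variations first.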
8. Test colorful vs. light ad images
There are arguments that support both options.
Our eyes are naturally drawn to colorful images. However, a light ad image with a white background can stand out from the rest of the news feed content, immediately catching your eye.
For example, would you rather notice the light ad or the darker carousel ad by Squarespace?
Don’t limit yourself to testing your ad’s colors. Try removing the background color altogether and see what happens.
9. Test the reversed ad image vs. original
In addition to being a fascinating split testing idea, reversing your ad image is also a great Facebook hack for avoiding ad fatigue.
If you’re unsure what the reversed-image hack looks like, see the example below.
After we made a simple ad layout change to Scoro’s ads, we saw a 30% increase in click-through rates and a 20% rise in conversions. I guess our target audience really was tired of seeing the first variation.
10. A/B test your Facebook ad headline
There’s plenty of research on how to write better headlines.
For example, by starting your headline with a number, you’re 36% more likely to have people click on your ads.
But you can’t tell for sure unless you’ve tried different options on your own. After all, every brand is different and so are their audience’s preferences.
CoSchedule, for instance, has tested two completely different ad headlines: “A Single Home for ALL of Your Marketing Efforts” vs. “Content + Social Media = Organized.”
Facebook split testing tip: When experimenting with different ad headlines, don’t just change the wording – test completely different offers (e.g. a benefit vs. your product’s top feature).
11. Test ad copy with emojis vs. no emojis
When managing Facebook ads at Scoro, we ran a small A/B test to test whether emojis have any effect on the click-through and engagement rates.
They most certainly did!
According to Scoro’s test results, the ad with emoji in the headline had a 241% higher click-through rate than the ad with no emoji.
This is why I recommend that you test placing emojis not only in your ad headline but also in the main ad copy.
Adding emojis to your Facebook ads is super easy:
- Find an emoji in Emojipedia
- Copy it to your clipboard
- In the editing phase, paste the emoji into your ad copy
12. A/B test short copy vs. long copy
While Facebook allows up to 500 characters in the main ad text, it doesn’t mean you have to add that much text.
Short and clear ad messages require less effort to read, so people tend to prefer them – especially when they encounter your ad in a crowded news feed.
For example, Intercom has an interesting way of testing different ad copy lengths or leaving some text placements completely unfilled.
Facebook split testing tip: As a general Facebook A/B testing rule, you should keep your ad images and placements similar when testing different ad copy variations.
13. Split test your landing pages
Getting people to click on your Facebook ads is only half the victory. You also need them to sign up for your offer or complete a purchase.
After someone clicks on your ad and lands on your website, you have about 10 seconds to convince them to stay. That’s when landing page optimization enters the game.
You can split test with various landing page changes or drive your ad viewers to completely different pages, e.g. your homepage vs. a dedicated landing page vs. a product description page.
You could also experiment with leading people to a web page with a sign-up form.
14. Experiment with your ad’s CTA button
When setting up your Facebook ad campaigns in the Ads Manager, you can select between multiple call-to-action buttons:
- Sign Up
- Contact Us
- Learn More
Usually, your offer matches several CTAs, so which one should you choose? And does the choice of call-to-action button really have any effect on the results?
Well, sometimes, a CTA can make all the difference.
By conducting an A/B test, Scoro found that while ads with the Learn More CTA had a higher CTR, it was the Sign Up CTA that had a 14.5% higher conversion rate.
Facebook split testing tip: When concluding your Facebook tests, look at the right metrics. It’s not the cost-per-click or click-through rate you should track. These metrics do not necessarily translate into sales. Always measure the cost-per-conversion when A/B testing your Facebook ads.
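To see why cost-per-conversion can contradict CTR, here’s a minimal sketch with entirely hypothetical numbers: ad A wins on click-through rate, yet ad B converts more cheaply – mirroring the Scoro CTA result above.

```python
def cost_per_conversion(spend, conversions):
    """The metric that matters: how much you pay for each conversion."""
    return spend / conversions

# Hypothetical campaign data (same spend, different click and conversion counts)
ad_a = {"spend": 500.0, "clicks": 1000, "conversions": 20}  # the CTR winner
ad_b = {"spend": 500.0, "clicks": 700, "conversions": 35}   # the real winner

for name, ad in (("A", ad_a), ("B", ad_b)):
    cpc = ad["spend"] / ad["clicks"]
    cpconv = cost_per_conversion(ad["spend"], ad["conversions"])
    print(f"Ad {name}: CPC ${cpc:.2f}, cost per conversion ${cpconv:.2f}")
```

Ad A looks better on every click metric, but ad B delivers conversions at a noticeably lower cost – which is why cost-per-conversion should decide the winner.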
15. Split test different campaign objectives
Facebook has more than ten different campaign objectives, from brand awareness to mobile app downloads.
And according to AdEspresso’s analysis of Facebook ads cost, different campaign objectives can lead to different cost-per-click results.
When testing different campaign objectives, you need to create multiple Facebook ad campaigns.
Make sure that the rest of the campaign elements (as well as ad elements) remain the same across all your test variations for relevant and actionable results.
16. Split test your Facebook ad placement
According to the latest Facebook news, the social media platform has more than 1.9 billion users.
Not all of these users have the same usage patterns – some prefer to browse Facebook’s news feed from their computers, while others like the mobile browsing experience.
Your job as a marketer is to find out where your target audience likes to hang out.
And to do that, you should split test your Facebook ad placements to know which one returns the highest results.
In the image below, you can see that the Mobile + Audience Network placement had a 212% higher click-through rate. However, it was the Desktop placement that brought all the conversions. The winner? – Desktop newsfeed.
Facebook split testing tip: Before you start A/B testing your Facebook ads placement, double check your mobile advertising best practices checklist and ensure you have a mobile-friendly landing page.
When experimenting with different ad placements, keep in mind that your ad copy can vary across placements, with News Feed ads containing the most text.
As you can see, there are tons of different options for testing and optimizing your Facebook ad campaigns.
Here’s a quick overview of all the suggestions covered in this article:
- A/B test your Facebook Saved Audiences
- Test Saved vs. Lookalike vs. Custom Audiences
- A/B test your ad type
- Split test your ad image colors
- Split test a stock photo vs. an illustration
- A/B test in-image text vs. no text
- Split test images vs. videos
- Test colorful vs. light ad images
- Test the reversed ad image vs. original
- A/B test your Facebook ad headline
- Split test ad copy with emojis vs. no emojis
- A/B test short copy vs. long copy
- Split test your landing pages
- Experiment with your ad’s CTA button
- Split test different campaign objectives
- Test your Facebook ad placement
And now, it’s time to put all the new insights into action and set up a Facebook ads experiment!