
Does Engagement On Facebook Ads Increase Their Performance? A $1,000 Experiment

Facebook and Instagram are two of the largest social media platforms around, the key word being social.

Social proof is the likes, comments, and shares that an ad accumulates; users can engage with an ad in the same way they engage with a post from their friends and family.

Will an advert with lots of engagement result in more clicks and conversions than an identical ad with no social proof?

Read on to find out!

Sometimes, as digital marketers, it’s easy to get wrapped up in pixels, algorithms, and campaign optimization, and forget that the key aim of Facebook is for people to connect and engage with one another via posts and comments.

So, in this experiment, we’re going to investigate how important social proof is for ad performance.

Experiment Hypothesis

As you scroll through your Facebook newsfeed, approximately one in every five posts that you see is a sponsored post.

As the average Facebook user spends approximately 38 minutes a day on the platform, that means there is the potential to scroll past dozens of ads. Indeed, nearly $17.4 billion was spent on Facebook advertising in the third quarter of 2019.

How do advertisers cut through the noise generated by that huge volume of ads?

Often we aim to produce “thumb-stopping content” i.e. an ad that stops people in their tracks when they’re thumbing through their newsfeed. If you can get the attention of a user for a couple of seconds, then you have a chance to hook them in with the features and benefits of your product and a compelling call to action.

Does an ad with a high number of likes, comments, and shares help it to stand out in the newsfeed?

We hypothesized that social proof would make a difference, as it indicates a post is interesting and engaging to others and probably of higher-than-average quality.

Our hypothesis is that an ad with more social proof will achieve more conversions, all things being equal.

The visuals, text, and call to action for the advert may have more impact, but of two otherwise identical ads, the one with the most engagement should get more clicks and conversions.

Experiment Setup: The Adverts

In this experiment, we wanted to test two identical ads, so that the only variable was the amount of engagement on each ad.

We started by creating two copies of an ad used in previous experiments; this creative had already been tested and delivers a low cost per lead, allowing us to collect plenty of conversion data.

Here’s the ad that we created; it’s a standard image ad with a “download” call-to-action button:

Before putting any ad spend behind it, we asked AdEspresso customers to leave a positive comment on one copy of the ad if they had previously downloaded and enjoyed the eBook.

We then boosted the ad for page post engagement and, by using a worldwide audience, added 3,200 post likes for just $14.

In total, the ad had 3,200 likes, 18 positive comments, and 10 shares, all visible to anyone who saw the post:

To be clear, we didn’t ask for any false reviews as that is against the law in some countries and could also harm the relationship you have with your prospects.

We also didn’t buy fake likes as these are against Facebook’s terms of service and could also reduce your credibility.

Experiment Structure

The aim of the campaign was to drive traffic to a landing page and get signups for our Ultimate Guide to Custom Audiences eBook.

This would allow us to track and optimize for leads, with the winning ad being the one with the most conversions.

Campaign structure:

We put the two ads into separate campaigns so that we could allocate an identical amount of budget to both versions (with and without social proof).

Although this could cause some internal competition, the amount should be small given the large audience size and relatively low budget.

Audience

A 2% lookalike audience was created based on people who had purchased AdEspresso within the past 180 days. We targeted US-only, males and females, ages 21–64.

Current leads and AdEspresso customers were excluded. The audience size available to target was 3.2 million.

Placement:

Facebook mobile newsfeed.

Budget:

We allocated $75 per campaign per day, giving a combined daily total of $150 for approximately 7 days.

Bid Strategy

Lowest cost, no cap.

Optimization:

We optimized for conversions, tracking the Lead event.

Experiment Results

Both ad variations generated over 100 conversions; the main results are summarized in the table below:

Social proof on the ad decreased the CPM (cost per 1,000 impressions), cost per click and cost per conversion and resulted in more leads in total.
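For readers less familiar with these metrics, they are all simple ratios of spend to delivery. Here’s a minimal sketch of how they’re calculated; the campaign totals used are hypothetical, not the experiment’s actual figures:

```python
def cpm(spend, impressions):
    """Cost per 1,000 impressions."""
    return spend / impressions * 1000

def cpc(spend, clicks):
    """Cost per click."""
    return spend / clicks

def cost_per_conversion(spend, conversions):
    """Cost per lead (conversion)."""
    return spend / conversions

# Hypothetical totals for one ad variation (illustrative only)
spend, impressions, clicks, conversions = 525.0, 50_000, 900, 110

print(f"CPM: ${cpm(spend, impressions):.2f}")                   # $10.50
print(f"CPC: ${cpc(spend, clicks):.2f}")                        # $0.58
print(f"CPL: ${cost_per_conversion(spend, conversions):.2f}")   # $4.77
```

Lowering the CPM means the same budget buys more impressions, which, at a similar click and conversion rate, flows straight through to a lower cost per conversion.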

The ad without social proof had a 7.7% increase in cost per conversion.

It should be noted that this result isn’t statistically significant; because the difference is small, there is a chance these results are down to random variation.
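A quick way to check significance for a result like this is a two-proportion z-test on the conversion rates. The post doesn’t publish the raw counts, so the numbers below are hypothetical, chosen only to illustrate how a small difference between two samples can fail to reach significance:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 118 vs. 108 conversions from 5,000 clicks each
z, p = two_proportion_z_test(118, 5000, 108, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With counts like these, the p-value comes out well above the usual 0.05 threshold, so the observed gap could easily be random variation rather than a real effect of social proof.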

Experiment Conclusions

The main conclusion is that, as predicted, engagement on the ad does result in a lower cost per conversion. However, the difference is small at 7.7%, and random variation could be influencing the results.

Therefore we would still look to re-use ads that have built up social proof, but it wouldn’t be our only focus.

Instead, the way to reduce advertising costs as much as possible is to do as much split testing as is practical.

By way of example, in our previous $1,000 experiments we found that the best ad format can be 2.8 times cheaper than the worst; similarly, testing video thumbnails can halve costs, and even the call-to-action button makes a 2.5X difference to the cost per lead.

Only once you’ve done several rounds of split testing and no longer see any significant reduction in CPA is it worth picking the very best ads and rolling them out to multiple high-quality audiences.

Having positive comments and likes and shares on the post is the icing on the cake but not as important as the ad creative or an optimized landing page.

Final Thoughts

One final thing to mention is that this experiment only considered whether existing social proof on identical ads influenced user behavior.

What we couldn’t take into account is whether getting lots of interaction during the campaign would change the outcome.

This chart shows how much engagement each ad got during the experiment; the rates are very similar:

The relevance metrics are also identical:

Potentially, an ad that gets lots of likes, comments, and shares while the campaign is running could earn higher relevance rankings, which in turn could result in a lower CPM and cost per click, and therefore more leads.

So, there’s still a case for making ads as engaging as possible.

Have you found that social proof on an ad makes a difference? Or does it have little effect? Let us know your thoughts in the comments below.