As email marketers, we do A/B testing to iterate on what works and discard what doesn’t. So, whether you’re looking to increase revenues, open rates, or clicks, or to generate new leads, A/B testing can help get you there.
Unfortunately, marketers often get stuck trying to find the right test to drive the biggest impact. Or they run out of new ideas, and their A/B testing becomes repetitive.
So, what’s the recipe that will deliver high-impact results?
Truthfully … there is no one-size-fits-all formula.
What works for one company may not work for your business. [We know that’s not what you wanted to hear.]
Although we can’t provide examples that will guarantee results, we can share some actual results from our own A/B testing experiments.
Over the last few months, we ran several A/B tests. Here are six of them.
Just so you know, we usually keep these results hush-hush, but because we want you to succeed … we decided to share them with our readers.
Hopefully, they’ll inspire you to explore new experiments of your own.
First, What is A/B Testing?
Also known as “split testing” or “bucket testing,” A/B testing is a method in which you compare two variants, A and B, against each other. The two versions are identical except for one variation that may impact a user’s behavior. Usually, version A is the control, while version B has been modified in some way. An example would be a call-to-action where A is “Free Webinar” and B is “Get a Free Webinar.” The goal of this A/B test might be to measure click-throughs and/or conversion rates to identify which call-to-action performed better. From there, you’d be able to iterate on the results and optimize your campaign further.
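If you’re wondering how to tell whether the difference between A and B is real or just noise, one common approach is a two-proportion z-test on the two variants’ click-through (or open) rates. Here’s a minimal Python sketch using purely hypothetical numbers for illustration; the function name and figures are ours, not part of any particular email tool:

```python
import math

def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Compare two click-through (or open) rates.
    Returns the z-statistic and a two-sided p-value."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled rate under the null hypothesis (no real difference)
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical campaign: 10,000 sends per variant
z, p = two_proportion_z_test(clicks_a=420, sends_a=10_000,
                             clicks_b=540, sends_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (commonly below 0.05) suggests the gap between the variants is unlikely to be chance; at that point, you can declare a winner and iterate.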
Now, as promised, here are six A/B testing experiments we ran in-house using Clickback’s own Email Lead Generation software.
Experiment #1
Goal: Double sales-qualified lead conversions
The Experiment: Design a new email template

Results:
We ran this A/B testing experiment across eight separate campaigns. Measuring conversions, we increased our lead growth by 150%. That was validation that a fresh new look had a major impact on filling our sales team’s pipeline.
Experiment #2
Goal: Double open rate
The Experiment: Use a token in the subject line
Original Subject Line (Test A): Here’s How to Avoid Rejection in Sales
Variation Subject Line (Test B): [first name token], Here’s How to Avoid Rejection in Sales
Results:
With Test B, we doubled our open rate, which was our goal. But we wanted to know if we could do better, so we iterated on this for our third experiment. Read on to find out how.
Experiment #3
Goal: Increase open rate
The Experiment: Use recipient’s company name in subject line (vs. first name)
Original Subject Line (Test A): [first name token], Quench Your Customers’ Thirst
Variation Subject Line (Test B): Quench the Thirst of [company name token]’s Customers
Results:
For this A/B testing experiment, we increased our open rate by 200%. And that was just from one campaign. To verify the results, we ran this campaign three separate times, and the data remained consistent. In fact, on average, our Test B campaign had 317% more opens than Test A. The conclusion: our recipients were more engaged when their company’s name was mentioned in the subject line than when their own name was.
Experiment #4
Goal: Double our sales-qualified lead conversions by increasing our click-through rate
The Experiment: Use an image of a human (vs. non-human image)

Results:
The tweak for this A/B testing experiment was pretty simple. And again, Test B outperformed, increasing our email click-through rate by 29% and boosting our sales-qualified lead conversions by 100%. It was definitely a win-win.
Experiment #5
Goal: Increase revenues by securing one customer
The Experiment: Write more creative content
Original Content (Test A):

Variation Content (Test B):

Results:
The winner? Test B. Once again.
We shortened the copy slightly and made the content more conversational, increasing our sales-qualified lead growth by 125%. We also secured one customer after running this campaign only twice. Our click-through rate saw a boost of 36%, too. Looks like content really is king.
Experiment #6
Goal: Increase sales-qualified lead conversions by 150%
The Experiment: Purchase new cold contacts and expand our email list
Results:
We know this wasn’t a true A/B test. However, because we did such a good job optimizing our previous five experiments, we decided to replenish our email list, which was over a year old. The goal here was to boost our sales-qualified lead conversions by 150%. And we did. Truth is, we ended up increasing the volume of new leads to our sales team by 325%. Our sales department was not only happy but ecstatic. They ended the month with a new record for most closed customers. Ever.
Now, It’s Your Turn
We’ve shared some stellar results from our own in-house A/B testing experiments. Although we can’t guarantee the same results, you’ve now got some fresh ideas to help you get started.
