A/B Test Your Email Marketing Campaigns


Would you like to improve the open and click-through rates of your email campaigns?
It’s a rhetorical question, of course. Better open and click-through rates result in more website visitors and sales, and every marketer wants that.
But how do you do it?
One way is to start running A/B tests on your email campaigns. In this guide, we’ll show you what A/B testing is and how it can improve your open and click-through rates, as well as arm you with a number of ideas for A/B tests you can run on your email campaigns to get better results.
Chapter 1
What is A/B testing and why should marketers care?
A/B testing, in the context of email, means sending one variation of your campaign to a subset of your subscribers and a different variation to another subset, with the ultimate goal of working out which variation garners the best results.
A/B testing can vary in complexity. A simple test might pit two subject lines against each other to see which one generates more opens, while a more advanced test might compare two completely different email templates to see which generates more click-throughs.
If you’re using an email tool like Campaign Monitor, A/B testing your campaigns is easy. You can use the email builder to create two variations of your email, and Campaign Monitor will automatically send each one to a different subset of your list to see which performs best. Once the test has concluded and a winner has been found, it’ll automatically send the winning version to the rest of your list.
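To make the mechanics concrete, here’s a minimal sketch of how a random split works under the hood, written in Python. The subscriber list and group sizes are made up for illustration, and tools like Campaign Monitor handle all of this for you automatically:

```python
import random

def split_for_ab_test(subscribers, test_fraction=0.2, seed=42):
    """Split a subscriber list into two equal test groups (A and B)
    plus a holdout that later receives the winning variation."""
    shuffled = subscribers[:]
    random.Random(seed).shuffle(shuffled)  # fixed seed keeps the split reproducible
    half = int(len(shuffled) * test_fraction) // 2
    group_a = shuffled[:half]
    group_b = shuffled[half:half * 2]
    holdout = shuffled[half * 2:]
    return group_a, group_b, holdout

# Hypothetical usage with a 10,000-address list and a 20% test group
subscribers = [f"user{i}@example.com" for i in range(10_000)]
group_a, group_b, holdout = split_for_ab_test(subscribers)
print(len(group_a), len(group_b), len(holdout))  # 1000 1000 8000
```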
 
A/B testing your campaigns is a great way to increase the open and click-through rates of your emails. Here at Campaign Monitor, we’ve tested everything from our subject lines to the copy on our call to action buttons. We’ve even tested different templates against each other to see which one works best, and were able to get a 127% increase in click-throughs as a result.
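If you ever tally the results yourself rather than letting your email tool declare the winner, it’s worth checking that the gap between variations is bigger than random noise. Here’s a minimal sketch of a two-proportion z-test in Python, using entirely made-up open counts:

```python
from math import sqrt, erf

def two_proportion_z_test(opens_a, sends_a, opens_b, sends_b):
    """Return the z statistic and two-sided p-value for the difference
    between two open (or click-through) rates."""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up example: variation A opened 220/1,000 sends, variation B 180/1,000
z, p = two_proportion_z_test(220, 1000, 180, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 suggests the difference is real
```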
Chapter 2
What should I test?
Using email marketing tools like Campaign Monitor, you can test almost any aspect of your email marketing campaigns to improve the results. Here are a few ideas to help get you started:
Subject Lines
The subject line is one of the most prominent elements of your campaign when viewed in the inbox. On most devices, the subject line is formatted with darker, heavier text in an attempt to make it stand out among the other details of the email.
Given its prominence in the inbox and its effect on open rates, the subject line should be an area of focus for your A/B testing.
So what about your subject lines can you test to try to drive increased opens? Here are a few ideas:
Length
The ideal length of email subject lines is a hotly debated topic in email marketing, and a recent study from Return Path shows that the optimal length is around 61-70 characters.
However, your subscribers are unique and may react differently than those included in the study. Perhaps they read email on mobile devices more often, or use older email clients that show fewer characters in the subject line field.
The only way to know the ideal length of your email subject lines for your audience is to test it, so consider setting up your next campaign as an A/B test to see what subject line length works best for your audience.
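One way to start is with campaigns you’ve already sent. This hypothetical Python sketch buckets past campaigns by subject line length and compares average open rates; the campaign data is invented for illustration:

```python
from collections import defaultdict

# Hypothetical history: (subject line, open rate) pairs from past sends
campaigns = [
    ("Big summer sale starts now", 0.21),
    ("Get 25% off your next purchase using this discount code", 0.26),
    ("Your weekly roundup", 0.18),
    ("New features you asked for are finally here in the app", 0.24),
]

buckets = defaultdict(list)
for subject, open_rate in campaigns:
    # Group subject lines into 20-character bands (0-19, 20-39, ...)
    band = (len(subject) // 20) * 20
    buckets[band].append(open_rate)

for band in sorted(buckets):
    rates = buckets[band]
    print(f"{band}-{band + 19} chars: average open rate {sum(rates) / len(rates):.1%}")
```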
Word order
The order in which you place the words in your email subject line can make a difference to how people read and interpret them, and can potentially impact your email open rate.
Consider these two example subject lines for the same email:
Use this discount code to get 25% off your next purchase
Get 25% off your next purchase using this discount code
In the second variation, the benefit of opening the email (getting 25% off the next purchase) is placed at the beginning of the subject line. Given that English-speaking subscribers read left to right, this places the emphasis on the benefit readers will get from opening the email and can potentially increase open rates.
So next time you’re writing the subject line for your email campaign, consider testing the order of the words to see if front-loading the benefit can help improve your open rates.
Content
If your email contains multiple pieces of content (a newsletter, for instance), then testing different pieces of content as the subject line can be a great way to improve your email open rates and learn what kind of content resonates with your audience.
Campaign Monitor customer BuzzFeed does an excellent job of this in their newsletters.
Each newsletter BuzzFeed sends contains multiple pieces of content, and they A/B test featuring different content pieces in the subject line to see which drives the most opens.
So next time you’re writing a subject line for your email newsletter, consider testing different pieces of content in the subject line to increase the number of opens your campaign receives.
Personalization
According to our own study on Power Words in Email Subject Lines, the subscriber’s name is the single most impactful word you can add to your subject line, increasing opens by over 14%.
Converse, a Campaign Monitor customer, uses the subscriber’s name to great effect in their email campaigns.
By using Personalization Tags to dynamically insert a subscriber’s name into the subject line field, they help their campaign stand out in the inbox and drive increased opens, click-throughs and sales for their brand.
But would this same tactic work for your own brand and subscriber base?
The only way to know is to A/B test it on one of your upcoming campaigns. If you have your subscribers’ first names stored in your email list, consider adding them to the subject line of your next campaign to see if it drives increased opens and clicks for your business.
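Under the hood, a personalization tag is essentially placeholder substitution with a sensible fallback for subscribers whose name you don’t have on file. A simplified Python sketch (the [firstname] tag and field names here are illustrative, not Campaign Monitor’s actual syntax):

```python
def personalize_subject(template, subscriber, fallback="friend"):
    """Replace a [firstname] placeholder with the subscriber's name,
    falling back gracefully when no name is stored."""
    first_name = (subscriber.get("first_name") or fallback).strip()
    return template.replace("[firstname]", first_name)

template = "[firstname], your 25% discount code is inside"
print(personalize_subject(template, {"first_name": "Alex"}))
# -> Alex, your 25% discount code is inside
print(personalize_subject(template, {}))
# -> friend, your 25% discount code is inside
```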
Visuals
Research shows that the human brain processes visuals 60,000 times faster than text, which means using images in your email campaigns can be a powerful way to get your message across.
Campaign Monitor customer Bank of Tennessee knows this well and uses images effectively in their customer campaigns.
Alongside the copy informing subscribers of the new mobile deposit features, they include beautifully designed visuals that show the feature in action and make it easy for subscribers to understand what the new mobile app can do.
But should you be including visuals in your email campaigns? And what kind of visuals will work best?
The only way to find out is to A/B test. There are a number of things you can test, so to help you get started we’ve compiled a few ideas:
Images or no images
While images clearly have a positive impact on Bank of Tennessee’s email above, that won’t necessarily be the case for your emails, depending on the design and images you choose.
In fact, when SitePoint tested images in their newsletter, they actually saw a slight decrease in conversions, as they found the images distracted people from the content.
So while images may work for some emails, they can detract from others, and the only way to know whether they’re working for you is to A/B test them in your campaigns.
Style
There are many different types of visuals you can include in your email campaigns.
To illustrate, consider these two emails from different Campaign Monitor customers, both of which are using email to announce new features in their products.
Freshbooks chooses to use a drawing style in their campaigns that closely mimics the visual style seen on their website, while SmugMug opts to show a screenshot of the interface displayed inside a Mac laptop.
Which works best? It completely depends on your brand, audience and layout of your campaign, and the only way to know which will work for you is to A/B test it.
So next time you are creating an announcement campaign, consider A/B testing the style of images you include to see which works best for your business.
Copy
According to research by Microsoft, smartphones have left humans with such a short attention span that even a goldfish can hold a thought for longer.
According to the study, the average human attention span has fallen from 12 seconds in 2000 – around the time the mobile revolution began – to eight seconds today.
With so little time to capture a reader’s attention, the tone of your copy matters. We tested this ourselves a little while ago and found that using positive language increased our email conversion rate by 22%.
So next time you are writing copy for your email marketing campaigns, give some thought to the tone in which you are writing and consider testing whether a positive tone could outperform a negative tone when it comes to driving click-throughs and purchases.
Calls To Action
Your calls to action are one of the most important parts of your email marketing campaigns.
They help increase your email click-through rate by making it clear to readers exactly what the next step is.
Campaign Monitor customer Sephora uses them well in their email campaigns, including a prominent call to action that ensures readers know exactly what they need to do next.
Given the importance of calls to action in driving click-throughs, it’s a good idea to A/B test them to make sure you’re getting the best results.
But what should you test? Here are a few ideas:
Button vs. Text
When it comes to creating calls to action in your email campaigns, there are generally two options: adding buttons or using simple hypertext links.
In our own testing, we’ve found that using buttons is a better approach, and we were able to get a 27% increase in click-throughs by using a button instead of a text link.
However, this isn’t necessarily applicable for all campaigns. For instance, SitePoint’s newsletter gets an amazing click-through rate from using simple text links.
So which approach is best for your campaigns?
The only way to know is to A/B test it. Tools like our own email builder make it super easy to add both text links and buttons to your email campaigns and enable you to easily test which one works best for your unique campaigns and audience.
Button Copy
Regardless of whether you choose buttons or simple text links, you must also carefully consider the copy you use for them, as it can affect the number of people who click through from your campaigns.
In our own testing, we found that using specific, action-oriented copy such as ‘Get the formulas’ was better than using generic copy such as ‘Read more’, improving our email click-through rate by over 10%.
Although specific, action-oriented copy may have worked well for us in the test, that doesn’t necessarily mean it’ll have the same effect on your campaigns.
So next time you are creating a campaign for your business, consider testing generic button copy against specific, action-oriented copy to see which works best for your audience.
Chapter 3
3 tips for running more effective A/B tests
Tools like Campaign Monitor, with their drag-and-drop email builders, make it quick and easy to run A/B tests on your email campaigns. No longer do you need to code multiple versions of the email and test them across different devices and email clients; you simply make the changes you want and click send.
However, before you dive in and start setting up A/B tests, there are a few strategic tips you can use to help increase the chances of getting success from your A/B testing.
1. Have a hypothesis
In order to have the highest chance of getting a positive increase in conversions from your A/B test, you need to have a strategic hypothesis about why a particular variation might perform better than the other.
The best way to do this is to come up with a basic hypothesis for the test before you begin. Here are some examples to help illustrate what a basic hypothesis might look like:
We believe personalizing the subject line with the subscriber’s first name will help make our campaign stand out in the inbox and increase the chance it will get opened.
We believe using a button instead of just a text link will make the call to action stand out in the email, drawing readers’ attention and getting more people to click through.
These simple statements, even if just said in your mind, help you define what you are going to test and what you hope to achieve from it, and help keep your A/B tests focused on things that are going to get results.
2. Prioritize your A/B tests
Between subject lines, button colors & copy changes, it’s likely that you’ll have a lot of ideas for A/B tests that you want to run.
However, not all A/B test ideas are created equal and you want to prioritize the ideas that are most likely to get you the best results with the least effort.
To help you do this, you can use the ICE score. Created by influential marketer Sean Ellis, the ICE score is a way to grade your different A/B test ideas and prioritize which ones to run first.
The ICE score has 3 parts:
Impact – How big of an impact do you think this might have? For instance, is testing a small change in your subject line likely to have as big of an impact as testing the tone of your copy?
Confidence – How confident are you that this change will have a positive impact? For instance, testing proven tactics like personalizing the subject line are more likely to have a positive impact on conversions than changing the image style in your campaigns.
Ease – How easy is it to implement this A/B test? For instance, testing the word order of your subject line would take less than 30 seconds, whereas testing different image styles requires you to create multiple images with different styling and will generally take longer.
For each A/B test idea you have, quickly (even just in your head) assess it against the 3 elements of the ICE score to help you grade each idea and prioritize which ones you should execute on first.
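If you’d rather score ideas on paper than in your head, the arithmetic is simply the product (or average) of the three ratings. A quick Python sketch, using a 1-10 scale and made-up ratings:

```python
# Each idea is rated 1-10 on Impact, Confidence and Ease (ratings invented)
ideas = [
    ("Personalize subject line with first name", 7, 8, 9),
    ("Test a new image style in the hero section", 6, 4, 3),
    ("Reorder subject line to front-load the benefit", 5, 6, 10),
]

# Score each idea as the product of its three ratings; highest runs first
ranked = sorted(ideas, key=lambda idea: idea[1] * idea[2] * idea[3], reverse=True)

for name, impact, confidence, ease in ranked:
    print(f"{impact * confidence * ease:4d}  {name}")
```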
3. Build on your learnings
Unfortunately, not every A/B test you run is going to result in a positive increase in conversions. Some of your variations will decrease conversions, and many won’t have any noticeable effect at all.
The key is to make sure you learn from each A/B test you run, and use that knowledge to create better campaigns next time.
Campaign Monitor customer SitePoint does a good job of this with ‘Versioning’, their daily newsletter that goes to over 100,000 subscribers.
For an entire week they ran different A/B tests on the newsletter, changing the template, testing images, testing fonts, etc.
Not every test they ran resulted in a positive increase in conversions (adding images to the email actually decreased conversions, for instance), but with every test they learned more about what works and what doesn’t for their audience, and by the end they’d arrived at a new email design that generated a 32% increase in conversions.
Chapter 4
In conclusion
As we’ve seen, tools like Campaign Monitor make running A/B tests quick and easy: build the variations you want to test, click send, and view the results as they come in.
So get started A/B testing your email campaigns today. Create a hypothesis for how you could improve your campaign, set it up as an A/B test, click send and see what happens. In the best case you’ll get an increase in opens or click-throughs, and if not, you’ll learn something about your audience that will help you create better campaigns in the future.
