It seems every article I read on “top email marketing tips” or “email marketing best practices” extols the virtues of A/B testing. “Always be testing” has become the mantra in many marketing departments. I’m about to make a bold statement that many may disagree with: most marketers, especially in the SMB space, shouldn’t be spending resources running email A/B tests.

Don’t get me wrong: testing is important in email. We do quite a bit of it ourselves! But over the past few years, I’ve encountered many marketing teams that treat A/B testing as another box to check rather than running tests in a way that delivers real gains.

Many teams test on audiences too small to achieve statistically significant results. Other times, the goal of the test is never made clear. A/B tests aren’t worth much if the results are inconclusive, or if they don’t produce insights you can act on.
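
To put rough numbers on that first problem, here’s a minimal sketch using the standard two-proportion sample-size formula. The 20% baseline open rate and the two-point lift are illustrative assumptions, not benchmarks:

```python
# Sketch: minimum audience size per variant to detect a lift in open rate.
# Uses the standard two-proportion sample-size formula; the 20% baseline
# and 2-point lift are illustrative assumptions, not benchmarks.
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    """Subscribers needed in EACH variant to detect a shift from p1 to p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p1 - p2) ** 2) + 1

# Detecting a 20% -> 22% open-rate lift at 95% confidence and 80% power:
print(sample_size_per_variant(0.20, 0.22))  # ~6,500 subscribers per variant
```

That’s roughly 13,000 subscribers for a single two-variant test — more than many SMB lists contain in total.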

Why are you testing? Many marketers would say “to improve” or “to learn what works and what doesn’t.” To get better results over time, you have to form a hypothesis, test it, and then track the results. But when you dig a bit deeper, it becomes clear that many tests have no defined goal and no tracked outcome. Creating variants simply to check a box is a waste of time and mental energy.

“Yes, we tested! Email 1 had a statistically insignificant win over email 2!”

Marketing bloggers love to write stories about how changing one element on a landing page, say the color of a button, the size of a font, or the type of image, created a game-changing result. But talk to most conversion rate optimization specialists and they’ll tell you that results rarely come that easily.

Teams typically run many inconclusive tests before finding a big winner. Other times, there’s never a single big winner; instead, incremental improvements add up to major gains over time.
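
A quick bit of arithmetic shows why those incremental wins are worth chasing; the 2% lift per winning test is purely an assumed figure for illustration:

```python
# Sketch: how small, incremental wins compound across successive tests.
# The 2% lift per winning test is an illustrative assumption.
lift_per_test = 0.02
wins = 10
cumulative = (1 + lift_per_test) ** wins - 1
print(f"{wins} wins of {lift_per_test:.0%} each compound to {cumulative:.1%}")
# -> 10 wins of 2% each compound to 21.9%
```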

With a landing page or digital ad, you can test one element at a time and build on your results.

Maybe you start by testing two headlines. The first test is inconclusive, so after a few weeks you try again. After round two, you find a winner. You can keep repeating this process and slowly home in on the ideal formula for your audience. Eventually, you’ll have a page with better-performing headlines, body copy, images, and so on.

This brings us to the unique challenge of A/B testing email. A landing page continues to deliver better and better value as you improve each element. With email, and especially with a traditional email blast, you only see the winner after the fact. You learn what worked, but without a clear understanding of why it worked, you don’t know what will work next time.

In addition, A/B testing email introduces another variable that can bias results: send time. The way most email A/B testing software works, a portion of the emails go out first to gauge response, and the rest go out once a winner is found or a set amount of time has passed. That means either the test cohort or the remaining audience gets the email at a less-than-ideal time. Send time can have a significant impact on opens, clicks, and conversions, so in some cases the lift from sending a slightly higher-performing email is erased by the fact that the blast went out at the wrong time.
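
A toy simulation makes that tradeoff concrete. Everything here — the open rates, the 20/80 split, and the penalty for the delayed send — is an assumed figure for illustration, not data from any real campaign:

```python
# Toy simulation of the send-time tradeoff described above. All numbers
# (open rates, send-time penalty, split sizes) are illustrative assumptions.
import random

random.seed(42)
AUDIENCE = 10_000
TEST_SHARE = 0.20            # 20% of the list receives the A/B test
RATE_A, RATE_B = 0.20, 0.22  # true open rates at the IDEAL send time
LATE_PENALTY = 0.03          # absolute open-rate drop for the delayed send

test_size = int(AUDIENCE * TEST_SHARE)
half = test_size // 2

def opens(n, rate):
    """Simulate n sends, each opened with the given probability."""
    return sum(random.random() < rate for _ in range(n))

# Phase 1: the test cohort goes out at the ideal time, split 50/50.
opens_a = opens(half, RATE_A)
opens_b = opens(half, RATE_B)
winner_rate = RATE_B if opens_b > opens_a else RATE_A

# Phase 2: the remainder gets the "winner," but later, at a worse time.
remainder = AUDIENCE - test_size
opens_rest = opens(remainder, winner_rate - LATE_PENALTY)

ab_total = opens_a + opens_b + opens_rest
baseline = opens(AUDIENCE, RATE_A)  # everyone gets A at the ideal time
print(f"A/B flow: {ab_total} opens vs. single ideal-time send: {baseline}")
```

Under these assumptions the “optimized” A/B flow actually loses to simply sending the original email to everyone at the ideal time, because 80% of the list paid the send-time penalty.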

Setting aside the technical challenges of A/B testing email, there’s an even bigger reason I’d advise many marketers to focus on other ways to improve their email program. Based on our experience, most marketers are leaving huge wins on the table.

Many companies have inboxing issues and don’t know it, sometimes effectively reaching only half their audience. Others don’t authenticate their email (with SPF, DKIM, and DMARC) or use even basic personalization. These companies would be far better off fixing those deficits than investing resources in testing.
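
If you’re not sure where your own domain stands, a quick check is straightforward. This is a minimal sketch using the third-party dnspython package; example.com is a placeholder, and DKIM is omitted because looking it up requires knowing your sender’s selector:

```python
# Sketch: check whether a sending domain publishes SPF and DMARC records.
# Requires the third-party dnspython package; the domain is a placeholder.
import dns.resolver  # pip install dnspython

def txt_records(name):
    """Return the TXT records published at a DNS name, or [] if none."""
    try:
        return [r.to_text().strip('"') for r in dns.resolver.resolve(name, "TXT")]
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return []

domain = "example.com"  # replace with your sending domain
spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]

print("SPF:  ", spf[0] if spf else "MISSING")
print("DMARC:", dmarc[0] if dmarc else "MISSING")
```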

A/B testing is by nature a blunt instrument: you’re trying to find a single message that works for a large audience. Investing in better segmentation, audience data, or personalization, on the other hand, gives you much more granular control without forcing you to compromise on your messaging.

We could list many tools that will help you enrich data and personalize content (and don’t forget send time!), but the best place to start is with the basic tools in your existing marketing platform.

So when should you start A/B testing? If you can clearly answer why you’re testing, what you’re testing, and how you’ll measure the results, you’ll be in a much better place to start.

 

Written by Mike Donnelly

Founder and CEO - Seventh Sense