People often dread tests because they don't want to fail. Failure means you'll have to take the test again, after you've spent even more time studying or working on your health.
Email testing is different.
The likelihood of failure is higher when you don't test. You won't get as many opens or clicks, and you definitely won't learn anything about your audience and how they view you, your content, and your organization.
Continuous testing and learning are essential for ongoing success in email marketing. Whether you’re using a simpler A/B approach, in which two versions of copy or design (version A and version B) are compared against each other, or a multivariate strategy, which compares a larger number of variables and reveals how those variables interact with one another, the critical point is the same: you should always be testing.
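To make an A/B comparison fair, each version should go to a randomly selected slice of your list. Here is a minimal sketch in Python; the function name and sample addresses are illustrative, not part of any particular email platform:

```python
import random

def assign_ab_groups(recipients, seed=42):
    """Randomly split a recipient list into two equal test groups.

    `recipients` is a hypothetical list of email addresses; the fixed
    seed makes the split reproducible across runs.
    """
    rng = random.Random(seed)
    shuffled = recipients[:]          # copy so the original list is untouched
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Group A receives version A; group B receives version B
group_a, group_b = assign_ab_groups(
    ["a@example.com", "b@example.com", "c@example.com", "d@example.com"]
)
```

Randomizing (rather than, say, splitting alphabetically) keeps the two groups statistically comparable, so any difference in results can be attributed to the version they received.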
ABT: Always Be Testing
The most effective approach to email testing is to start at the “top,” with subject lines and preheaders. The preheader can be tested in combination with the subject line, as long as you always pair the same subject line with the same preheader, treating them as one unit. You can also test the subject line and preheader independently of each other. For example, in an A/B test you could use one subject line for both sends and a different preheader for each (A vs. B). That isolates the preheader, so you can see exactly which version made a difference in open rate.
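That pairing logic can be sketched as follows. The `build_variants` helper and the sample copy are hypothetical, but they show the key constraint: the subject line stays fixed while only the preheader varies.

```python
def build_variants(subject, preheaders):
    """Hold the subject line constant and vary only the preheader,
    so any difference in open rate can be attributed to the preheader."""
    return [
        {"variant": chr(ord("A") + i), "subject": subject, "preheader": p}
        for i, p in enumerate(preheaders)
    ]

variants = build_variants(
    "Join us for next month's CME event",           # same subject for every send
    ["Earn credits in one evening",                  # preheader A
     "Seats are limited, reserve yours today"],      # preheader B
)
# variants[0] and variants[1] share the subject; only the preheader differs
```

If you instead wanted to test the subject line, you would invert the pattern: hold the preheader constant and vary the subject. Either way, only one element changes per test.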
Next in line is your call to action (CTA). The CTA should be specific to the subject matter. For example, if you’re inviting physicians to a CME event, “Register Now” is clear, concise, and an accurate description of what they can expect when they click. You can also test the placement of CTAs to learn which elicits the most activity. It’s important to control other factors in this test, such as ensuring each CTA leads to the same landing page.
When it comes to content, there are many possible variables, stretching from the subject line and preheader copy to the closing signature. Consider the headline, for example. How long is the headline text? How large is it? Is it personalized? Perhaps you test asking a question versus making a statement. Other content variables include the length and tone of the copy, whether the email contains graphics, and, if so, how many and how large they are.
You might even go as intricate as testing certain word choices against each other, such as “We’re excited to announce” versus “We’re eager to announce” in an email revealing a new addition to the health system.
Time of day and day of week are two variables that can make a significant difference in engagement. Conventional wisdom holds that Tuesdays and Thursdays are the best days to send emails, but our testing shows that isn’t always the case. Your audience and segments are unique, and you may find a sweet spot outside the “normal” sending times where you get more engagement. Testing should also factor in differing time zones. An email sent at 8:00 a.m. EST might annoy a still-sleeping doctor in California, so it’s a good idea to localize your deployments.
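Localizing deployments amounts to scheduling each send at the recipient's local wall-clock time. A minimal sketch using Python's standard `zoneinfo` module (the function name and the example date are illustrative):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def local_send_time(year, month, day, hour, minute, recipient_tz):
    """Schedule a send for the recipient's local wall-clock time.

    Returns the moment in UTC, so a single scheduler queue can
    handle recipients across all time zones.
    """
    local = datetime(year, month, day, hour, minute,
                     tzinfo=ZoneInfo(recipient_tz))
    return local.astimezone(ZoneInfo("UTC"))

# An "8:00 a.m." send lands at 8:00 a.m. in each recipient's own time zone,
# so the California doctor's copy goes out three hours after the New York one
ny = local_send_time(2024, 3, 5, 8, 0, "America/New_York")
la = local_send_time(2024, 3, 5, 8, 0, "America/Los_Angeles")
```

Using IANA zone names (e.g. `America/New_York`) rather than fixed offsets also handles daylight saving transitions automatically.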
Use Results to Optimize Your Email Campaigns
Performing these tests is only the first step—one that’s never actually done. But once you have a solid understanding of the results, you can identify how and why your audience responded most favorably, and focus on crafting messaging that speaks to specific segments.