In July, I wrote the blog post, Email Marketing Research: 7 steps for successful email marketing testing and optimization. In it, I discussed how continuous experimentation is the quickest path to peak performance. It enables marketers to go beyond best practices to learn what works for their organization and, more importantly, their customers.
I’m preaching to the choir, right? Well, I also encouraged readers to take the annual email benchmark survey conducted by MarketingExperiments’ sister company, MarketingSherpa.
Thankfully, this blog’s readers, along with more than 2,700 other email marketers, participated in the study. In appreciation, I would like to share with you the current state of email marketing testing practices.
Email testing on the rise
The share of marketers who routinely test email campaigns rose three percentage points from 2010, to 42%. This is good news as the industry inches closer to making testing a prevailing practice.
Unfortunately, nearly six in 10 email marketing budgets have no money earmarked for testing and optimization. The majority of tests (63%) are conducted by employees for whom the practice is a part-time, secondary job responsibility, though still a formal part of their job description.
The minority includes the 23% of email researchers who report testing as their primary, full-time duty and the 19% of marketers who perform experiments on the side without it being listed in their job descriptions.
Testing practices most routinely performed
This information may help benchmark your programs and processes against the industry. But for this year’s survey, we wanted to delve deeper into which formal processes and guidelines organizations routinely use to test and optimize email campaigns. Here is a look into what we found.
Chart: More time needed for brainstorming and defining the testing objective
How routinely does your organization implement the following testing practices?
The above chart displays common testing practices in chronological order from top to bottom. We asked marketers which of these tasks their organizations routinely execute. The survey uncovered that organizations spend the most time segmenting their lists, understanding the impact of a test on the entire funnel, and documenting their findings.
Segmenting is an effective and common practice
Taking the time to segment a list to target a specific audience is a requirement. Testing universally across the entire list will muddy the results, because respondents will inevitably return data that pulls you in different directions.
MarketingExperiments’ Research Partners have seen tremendous results when focusing on the “highly motivated and loyal subscriber” segment. For example, in a 60-day experiment, the research team discovered increasing from four email messages per month to 15 per month tripled the monthly revenue without any significant negative impact on unsubscribe or open rates.
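As a toy illustration of this kind of segmentation, the sketch below filters a subscriber list down to a "highly motivated and loyal subscriber" segment before a test is sent. The field names and engagement thresholds here are assumptions for illustration only; they are not drawn from the MarketingExperiments research.

```python
# Hypothetical subscriber records; field names are illustrative assumptions.
subscribers = [
    {"email": "a@example.com", "opens_90d": 14, "purchases_12m": 3},
    {"email": "b@example.com", "opens_90d": 1,  "purchases_12m": 0},
    {"email": "c@example.com", "opens_90d": 9,  "purchases_12m": 1},
]

def is_loyal(sub, min_opens=8, min_purchases=1):
    """Flag a subscriber as 'highly motivated and loyal' by simple
    engagement thresholds (assumed values, tune to your own data)."""
    return sub["opens_90d"] >= min_opens and sub["purchases_12m"] >= min_purchases

# Only this segment receives the frequency test.
loyal_segment = [s for s in subscribers if is_loyal(s)]
```

Running the test against a segment like this, rather than the whole list, keeps the results from being diluted by subscribers who were never going to respond.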
Marketers often document effect of tests on the funnel
In addition, 43% of marketers in our study routinely document the impact of a test on the Marketing-Sales funnel. This is also a critical step in ensuring you understand the cause and effect of the experiment. Incomplete information can quickly turn an apparent success into a failure and jeopardize your brand.
More time needed for brainstorming and defining the test objective
Marketers can devote more time to brainstorming optimization opportunities, identifying the key metric, and reviewing tests to decide on follow-up actions. Defining the question or key metric and reviewing the test results are where the essential learning happens.
Remember, the goal of a test is not a lift but a discovery. After all, if you just “luck into a lift” without knowing why you got it, you can’t replicate the tactic across your campaigns.
To create a solid foundation for a test, marketers must identify the research question, key performance indicator and test objective with clarity. To accomplish this step, try adapting George T. Doran’s business mnemonic device for setting goals – S.M.A.R.T. – to email experiments. Keep in mind your email’s business objectives and how they translate into a subscriber’s action (i.e., a conversion).
Specific – Succinctly state the goal of the test so the entire team can understand it. For A/B split tests, most brainstorming session questions start with “what” or “why,” and the finished question begins with the word “which.” Select only one variable or general element to test. Examples include the subject line, headline or call-to-action.
Measurable – The element chosen must be one that can be measured, ideally throughout the entire sales process. This may go beyond your email metrics and be extended to website analytics or financial databases. Defining the primary metric that tracks a reader’s action will allow your team to decipher which treatment performed best.
Achievable – The question must be answerable based on the segment of the audience you will be testing. This will be determined by the firm’s ability to distinguish customer personas, behavior and sources (e.g., newsletter sign-up, current customer, frequent buyer) within its email database.
Relevant – The goal of an email is to earn a click, not a sale. Understanding the placement of the testing objective in the subscriber’s thought sequence will allow you to choose the correct testing element that earns a micro-commitment from the reader, enabling her to move on to the next element of the email. For example, you may choose to test the headline to improve the transition from the “from:” field and subject line to the email body copy.
Timely – When formulating the key question, objective or metric, begin to think about external factors that could impact the testing results. Part of the testing process is to measure the email’s effectiveness at certain time intervals. A holiday or long weekend could influence open rates to the point where a research team may elect to extend the time it collects data.
An adept research question may read, “Which of these three email copy samples will result in the highest shopping basket recovery rate?”
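Once a question like this is defined, deciding which treatment performed best comes down to comparing conversion rates between the samples. Below is a minimal sketch of that comparison using a two-proportion z-test, a common statistical choice for A/B splits (the test method, sample sizes and recovery counts are all illustrative assumptions, not figures from the article):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of treatments A and B.
    Returns (z, two-sided p-value); a small p-value suggests the
    difference in rates is unlikely to be chance alone."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))     # standard normal tail
    return z, p_value

# Hypothetical: two copy samples, 10,000 recipients each,
# counting recovered shopping baskets per sample.
z, p = two_proportion_z_test(420, 10_000, 505, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With numbers like these, a p-value below your chosen threshold (0.05 is conventional) supports declaring a winner; otherwise, as the "Timely" point above suggests, it may be worth extending the data-collection window.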
To learn more about email testing and optimization benchmarks, click here to instantly download MarketingSherpa’s free 2012 Email Marketing Benchmark Report excerpt.
MarketingSherpa 2012 Email Marketing Benchmark Report – Launch Special: Save $100 (Offer ends Nov. 30)