Start mammograms at 50, not 40. With this advice, the United States Preventive Services Task Force set off a firestorm of controversy, with critics questioning everything from its motivation to its wisdom.
These recommendations, and the controversy that surrounds them, are just the tip of the comparative-effectiveness iceberg. If you're not familiar with the term yet, you will be hearing it more and more in the near future: the federal government is investing $1.1 billion in comparative-effectiveness research to find the most effective treatments for common conditions.
Does evidence change behavior?
Hard data about what works best sounds good in theory, but researchers are finding that evidence is only part of the story. Convincing the public to accept new medical guidelines takes more than numbers. As Christie Aschwanden explains in the latest issue of Miller-McCune, “When it comes to new treatment guidelines for breast cancer, back pain and other maladies, it’s the narrative presentation that matters.”
So what do these insights into human nature mean to the evidence-based marketer? While the power of the testing-optimization cycle is discovering what really works for your organization, this knowledge alone does not drive change. Beyond proof, you need a few good communication skills. To that end, here is some quick advice to turn test data into action…
Paint the picture
While detailed data is the lifeblood for any successful evidence-based marketer, make sure you can communicate both the forest and the trees. So before you make any presentation about the results of your testing-optimization cycle, take a few steps back. What is the story behind the numbers? What is your overall story arc?
It will likely be something along the lines of, “We conducted a series of tests to help improve our marketing. From these tests, we learned what works for us and what doesn’t. Now we can apply that knowledge across our enterprise, and by doing so, drive significant ROI.”
Make no mistake, the numbers matter. But make sure that they are only part of the story, not the main focus.
You succeed, we fail
People get defensive when you tell them that they're wrong. So if you're trying to convince a decision maker to change elements of a campaign that he developed, you will have to approach it strategically. The language you use to present these findings can go a long way toward getting him on your side.
For example, when your tests show a gain for an idea, credit him (when applicable). “Your headline delivered a 394% gain.” However, when your tests show that an element underperforms, share the blame. “The squirrels that we put on our website underperformed the optimized treatments by 203%.”
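For context, gain figures like the ones in these examples are typically relative lift: the percentage change of the winning treatment's conversion rate over the control's. A minimal sketch, using hypothetical conversion rates:

```python
def relative_lift(control_rate, treatment_rate):
    """Percentage change of the treatment's conversion rate over the control's."""
    return (treatment_rate - control_rate) / control_rate * 100

# Hypothetical example: a headline that lifts conversion from 1.2% to 5.93%
# works out to roughly a 394% gain.
print(round(relative_lift(0.012, 0.0593)))  # → 394
```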
Accentuate the positive
Negative news tends to make people feel insecure, unsure, and even nervous. You’ve basically just dropped a problem in their lap.
So when possible, don't dwell on the negatives your marketing experiments have uncovered. Directly after presenting them, point to the positive counterpart your research has revealed. "While images of squirrels have been hurting conversion rates, pictures of families have driven double-digit increases." Always end on a high note.
Offer a solution

Don't just present the data; include an action plan that shows how you'll put the findings to work. "We've identified the 27 places we want to swap out squirrels for families. Our design team has selected new imagery. Once I get your budgetary approval, we can have the changes done within 72 hours." Every problem should have a solution.
Focus on the bottom line
Most business-level decision makers do not care about testing. Or unsubscribes. Or even conversion. They care about making money.
Make sure the data you present uses metrics that really matter to your audience. While intermediate metrics are very helpful to you during the testing-optimization cycle, bottom-line, results-oriented metrics will always be better at helping you gain the authority to drive the change you seek.
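One simple way to translate an intermediate metric like conversion-rate lift into a bottom-line number is a back-of-the-envelope revenue projection. A sketch, where the traffic, conversion, and order figures are hypothetical placeholders for your own data:

```python
def projected_revenue_gain(monthly_visitors, baseline_rate, lift_pct, avg_order_value):
    """Translate a conversion-rate lift into a projected monthly revenue gain.

    All inputs are hypothetical; substitute your own traffic and order data.
    """
    baseline_revenue = monthly_visitors * baseline_rate * avg_order_value
    return baseline_revenue * (lift_pct / 100)

# e.g. 100,000 visitors/month, 2% baseline conversion, a 15% lift, $80 average order
gain = projected_revenue_gain(100_000, 0.02, 15, 80)
print(f"${gain:,.0f} additional revenue per month")  # about $24,000
```

A decision maker may not remember the lift percentage, but a monthly dollar figure tends to stick.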
Make sure you're right

Not to belabor the obvious, but if you're seeking to make changes based on the tests you run, make sure you're right. In other words, don't just rely on the numbers your testing platform spits out. Technology doesn't drive testing success. People do.
Approach your tests with a scientific methodology. And understand how and why your tests are statistically valid. Because in the end, the most believable evidence-based marketer is the one who got down into the trenches and helped create the evidence firsthand.
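As a rough illustration of what statistical validity means in practice, here is a sketch of a standard two-proportion z-test, one common way to check whether the conversion-rate difference between two variants could plausibly be chance. The visitor and conversion counts below are hypothetical, and your testing platform may use a different method:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test on conversion counts.

    Returns the z statistic and p-value for the difference between
    variant A (conv_a conversions out of n_a visitors) and variant B.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: 120/5,000 conversions on the control vs 165/5,000 on the treatment
z, p = two_proportion_z(120, 5000, 165, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 5% level if p < 0.05
```

Running the numbers yourself, even on a napkin, is exactly the kind of trench work that makes your evidence believable.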