Start mammograms at 50, not 40. With this advice, the United States Preventive Services Task Force set off a firestorm of controversy, with critics questioning everything from its motivation to its wisdom.
These recommendations, and the controversy that surrounds them, are just the tip of the comparative-effectiveness iceberg. If you're not familiar with the term, you will hear it more and more in the near future: the federal government is investing $1.1 billion in comparative-effectiveness research to find the most effective treatments for common conditions.
Does evidence change behavior?
Hard data about what works best sounds good in theory, but researchers are finding that evidence is only part of the story. Convincing the public to accept new medical guidelines takes more than numbers. As Christie Aschwanden explains in the latest issue of Miller-McCune, “When it comes to new treatment guidelines for breast cancer, back pain and other maladies, it’s the narrative presentation that matters.”
So what do these insights into human nature mean to the evidence-based marketer? While the power of the testing-optimization cycle is discovering what really works for your organization, this knowledge alone does not drive change. Beyond proof, you need a few good communication skills. To that end, here is some quick advice to turn test data into action…
Paint the picture
While detailed data is the lifeblood for any successful evidence-based marketer, make sure you can communicate both the forest and the trees. So before you make any presentation about the results of your testing-optimization cycle, take a few steps back. What is the story behind the numbers? What is your overall story arc?
It will likely be something along the lines of, “We conducted a series of tests to help improve our marketing. From these tests, we learned what works for us and what doesn’t. Now we can apply that knowledge across our enterprise, and by doing so, drive significant ROI.”
Make no mistake, the numbers matter. But make sure that they are only part of the story, not the main focus.
You succeed, we fail
People get defensive when you tell them that they're wrong. So if you're trying to convince a decision maker to change elements of a campaign that he developed, you will have to approach it strategically. The language you use to present these findings can go a long way toward getting him on your side.
For example, when your tests show a gain for an idea, credit him (when applicable). “Your headline delivered a 394% gain.” However, when your tests show that an element underperforms, share the blame. “The squirrels that we put on our website underperformed the optimized treatments by 203%.”
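Whichever way you frame a result, report the percentage the same way every time. A minimal sketch in Python of the standard relative-lift calculation (the 394% figure above is from the post; the conversion rates below are hypothetical numbers chosen to match it):

```python
def relative_lift(control_rate, treatment_rate):
    """Percentage gain of the treatment over the control rate."""
    return (treatment_rate - control_rate) / control_rate * 100.0

# Hypothetical rates: a headline lifting conversion from 2.0% to 9.88%
# works out to roughly a 394% relative gain.
gain = relative_lift(0.020, 0.0988)
print(f"{gain:.0f}% gain")  # → 394% gain
```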
Accentuate the positive
Negative news tends to make people feel insecure, unsure, and even nervous. You’ve basically just dropped a problem in their lap.
So when possible, don’t dwell on the negatives you have uncovered with your marketing experiments. And directly after presenting them, point to the positive corollary that you’ve discovered with your research. “While images of squirrels have been hurting conversion rates, pictures of families have driven double-digit increases.” Always end on a high note.
Don’t just present the data. Include an action plan that shows how to put the findings into action. “We’ve identified the 27 places we want to swap out squirrels with families. Our design team has selected new imagery. Once I get your budgetary approval, we can have the changes done within 72 hours.” Every problem should have a solution.
Focus on the bottom line
Most business-level decision makers do not care about testing. Or unsubscribes. Or even conversion. They care about making money.
Make sure the data you present uses metrics that matter to your audience. While intermediate metrics are very helpful to you during the testing-optimization cycle, bottom-line, results-oriented metrics will always be better at helping you gain the authority you need to drive change.
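One way to make that translation concrete is to express a conversion-rate lift in dollars before you present it. A minimal sketch, using entirely hypothetical traffic, conversion, and order-value numbers:

```python
def projected_monthly_revenue(visitors, conversion_rate, avg_order_value):
    """Translate an intermediate metric (conversion rate) into dollars."""
    return visitors * conversion_rate * avg_order_value

# Hypothetical figures: 100,000 monthly visitors, $50 average order,
# and a test that lifted conversion from 2.0% to 2.4%.
baseline = projected_monthly_revenue(100_000, 0.020, 50.0)
optimized = projected_monthly_revenue(100_000, 0.024, 50.0)
print(f"Projected monthly gain: ${optimized - baseline:,.0f}")
```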
Be right
Not to belabor the obvious, but if you're seeking to make changes based on the tests you run, make sure you're right. In other words, don't just rely on the numbers your testing platform spits out. Technology doesn't drive testing success. People do.
Approach your tests with a scientific methodology. And understand how and why your tests are statistically valid. Because in the end, the most believable evidence-based marketer is the one who got down into the trenches and helped create the evidence firsthand.
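For instance, a common way to check that an A/B result is statistically valid is a two-proportion z-test on the raw conversion counts. A minimal sketch using only the Python standard library (the sample sizes and conversion counts below are hypothetical):

```python
from math import erf, sqrt

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    Returns (z, p_value); a small p_value means the observed difference
    is unlikely to be random noise.
    """
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: 200/10,000 control vs. 260/10,000 treatment.
z, p = two_proportion_z_test(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05, so the lift is significant
```

Knowing exactly why a result clears (or fails) the significance bar is what lets you stand behind it in front of a skeptical decision maker.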
The Business Case for Testing: How one marketer convinced her business leaders to start testing and drove a 201% gain in the process
Focus Groups Vs. Reality: Would you buy a product that doesn’t exist with pretend money you don’t have?
Cost of Delay: How to win approval for your test and test schedule
Daniel, having read this twice I feel compelled to comment because the take-away lesson I get is very different from your title. Humans are sense-making machines — even woefully inadequate explanations will “make sense” to people if they’re presented right. And you nailed it here: it’s all about the narrative presentation.
The reason that troubles me is that, based on everything here, it seems like the “facts” remain mostly irrelevant to the task of convincing people. This reads like the real challenge is the narrative presentation — the actual numbers and facts are secondary. We’re free to pick and choose only the data points that support our case REGARDLESS of what that case is.
In other words…business as usual?
Thanks for your well thought out comment.
Sadly, you bring up a very valid point. There are many who choose to manipulate data (as Erica Beecher-Monas said, “[Like people] if you torture statistics long enough, they’ll tell you anything you want to hear.”) or pick and choose the data that agree with them. The segmentation that the web has wrought, where people can just search out channels that agree with them as opposed to reading traditional media that actually has an editor (the sommelier of information), has greatly exacerbated this problem.
That being said, I believe if you read my post a third time (and thanks for giving it two reads, by the way), you'll see that I am not writing this to the data receiver, but rather the data communicator. (I agree with you. Doubt and skepticism, along with simple good judgment and research abilities, are crucial to any data receiver.) My blog post was aimed at the evidence-based (and ethical) marketer. The marketer (or really anyone) in search of true discovery. It takes more than real data to change action. Part of the reason may well be, as you point out, that whoever you are communicating to has likely been exposed to so many "facts" (in quotation marks) as opposed to real facts.
That’s why one of the points I make is — Be Right. Scrutinize that data (don’t just trust technology) and make sure that you really have gained new, real knowledge. As a corollary perhaps I should also add, “share your methodology.” Understanding that a truly scientific and rigorous methodology underpins your discovery will help convince decision makers that you are not basing your recommendations on pseudo, junk science.
Beyond the problem with “facts,” there is that inherent flaw in human nature — cognitive dissonance. If there was a better way to do things, why haven’t I been doing them all along? The easiest dissonance reduction (on the ego, at least) is to doubt what you’re telling me, as opposed to my own actions.
Of course, doubt has its place, too. It is what protects us from the unscrupulous “facts” purveyors. But let’s not hesitate to turn that doubt on our own actions as well, and make sure we make decisions based on real data as much as possible, as opposed to our own preconceived notions.