Testing and Optimization: What is the most valuable customer insight you’ve gained from a marketing test?
In Wednesday’s free Web clinic – Double the Value of Your Online Testing: Don’t just get a result, get the maximum customer insights – Flint McGlaughlin, the Managing Director (CEO) of MECLABS, will walk you through a recent experiment we ran that generated a 23.91% increase, and conduct a live interpretation of this test to help you extract knowledge from your own tests.
But first, we wanted to hear your thoughts about insights you’ve gained through testing. So we asked your peers. Here are a few of our favorite answers…
Be relevant, but don’t be a stalker
My company is in the business of producing personalized, data-driven communications. And time after time, we’ve seen that making communications relevant to the individual recipients using database information improves response and ROI.
However, we’ve noticed an interesting trend lately: Hyper-personalization sometimes has a negative effect on response rates. The cases we’ve identified have some commonalities:
1) They were acquisition campaigns to new prospects, and,
2) Personalization was not used subtly on the creative pieces, but emphasized heavily (almost as if the client was trying to impress the recipient by how much they knew about them).
Our interpretation is this: Personalization works great with current customers and warm prospects. They understand why you know as much as you do about them, and respond positively to the relevance of your messages. However, with an unfamiliar audience, in-your-face personalization can come across as presumptuous (eliciting a “You don’t know me!” response) or invasive (“Big Brother anyone?”).
In short, the lesson appears to be a pretty obvious one: When you’re meeting someone for the first time, be relevant, but don’t act like a stalker.
No, don’t click there!
Watching users click on what’s not a button or link, and watching them miss what I think is a perfectly good button or link has been eye opening.
For my team, the biggest insight we gain is what the users’ intentions really are, and what questions they really have. By analyzing the whole funnel, from the driver of traffic to the Web experience, we have been able to pinpoint what users’ needs are from a product and Web experience standpoint.
Learned so much through that one simple test
This is dated and traditional but was very helpful. While I was marketing money transfers outbound from the USA for Western Union, we ran a test in which we sent out two very different mailers to two sets of our best customers.
To one segment, we ran a hard offer of gifts for additional purchase.
To the other segment, we sent out a recognition mailer complimenting our customers for being so concerned about their loved ones back home.
A common control group was held back – mailed nothing for the monitoring period – so we could track the performance of each segment against it.
Intuitively, I expected the hard offer to outperform. How wrong I was, and am I glad we tested!
The recognition mailer far outperformed the hard offer. What’s more, many recipients of the recognition mailer wrote back to us with glowing testimonials and useful insights about how we could improve our service. The feedback alone was worth a gold mine!
Besides respecting my customers even more, I learned so much about customer loyalty management through that one simple test.
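The design described above – two treatment segments plus a no-mail holdout control – boils down to comparing each segment’s response rate against the control baseline. Here is a minimal sketch of that arithmetic; all of the counts below are illustrative placeholders, not Western Union’s actual results:

```python
# Compare two direct-mail treatments against a no-mail holdout control.
# All counts are illustrative placeholders, not real campaign data.

def response_rate(responders, mailed):
    """Fraction of a segment that responded."""
    return responders / mailed

def lift_vs_control(segment_rate, control_rate):
    """Relative improvement of a treatment over the control baseline."""
    return (segment_rate - control_rate) / control_rate

control_rate = response_rate(50, 5000)       # holdout: 1.0% baseline activity
hard_offer_rate = response_rate(110, 5000)   # gifts-for-purchase mailer: 2.2%
recognition_rate = response_rate(180, 5000)  # recognition mailer: 3.6%

for name, rate in [("hard offer", hard_offer_rate),
                   ("recognition", recognition_rate)]:
    print(f"{name}: {rate:.1%} response, "
          f"{lift_vs_control(rate, control_rate):+.0%} vs control")
```

The holdout is what makes the comparison meaningful: without it, you can’t separate what the mailer caused from what those customers would have done anyway.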
Ask for more
A really simple thing we learned almost two decades ago still holds true. When developing ask letters for contribution campaigns, we tested ordering the check-off amounts from the smallest on the left to the largest on the right, and vice versa.
Test after test after test, we have found that when the largest amount to check off is on the left, descending to the smallest on the right, contribution amounts go up at least one category per mailing.
As well, when you segment the mailings and vary the ask amounts based upon that segmentation, you can increase your overall donation amounts significantly.
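One common way to implement the segmented, largest-first version of this idea is to anchor each donor’s ask string to their highest previous gift. This is a sketch under our own assumptions (the multipliers and $5 rounding are illustrative), not the contributor’s actual system:

```python
# Build a descending ask string anchored to a donor's highest previous gift.
# The multipliers and the $5 rounding are illustrative assumptions.

def ask_string(highest_previous_gift, multipliers=(2.0, 1.5, 1.0, 0.5)):
    """Return suggested amounts ordered largest to smallest,
    rounded to the nearest $5, with a $5 floor."""
    amounts = []
    for m in multipliers:
        amount = round(highest_previous_gift * m / 5) * 5
        amounts.append(max(amount, 5))  # never ask below $5
    return amounts

print(ask_string(100))  # → [200, 150, 100, 50]
```

Printing the options largest-first reproduces the winning layout from the test above, while the per-donor anchor handles the segmentation.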
Big red no
Changes that work well with one client’s website sometimes make no difference when you try them on a different client’s site.
An easy example is a big red submit button – worked for site A and made no difference on site B.
The next great idea is out there waiting for you
The most valuable insight I’ve learned – it’s very easy to be wrong.
What I mean is that it is very easy for any of us who have been in marketing for any period of time to assume we know what is best. We’ve studied our demographics, purchase behavior and customer feedback. We “know” which offer/creative/channel will perform best with our audience.
Why test? We know what has been working. Of course it will keep working.
Then you test something new and, amazingly, you find out you didn’t know what you didn’t know. This may be the single most important insight.
Always test, test, test. The next great idea is out there waiting for you…
– Shade Wilson, Digital Marketing and Direct Response Executive