If you’ve read MarketingExperiments for any length of time, you know that most of our marketing experiments occur online because we view the web as a living laboratory.
However, if your goal is to learn more about your customers so you can practice customer-first marketing and improve business results, don’t overlook other channels of customer experimentation.
To wit, this article is about a MECLABS Institute Research Partner who engaged in call center testing.
Overall Research Partnership Objective
Since the Research Partner was a nonprofit, the objective of the overall partnership focused on donations — specifically, increasing the total amount of donations (number and size) given by both current and prospective members.
While MECLABS engaged with the nonprofit in digital experimentation as well (for example, on the donation form), the telephone was a key channel for this nonprofit to garner donations.
Call Script Test: Initial Analysis
After analyzing the nonprofit’s call scripts, the MECLABS research analysts identified several opportunities for optimization. For the first test, they focused on two weaknesses: the script’s failure to establish rapport with the caller, and its mention of only a $20-per-month option, which mentally created a ceiling for the donation amount.
Based on that analysis, the team formulated a test. The team wanted to see if they could increase overall conversion rate by establishing rapport early in the call. The previous script jumped in with the assumption of a donation before connecting with the caller.
Control Versus Treatment
In digital A/B testing, traffic is split between a control and treatment. For example, 50% of traffic to a landing page is randomly selected to go to the control. And the other 50% is randomly selected to go to the treatment that includes the optimized element or elements: optimized headline, design, etc. Marketers then compare performance to see if the tested variable (e.g., the headline) had an impact on performance.
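The random 50/50 assignment described above can be sketched in a few lines of Python. The function name and salt below are illustrative, not from the article; hash-based bucketing is one common way to make the assignment deterministic, so a returning visitor always lands in the same variant.

```python
import hashlib

def assign_variant(visitor_id: str, salt: str = "exp-001") -> str:
    """Deterministic 50/50 split: hash the visitor ID (with an
    experiment-specific salt) and bucket on the result, so the same
    visitor sees the same variant across sessions."""
    digest = hashlib.sha256(f"{salt}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # uniform bucket in 0..99
    return "treatment" if bucket < 50 else "control"
```

Because the assignment is a pure function of the visitor ID, no per-visitor state needs to be stored to keep the experience consistent.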
In this case, the Research Partner had two call centers. To run this test, we provided optimized call scripts to one call center and left the other call center as the control.
We made three key changes in the treatment with the following goals in mind:
- Establish greater rapport at the beginning of the call: The control goes right into asking for a donation – “How may I assist you in giving today?” However, the treatment asked for the caller’s name and expressed gratitude for the previous giving.
- Leverage choice framing by recommending $20/month, $40/month, or more: The control only mentioned the $20/month option. The addition of options allows potential donors to make a choice and not have only one option thrust upon them.
- Include an additional one-time cause-related donation for both monthly givers and other appropriate calls: The control did not ask for a one-time additional donation. The ongoing donation supported the nonprofit’s overall mission; however, the one-time donation provided another opportunity for donors to give by tying specifically into a real-time pressing matter that the nonprofit’s leaders were focused on. If they declined to give more per month for financial reasons, they were not asked about the one-time donation.
To calibrate the treatment before the experimentation began, a MECLABS researcher flew to the call center site to train the callers and pretest the treatment script.
While the overall hypothesis stayed the same, after four hours of pretesting, the callers reconvened to make minor adjustments to the wording. It was important to preserve the key components of the hypothesis; within that constraint, however, the callers could make small changes so the script stayed in their own voice.
The treatment was used on a large enough sample size — in this case, 19,655 calls — to detect a statistically significant difference between the control and the treatment.
The treatment script increased the donation rate from 14.32% to 18.47% at a 99% Level of Confidence for a 29% relative increase in the donation rate.
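A result like this can be checked with a standard two-proportion z-test. The sketch below uses the article’s reported rates (14.32% vs. 18.47%) and total call volume (19,655), but assumes an even split between the two call centers for illustration — the actual per-center counts are not given in the article.

```python
from math import sqrt, erfc

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value under the normal approximation
    return z, p_value

# Illustrative counts: 19,655 total calls, assumed split evenly.
n_a = n_b = 19_655 // 2
conv_a = round(0.1432 * n_a)  # control donations
conv_b = round(0.1847 * n_b)  # treatment donations

z, p = two_proportion_ztest(conv_a, n_a, conv_b, n_b)
lift = (conv_b / n_b) / (conv_a / n_a) - 1  # relative increase, ~29%
```

Under these assumptions the z-score lands far beyond the 99%-confidence threshold (|z| > 2.576), consistent with the reported Level of Confidence.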
The benefits of experimentation go beyond the incremental increase in revenue from this specific test. By running the experiment in a rigorously scientific fashion — accounting for validity threats and formulating a hypothesis — marketers can build a robust customer theory that helps them create more effective customer-first marketing.
In this case, the “customers” were donors. After analyzing the data in this experiment, the team discovered three customer insights:
- Building rapport on the front end of the script generated a greater openness with donors and made them more likely to consider donating.
- Asking for a one-time additional donation was aligned with the degree of motivation for many of the callers. The script realized a 90% increase in one-time gifts.
- There was an overlooked customer motivation: to make one-time donations, not only the ongoing donations the organization sought. Part of the reason may be that the ideal donors were in an older demographic, which made it difficult for them to commit at a long-term macro level and much easier to commit at a one-time micro level. (It also gave the nonprofit an opportunity to tap into not only the overall motivation of contributing to the organization’s mission but contributing to a specific, timely issue as well.)
The experimentation allowed the calling team to look at their role in a new way. Many had been handling these donors’ calls for several years, even decades, and there was an initial resistance to the script. But once they saw the results, they were more eager to do future testing.
Can You Improve Call Center Performance?
Any call center script is merely a series of assumptions. Whether your organization is nonprofit or for-profit, B2C or B2B, you must ask a fundamental question — what assumptions are our call scripts making about the person on the other end of the line?
And the next step is — how can we learn more about that person to draft call center scripts with a customer-first marketing approach that will ultimately improve conversion?
You can follow Daniel Burstein, Senior Director, Content & Marketing, MarketingExperiments, on Twitter @DanielBurstein.