We examine how technology itself can present barriers to conversion.
The architecture we use to manage our business may not be the same structure we should use to market it. Our business is to serve our customers, make it easy for them to get what they want, and create an image of class and competence that elicits goodwill.
EDITOR’S NOTE: This is a Research Brief, presenting the essence of new MEC research findings. These findings were also presented in a live Web clinic, which included commentary from the researchers. You can download an audio recording of the clinic at:
Many online services offer a free trial as their primary incentive to attract new subscribers.
Generally, a free trial offer works well.
The research for this brief began with the question:
How does customer anxiety impact conversion?
As marketers, we pretty much do whatever we can to get consumers to respond to our offers, pull out their credit cards, and buy (or at least try) what we are selling.
Even a small increase in the number of visitors that complete the registration/sign-up/buy process can mean millions of dollars in additional revenue.
We try everything and anything, but as this recent 60-day experiment with a leading subscription website showed, sometimes we seem to get in our own way.
How the wrong technology architecture reduced revenues by 22%.
It sounds strange at first, but often, technology gets in the way of online business. As a company we can become so fixated on the platforms, the metrics, and the hum and whir of our servers that we lose sight of what is really important. Sometimes, the architecture we need to manage our business is really not the same architecture we need to effectively market our business.
Running a scalable, secure, integrated web infrastructure is very challenging. And, a good CTO does not want to have to change order processes every week. However, because marketing is testing, marketing departments often ask the technology department for greater flexibility and agility. What we may need to do is create some type of redundant or mirror testing system where we can perfect our websites and micro-test.
We forget that the number one objective of a website is to allow a customer to more easily buy from us.
During the surge of Web 1.0, many of us spent thousands, even millions of dollars on sophisticated platforms and redundant, scalable databases. We thought that these complex systems were the key to growing our online businesses. Somewhere along the way, however, we forgot some basic principles of marketing.
If we look at MEC’s conversion index,
C = 4M + 3V + 2(I-F) – 2A
it shows that the probability of conversion (C) is a function of the buyer’s motivation (M), the strength of our value proposition (V), and the balance of incentive (I) and friction (F) elements in our registration process, reduced by any anxiety (A) the customer experiences along the way.
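The index can be sketched as a simple scoring function. The weights come from the formula above; the 0-to-5 scoring scale and the sample scores below are our own illustrative assumptions, not part of MEC’s methodology.

```python
def conversion_index(motivation, value_prop, incentive, friction, anxiety):
    """Heuristic conversion index: C = 4M + 3V + 2(I - F) - 2A.

    Inputs are subjective scores (e.g., on a 0-5 scale); the output is a
    relative score for comparing page variants, not a probability.
    """
    return 4 * motivation + 3 * value_prop + 2 * (incentive - friction) - 2 * anxiety

# Illustrative scores for two variants of the same page (values are assumptions):
baseline = conversion_index(motivation=4, value_prop=3, incentive=2, friction=3, anxiety=3)
improved = conversion_index(motivation=4, value_prop=3, incentive=2, friction=1, anxiety=1)
```

Note that lowering friction and anxiety raises the score even when motivation and value stay fixed — which is exactly the lever the experiments below operate on.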
But, this representation almost assumes that our customers interact with our websites in a sort of vacuum, free of bugs, database exceptions, lost emails, server timeouts, and changed email addresses.
These small anomalies, however, may be more than just little details.
In one example, these anomalies cost $101,250 in 60 days.
During a 60-day test, we manually processed each transaction for one of the top paid subscription websites in the world.
We created a mirror of the live site. Our version used only simple form post functionality. There was no live interaction with the main database, and for each order of the target product, we simply captured the customer’s information in a “dummy” form.
Minutes after the order was placed by the customer, our customer service reps would manually enter the customer’s order into the database, using the live production website. They simply signed up on the customer’s behalf.
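The dummy-form mechanism can be sketched with Python’s standard library. The field names, file name, and port are assumptions for illustration — the point is only that the form posts to a flat holding file instead of the live database, and a rep keys each row in by hand.

```python
import csv
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

# Field names are assumptions; the real form captured the full order record.
ORDER_FIELDS = ("name", "email", "card_number")

def parse_order(raw_body):
    """Parse a URL-encoded form POST body into an ordered list of field values."""
    fields = parse_qs(raw_body)
    return [fields.get(key, [""])[0] for key in ORDER_FIELDS]

class DummyOrderForm(BaseHTTPRequestHandler):
    """Accepts the order POST and files it for manual entry -- no live database."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        row = parse_order(self.rfile.read(length).decode())
        with open("captured_orders.csv", "a", newline="") as f:
            csv.writer(f).writerow(row)  # a rep later re-keys this on the live site
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Thank you! Your order has been received.")

# To run the mirror endpoint (hypothetical port):
# HTTPServer(("", 8080), DummyOrderForm).serve_forever()
```

Because nothing touches the production database at submit time, no business rule can reject the customer — which is what made the failure modes visible.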
We had suspected that a portion of users were unable to complete their registration due to a credit card issue, an existing account already on file, or some other server or technology error. We had, however, underestimated the extent of these issues. This is what we found:
60-Day Test Results

| | Totals | Revenue |
|---|---|---|
| Total Successful Registrations | 1,001 | $450,450 |
| Caught in a Registration Loop | 225 | $101,250 |
What you need to understand: More than 22% of all customers who wanted to order, could not, due to a technical issue.
There was nothing wrong with the code. The site was built on world-class technology, thoroughly tested and 100% functional. The technical issue, in this case, was how the system was originally designed. When we looked into it, we found that it was not that the system was poorly designed—just that it was designed like many of our infrastructure systems; namely, for management of our business.
So what was happening? What prevented an average of 3.75 customers per day from successfully completing the registration process?
This 60-day test was with a major publisher selling information via subscriptions. Their registration process was straightforward and included good follow-up. The offer page had been optimized, and flowed into a 3-step order process. However, the page also included a free membership offer and thus led to another level of content. At this time, most new paid subscriptions originated through the free trial offer.
The free trial offer required email address and credit card information up front, and participants received additional product information during the trial period. The delayed revenue represented by the trial offer was about $450K, about half of which originated through our dummy form. However, about 22% had user name, password or other issues that prevented them from successfully signing up. These problems stemmed from business rules designed to make management of customer records easier.
Because we were manually processing orders through the paid subscription interface, we were able to see exactly what customers had been experiencing when they did it themselves.
We found 3 types of errors:
- Credit Card. The potential customer had an invalid credit card or some sort of bank hold on their card and was not clearly presented with a way to solve the problem.
- Password/Username. The potential customer had previously registered for a free newsletter but could not remember their password or username.
- Expired Account. The potential customer had previously taken a free trial and was not “eligible.”
Some of the customers were eventually able to sign up, but only after live interaction with experienced customer service reps. The majority were lost forever and never returned—even to try to correct the problem.
To address these problems, we removed all card validation edit checks, except ensuring the correct number of digits. We discovered that about two to three percent of orders had invalid card numbers. Whether they were typographical errors or fraud attempts is uncertain.
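The relaxed check can be as simple as a digit-count test. The Luhn checksum shown alongside is a common, low-friction middle ground that catches most typos without rejecting real cards — including it here is our suggestion, not something the test site did.

```python
def has_valid_length(card_number, lengths=(13, 14, 15, 16, 19)):
    """The only check kept in the test: digits only, with a plausible length."""
    digits = card_number.replace(" ", "").replace("-", "")
    return digits.isdigit() and len(digits) in lengths

def passes_luhn(card_number):
    """Luhn checksum: double every second digit from the right, sum, check mod 10."""
    digits = [int(d) for d in card_number.replace(" ", "").replace("-", "")][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

Either way, a failed check should route the order to follow-up rather than silently blocking the sale.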
Whenever we encountered a problem, a customer service representative would call the customer to complete the registration.
With an average lifetime customer value of $450 to $500, the recovered revenue more than covered the customer service wages plus a $10 incentive for each saved order. Perhaps the most valuable outcome, though, is the goodwill gained through personal attention.
What you can do to solve technology barrier problems for your own site:
- Place an order yourself. You might be surprised what you will find.
- Call your customer service line and pretend you are having an issue.
- Use the “forgot password” feature to recover an account. What happens? Better yet, have someone unfamiliar with your site do it.
- Place an 800 number on each page and carefully record the types of issues customers are having. Are there any patterns?
- Study where you lose most customers during the registration process.
- Examine closely what types of credit card failures you get (this is available through your payment gateway).
- Take away all data checks (server-side and JavaScript validations) and just let people order with whatever data they provide. What happens to your conversion rate? How many bogus orders do you actually get?
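For the “where do you lose most customers” question, a simple funnel report over per-step visitor counts is usually enough. The step names and counts below are made up for illustration.

```python
def funnel_report(steps):
    """Given ordered (step_name, visitor_count) pairs, return per-step drop-off.

    Each entry is (step_name, visitors, percent lost before the next step).
    """
    report = []
    for (name, count), (_, next_count) in zip(steps, steps[1:]):
        drop_pct = round((count - next_count) / count * 100, 1)
        report.append((name, count, drop_pct))
    last_name, last_count = steps[-1]
    report.append((last_name, last_count, 0.0))
    return report

# Hypothetical registration funnel:
steps = [
    ("offer_page", 10000),
    ("account_form", 4200),
    ("payment_form", 2600),
    ("confirmation", 1900),
]
```

A disproportionate drop at one step (here, more than half of visitors bailing on the offer page) is the signal to dig into that page’s errors and business rules first.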
How inaccurate reporting nearly cost us thousands in lost advertising.
Registration errors are not the only source of confusion and lost revenue. Many times we simply have the wrong information.
Our web metrics are perhaps the most important numbers we look at each day, as they tell us how our campaigns are working. Rarely, however, are our numbers actually accurate.
Often, we feel lucky to even have metrics and reporting. Our developers sometimes seem to keep these juicy nuggets hidden behind complex SQL statements.
In a recent test with one of the world’s leading online newspapers, we placed 3 hidden types of tracking pixels on our registration flow to measure a Paid Search campaign.
15-Day Period – Offer “A” Visitor Tracking

| Tracking Mechanism | Total Unique Visitors | Conversion Rate |
|---|---|---|
| Google Analytics | 948 | 17.3% |
| In-house Tracking | 8,540 | 1.9% |
| Server Logs | 1,016 | 16.1% |
| Google Adwords | 3,522 | 4.7% |
What you need to understand: 4 different tracking and reporting mechanisms reported conversion as low as 1.9% and as high as 17.3%.
All we wanted to know was how many people came to our test site and how many purchased. We wanted to see how well paid search would convert on this particular product.
At first glance, using our in-house cookie-based tracking, we found that out of 8,540 visitors, 164 purchased—a conversion rate of 1.9%.
When we looked deeper, however, we found that some of the alternate tracking systems were reporting much different numbers. Google Analytics showed only 948 unique visitors to the same page during the same time period.
At first, we thought this was probably wrong, as it was giving us a 17.3% conversion rate.
Google Adwords, the source of our traffic, was showing more than 3,500 clicks to our site, so we assumed 948 must have been a large understatement.
What we found surprised us.
When we looked at the server log files (a tedious process), we started to see a large number of hits from servers in India and another large group from a company called ScanAlert, which we had recently engaged to scan our site for security breaches.
Because each visitor created a unique session, our in-house tracking system was counting every one as a unique visitor.
Google Adwords was filtering out the Indian servers, as suspected click fraud.
Our server logs were picking up all the visits but counting only the true unique visitors, with the exception of the click-fraud traffic, which the fraudsters seemed able to cloak well enough to trick our servers.
Determined and weary, we broke the server log files down and took out any hits from a ScanAlert.com domain and the portion of hits from India that were invalid.
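The cleanup we did by hand can be sketched as a log filter. The hostname match and the excluded IP ranges below are hypothetical stand-ins (drawn from the documentation-reserved blocks) for the ScanAlert scanner and the fraudulent sources we identified.

```python
import ipaddress

# Hypothetical stand-ins for the security scanner and click-fraud source ranges:
EXCLUDED_NETWORKS = [ipaddress.ip_network(n) for n in ("203.0.113.0/24", "198.51.100.0/24")]

def is_bogus(ip, hostname=""):
    """True if the hit came from the site scanner or a known click-fraud range."""
    if "scanalert.com" in hostname.lower():
        return True
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in EXCLUDED_NETWORKS)

def count_unique_visitors(hits):
    """hits: iterable of (ip, hostname) pairs; count distinct real visitors."""
    return len({ip for ip, hostname in hits if not is_bogus(ip, hostname)})
```

Deduplicating by IP here mirrors what the server-log recount did; an in-house system that keys on sessions instead will inflate the count, which is exactly the discrepancy we hit.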
When everything had been re-counted we found that 2 of the sources were reporting a similar number:
| Tracking Mechanism | Total Unique Visitors | Conversion Rate |
|---|---|---|
| Google Analytics | 948 | 17.3% |
| Server Logs | 976 (adjusted) | 16.8% |
What you need to understand: After taking out bogus data, we found 2 of the data sources to be nearly identical.
Had we simply depended upon the original data, we would have based our PPC budget estimates on a 1.9% conversion rate rather than a 17.3% conversion, which is 800% higher.
If we had not carefully studied the data, we would not have known that we could have spent more than $4 per click, rather than our original maximum cost per click of 51¢.
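The budget impact follows directly, because the break-even cost per click scales linearly with conversion rate. Backing the implied revenue per conversion out of the original 51¢ cap and 1.9% rate is our own reconstruction, not a figure from the test.

```python
def break_even_cpc(conversion_rate, revenue_per_conversion):
    """Highest cost per click at which ad spend equals the revenue it produces."""
    return conversion_rate * revenue_per_conversion

# Back out the implied revenue per conversion from the original bid cap (assumption):
implied_revenue = 0.51 / 0.019            # roughly $26.84 per conversion
corrected_cap = break_even_cpc(0.173, implied_revenue)
```

At the true 17.3% rate, the same economics support a bid cap north of $4 — nine times the budget headroom the bad data allowed.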
What you can do to solve this issue at your site:
- Install Google Analytics. It’s free and easy: http://analytics.google.com
- Look at your log files and compare them to your core site tracking. Are the numbers close?
- Make sure you understand the difference between unique visits, page views, sessions, and unique visitors. Be consistent when comparing data.
Identifying these types of technology issues can be a difficult process. Often, we are bound as marketers by what our website “needs to do” to successfully interact with the back-end of our businesses.
Frequently the CTO has more control over the website design than we do. A website should serve our business objectives, not our databases. Databases are worthless without the customers that fill them.
Remember, the number one objective of a website is to serve our customers and make it easier for them to purchase from us.
If we do not know that our websites are frustrating potential customers, or if we are unsure of the integrity of the performance metrics we are looking at, we cannot successfully do our jobs as marketers. We run the risk of both an immediate loss of revenue and an enduring erosion of customer goodwill.
As part of our research, we have prepared a review of the best Internet resources on this topic.
Rating System
These sites were rated for usefulness and clarity, but alas, the rating is purely subjective.
* = Decent | ** = Good | *** = Excellent | **** = Indispensable
- Dirty Data Equal Bad Bids ***
- Five Ways Technology Can Boost Profitability ***
- Flash, AJAX, Usability, and SEO ***
- Ajax: The New Web Interface Design Development Approach Everyone Talks About **
- Web Development – An Overview **
Credits:
Editor — Frank Green
Writer — Jalali Hartman
Contributors — Flint McGlaughlin
Adam Lapp
Bob Kemper
Michael Clowe
HTML Designer — Cliff Rainer