Why are there so many different metrics for the same component of a website? For example, page views, visits, unique visitors and instances, just to name a few. Which metrics should you use? Are all of these metrics really helpful?
Well, they exist because these metrics are tracked differently and yield different interpretations.
However, you can learn much more by analyzing these metrics in relation to one another, combined with segments, which group visitors by shared characteristics.
For example, suppose we compare page views against the number of visits between two browsers, such as Chrome versus Firefox, and discover that Chrome users average more page views per visit than Firefox users.
That could indicate that the customers using Firefox belong to a different demographic with a different level of motivation than Chrome users. Or, it could mean that the site functionality or user experience on Chrome differs from Firefox. For example, I know that on my own computer, Chrome displays website content in a smaller font than Firefox does.
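To make that comparison concrete, here is a minimal sketch of the underlying arithmetic, assuming a simplified hit log with one record per page view; the `Hit` shape and field names are hypothetical stand-ins for whatever your analytics export actually provides:

```typescript
// Minimal sketch: page views per visit, segmented by browser.
// The `Hit` shape is an assumption for illustration; real data
// would come from your analytics tool's export.
interface Hit {
  visitId: string;
  browser: "Chrome" | "Firefox";
}

function pageViewsPerVisit(hits: Hit[]): Map<string, number> {
  const pageViews = new Map<string, number>();      // browser -> total page views
  const visits = new Map<string, Set<string>>();    // browser -> unique visit ids

  for (const hit of hits) {
    pageViews.set(hit.browser, (pageViews.get(hit.browser) ?? 0) + 1);
    if (!visits.has(hit.browser)) visits.set(hit.browser, new Set());
    visits.get(hit.browser)!.add(hit.visitId);
  }

  const result = new Map<string, number>();
  for (const [browser, views] of pageViews) {
    result.set(browser, views / visits.get(browser)!.size);
  }
  return result;
}

// e.g. pageViewsPerVisit(hits) might yield Map { "Chrome" => 4.2, "Firefox" => 2.8 }
```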
In today’s blog post, learn how just these two metrics combined can put you in a detective mindset.
Don’t just investigate what customers do, investigate where they go
There are plenty of great free and paid analytics tools out there to help you investigate customer behavior, but for the sake of this example, I’m going to talk specifically about Adobe’s analytics tool.
An interesting advanced feature in the platform allows you to set custom tracking on link clicks on a page; when combined with other metrics, this data can reveal great findings.
For example, let’s say you have a call-to-action that takes a visitor to the cart. By using custom tracking, you can track how many visitors interact with the CTA on the page and compare that to the next page flow.
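As a rough sketch of how that wiring might look, Adobe’s AppMeasurement library exposes an `s.tl()` call for custom link tracking; the `#cart-cta` selector and the "Cart CTA" link name below are assumptions for this example:

```typescript
// Sketch: firing a custom link-tracking call when the CTA is clicked.
// The global `s` object is provided by Adobe's AppMeasurement library;
// "#cart-cta" and the "Cart CTA" label are hypothetical for this example.
declare const s: {
  tl: (linkObject: HTMLElement | boolean, linkType: string, linkName: string) => void;
};

document.querySelector<HTMLAnchorElement>("#cart-cta")?.addEventListener("click", (event) => {
  const link = event.currentTarget as HTMLAnchorElement;
  // Link type "o" marks a custom link (vs. "d" for downloads, "e" for exit links).
  s.tl(link, "o", "Cart CTA");
});
```

With the click recorded as its own metric, you can line it up against cart page views in the next page flow report.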
If you see a significantly higher number of visitors clicking the CTA, but not as many cart page views on the next page report, it could mean that there is some technical issue preventing the visitors from going to the next page.
There could also be a lot of “accidental” or “unintentional” clicks from visitors clicking the back button before the next page even loads, which can be very common on mobile sites.
If there are significantly fewer visitors clicking the CTA but more cart page views in the next page flow, what would that indicate?
Perhaps people are using the forward button frequently because they returned to the page after seeing the cart.
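Either way, the diagnosis starts with comparing the two counts directly. Here is a minimal sketch, assuming you have already pulled CTA clicks and cart page views from your reports; the 10% tolerance is an arbitrary assumption you would tune to your traffic:

```typescript
// Sketch: flag a suspicious gap between CTA clicks and next-page views.
// The 10% tolerance is an assumption; adjust it for your traffic volume.
function diagnoseFlow(ctaClicks: number, cartPageViews: number): string {
  const ratio = cartPageViews / ctaClicks;
  if (ratio < 0.9) {
    return "Clicks exceed cart views: check for load failures or accidental clicks.";
  }
  if (ratio > 1.1) {
    return "Cart views exceed clicks: visitors may be returning via back/forward navigation.";
  }
  return "Clicks and cart views roughly match.";
}

console.log(diagnoseFlow(1200, 800)); // "Clicks exceed cart views: ..."
```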
Testing and optimization can help you eliminate the usual suspects
To build on this example, some additional metrics in Adobe Analytics, including “percent of page viewed,” give an idea of how much of a page visitors scroll down and see on average.
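If you want a rough sense of how such a metric can be captured, here is a simplified browser-side approximation; this is an illustrative sketch, not Adobe’s actual implementation, and the `/collect` endpoint is hypothetical:

```typescript
// Rough sketch of measuring "percent of page viewed" in the browser.
// A simplified approximation for illustration, not Adobe Analytics' method.
let maxPercentViewed = 0;

window.addEventListener("scroll", () => {
  // Bottom edge of the viewport relative to the full page height.
  const viewedBottom = window.scrollY + window.innerHeight;
  const percent = Math.min(100, (viewedBottom / document.documentElement.scrollHeight) * 100);
  maxPercentViewed = Math.max(maxPercentViewed, percent);
});

// Send the deepest scroll point when the visitor leaves the page.
// "/collect" is a hypothetical endpoint for this sketch.
window.addEventListener("pagehide", () => {
  navigator.sendBeacon("/collect", JSON.stringify({ maxPercentViewed }));
});
```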
So, let’s say you have two CTAs, one at the top of the page and one at the bottom of the page.
Your data shows that most people are scrolling down the page but clicking the top CTA more often than the bottom one. This could have multiple implications you should investigate:
- Maybe people like to go back up and read more, so look to the average time spent on page as a key performance indicator in further testing.
- Another possibility: if the two buttons differ, test your button copy, as button testing is generally low-hanging fruit for a quick win.
- If the buttons are already identical, consider testing the button placement to discover whether location is having an impact on prominence.
Overall, these are only a few examples of combining different metrics to help you explore customer behavior on your website at a deeper level.
Ultimately, testing and optimization are the easiest way to eliminate all suspects in your particular case and truly solve the crime by finding out which elements of your customer experience are stealing conversions.