Analytics and Testing: An approach to the delicate balance of confidence and uncertainty


Numbers breed confidence. Sometimes, false confidence. So it’s vital to understand not only what your test results are telling you, but also the limitations of those results. This understanding should shape how you interpret and present results to your clients or senior management.


Consider the source

Your analysis is only as good as the data upon which it is based. Understanding the limitations of your data, and the way it was pulled from your databases, will help you design the most appropriate analysis.

In the case of MECLABS’ website testing, this includes understanding how the various testing and metrics platforms define and record key metrics such as “visitors,” “visits” and “conversion.”
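For example, whether a platform counts conversions per visit or per unique visitor changes the headline number. The following minimal Python sketch, using hypothetical data, shows the same raw log yielding two different conversion rates depending on that definition:

# Illustrative sketch with hypothetical data: the same raw log yields two
# different "conversion rates" depending on whether conversions are counted
# per visit or per unique visitor.
visits = [
    {"visitor_id": "A", "converted": False},
    {"visitor_id": "A", "converted": True},   # same visitor returning
    {"visitor_id": "B", "converted": False},
    {"visitor_id": "C", "converted": True},
]

conversion_per_visit = sum(v["converted"] for v in visits) / len(visits)

unique_visitors = {v["visitor_id"] for v in visits}
converted_visitors = {v["visitor_id"] for v in visits if v["converted"]}
conversion_per_visitor = len(converted_visitors) / len(unique_visitors)

print(f"Conversion per visit:   {conversion_per_visit:.0%}")    # 50%
print(f"Conversion per visitor: {conversion_per_visitor:.0%}")  # 67%

Neither number is wrong; they simply answer different questions, which is why you need to know which definition each platform records before comparing results across tools.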


It’s not just what you’re saying…

One of the challenges of interpreting results as a consultant is the constant struggle between projecting confidence in the reliability of those results and a scientist’s obligation to fairly portray their limitations.

To help with this, when reporting results, I generally try to avoid any language that suggests total certainty.

Here are a few examples that come to mind:

“Always,” “Never,” and “Must”

I also try to avoid undermining confidence in my results by using words like these:

“Unknown,” “Speculate” and the ever-present “But.”

The reason for this is that while it is necessary to express a healthy level of doubt, there is a delicate balance in doing so without undermining a client’s or leader’s trust in your results. After all, those results will likely serve as the basis for vital business decisions.

If there are key limitations to a particular conclusion, it is certainly important to mention them alongside that portion of the presentation. However, detailed information on overall limitations, including data validity issues and analytical methodology, should be included as footnotes or an appendix, separating it from the main presentation of results.

How you phrase your analysis also comes into play during presentations, when business leaders or clients will often ask difficult questions. You may have immediate answers for some of their inquiries, but not for others. Ultimately, it is how you engage clients that makes the difference in how they perceive the results.

For example, if there is no way to directly answer a question with the data at hand, is there perhaps a proxy for that data or a directional indication given by other related data?

A response of “we have no way to know” is unlikely to satisfy doubts, yet a response of “we cannot measure that directly, but if I performed an analysis of X, Y or Z, we could perhaps learn related facts” is likely more reassuring.

In short, don’t forget it’s not just what you’re saying in your analysis, but how you say it that matters.

Know when to walk away

If the limitations of an analysis are so great that you, as the data analyst, do not trust the results, it is your obligation to discontinue the analysis.

Presenting an unreliable or incomplete analysis misleads your audience. The trust they place in you as their data analyst is valuable and not to be taken lightly.

Certainly, if an analysis is placed in the context of its limitations so the audience understands how broadly or narrowly the results can be generalized, there may be value in presenting an analysis based on incomplete data. But if the data is unreliable or internally inconsistent to the point where data cleaning cannot repair your trust in the dataset, there is no reason to continue the analysis.
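As a rough illustration of what “internally inconsistent” can look like in practice, a few basic sanity checks, sketched below in Python with hypothetical column names, can tell you whether a dataset even passes plausibility before you invest in deeper analysis:

# Minimal sketch of internal-consistency checks on a daily metrics table.
# The fields used here (visitors, visits, conversions) are hypothetical;
# adapt them to whatever your platform actually records.
rows = [
    {"day": 1, "visitors": 120, "visits": 150, "conversions": 9},
    {"day": 2, "visitors": 200, "visits": 180, "conversions": 12},
    {"day": 3, "visitors": 90, "visits": 110, "conversions": 130},
]

def inconsistencies(row):
    problems = []
    if row["visits"] < row["visitors"]:
        problems.append("fewer visits than unique visitors")
    if row["conversions"] > row["visits"]:
        problems.append("more conversions than visits")
    return problems

for row in rows:
    for problem in inconsistencies(row):
        print(f"Day {row['day']}: {problem}")

If checks like these fail broadly and cleaning cannot explain the discrepancies, that is your signal to stop rather than report on the data.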

It may be just as valuable to your client for you to issue recommendations for improving their data collection and to propose future analyses to be carried out once clean data is available.


Related Resources:

How to Predict, with 90% Accuracy, Who Your Best Customers Will Be

Marketing Metrics: Can you have one number to rule them all?

Marketing Optimization: You can’t find the true answer without the right question
