A/B Testing: SAP increases conversion 62% by using images

Today at 2:00 p.m. EDT, I’ll be interviewing Shawn Burns, Vice President of Digital Marketing, SAP, during the MarketingSherpa webinar, “Testing: A discussion about SAP’s 27% lift in incremental sales leads.”

We’ll be discussing Shawn’s team’s impressive four-year effort to create a Test Lab within SAP, complete with the politics, change management and talent gaps involved in any truly transformative marketing initiative.

But first, right here on the MarketingExperiments blog, let’s take a look at one of SAP’s tests from this Test Lab initiative …

Background: SAP identified the “Spotlights” section on SAP.com as a key area for driving traffic to deeper levels of the website, where visitors can engage with more topically specific content. This area was very text heavy.

Goal: Increase the level of engagement (as measured by clicks on the area’s CTAs).

Primary Research Question: Does more of an “eye candy” approach drive more engagement?

Approach: A/B split test

[Screenshots: Control (the original text-heavy Spotlights section) and Treatment (with thumbnail imagery and clearer calls-to-action)]

RESULTS

The treatment, with thumbnails and clearer calls-to-action, yielded 62% more engagement. The team was able to isolate the conversion impact to one variable – the imagery – because another treatment (not pictured in this blog post) included the enhanced CTAs without the imagery, yet still lost handily to the treatment with imagery.
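If you want to run a similar check on your own numbers, here is a minimal Python sketch of the two-proportion z-test commonly used to decide whether a clickthrough lift is statistically significant. The visitor and click counts are hypothetical stand-ins, not SAP’s actual figures, which aren’t published in this post.

```python
# Minimal A/B test check: two-proportion z-test on clickthrough rates.
# All counts below are hypothetical; SAP's raw traffic is not disclosed here.
from math import sqrt
from statistics import NormalDist

def ab_test(clicks_a, visitors_a, clicks_b, visitors_b):
    """Return (relative lift, two-sided p-value) for treatment B vs. control A."""
    rate_a = clicks_a / visitors_a
    rate_b = clicks_b / visitors_b
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return (rate_b - rate_a) / rate_a, p_value

# Hypothetical example: 10,000 visitors per arm
lift, p = ab_test(clicks_a=500, visitors_a=10_000, clicks_b=810, visitors_b=10_000)
print(f"Lift: {lift:.0%}, p-value: {p:.4f}")  # ~62% lift, p far below 0.05
```

A small p-value only tells you the difference is unlikely to be noise; whether the lift is worth acting on is still a business judgment.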

 

Segmenting your audience

With any test, it’s important to consider different segments of your audience. After all, a test is simply measuring audience interaction with a treatment. Your segments may be based on age, job title or primary product interest.

In SAP’s case, one way they segmented was by location, based on country. In fact, the 62% lift is an aggregate across seven country websites, so at an aggregate level, they found the imagery to be effective. However, they also found visitors to the China version of SAP.com engaged at a level 433% higher than visitors to the U.S. version of SAP.com, giving the China team extra impetus to use imagery.
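To see how an aggregate winner can hide very different effects by segment, here is a hedged sketch of the same lift calculation broken out by country. Every number in it is invented for illustration; only the 62% aggregate figure and the China-vs.-U.S. comparison come from this article.

```python
# Per-segment lift analysis (illustrative only -- all counts are made up).
# Format: country -> (control_clicks, control_visitors, treatment_clicks, treatment_visitors)
results = {
    "US":    (400, 8_000, 520, 8_000),
    "China": (150, 5_000, 330, 5_000),
    "DE":    (220, 4_000, 300, 4_000),
}

def lift(c_clicks, c_visits, t_clicks, t_visits):
    """Relative lift of the treatment clickthrough rate over the control rate."""
    c_rate, t_rate = c_clicks / c_visits, t_clicks / t_visits
    return (t_rate - c_rate) / c_rate

# Lift per country
for country, counts in results.items():
    print(f"{country}: {lift(*counts):+.0%}")

# Aggregate lift pooled across all segments
pooled_counts = [sum(col) for col in zip(*results.values())]
print(f"Aggregate: {lift(*pooled_counts):+.0%}")
```

Reading results at the segment level this way is what lets a regional team, like SAP’s China group, see a stronger case for a change than the aggregate number alone would suggest.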

 

Running your own visual tests

I talked to Taylor Kennedy, Senior Manager, Optimization and Strategy, MECLABS, about running tests using visuals to increase conversion. If you want to run tests on the look of your website, Taylor suggests you think of imagery on two levels:

Using visuals to communicate value

How does the imagery relate to the value of the call-to-action, content or whatever you’re trying to generate interest in? And, is that imagery consistent with your branding and your primary value proposition?

“Can you associate imagery with the value of your content?” Taylor asked. “For example, if you write about testing, would you put a picture of an analytics report or a person perhaps? Would you use symbols or graphs? How do you communicate the value visually?”

And, you want that communication to happen quickly.

“Is it simple enough or does it confuse?” Taylor asked. “You don’t want imagery that takes away from your conversion goal, you want it to clarify the experience for the user.”

 

Using visuals to guide the visitor

Value communication is not the only purpose of visuals, however.

“If you’re increasing clickthrough just because it’s drawing the eye, that’s fine, too. You still improved your KPI,” Taylor said. “Even an arrow pointing at a call-to-action can have an impact.”

Simplicity is important when drawing the eye as well.

“You don’t want it to distract. You need to think, ‘How will these visuals help conversion in this instance?’ For example, can you use imagery to give people additional opportunities to click without confusing them?” Taylor asked.

Now that you have some ideas, start testing. As Taylor said, “Even on Facebook, images get more attention than text posts.”

 

Related Resources:

Testing: A discussion about SAP’s 27% lift in incremental sales leads – Wednesday, May 1, 2013, 2:00 p.m. EDT

Testing and Optimization: SAP’s Test Lab increases digital leads 27%, leads to 20% budget savings (Part 1)

Testing and Optimization: SAP’s Test Lab increases digital leads 27% [Part II]

Optimization Summit 2013 in Boston, May 20-23, 2013 – Shawn will present a transferable case study, “5 Optimization Discoveries from the SAP Website Test Lab”

This Just Tested: Stock images or real people?

Landing Page Optimization: Goodbye stock photos and Happy Man, hello social media

Testing: Go big, or go home?

9 Comments
  1. Shock Marketer says

    I wouldn’t say that the stock photos added “value,” but they did draw attention and create emphasis. The “Learn more” calls to action also probably helped.

    If actual photos from the company were used, that would add value and add more significance. I think that would be a great test for them in the future.

  2. Jon Powell says

    I agree – the most interesting part of this case study is the fact that the images are stock and don’t really do much more than add a “cheap” visualization of what each article is focused on.

    This could be somewhat encouraging to marketers with limited time and budgets. Sometimes stock is the best they can do with what they have available.

  3. Atopos42 says

    I also agree. What is the effect of stock vs. real photos? Furthermore, the stock photos do not seem to have any link to the articles. What would happen if you used relevant photos?

    Another interesting question is the attention drawn by the visuals vs. the attention drawn by the titles of the articles. I have read research suggesting that titles attract about twice as much attention as visuals (60.7% vs. 27.9%). That research was done on an online newspaper.
    And since the titles in the treatment version fill two lines while the default’s fill one line, the treatment titles seem (to me) to attract more attention.
    Another thing is the use of hairlines between the articles in the treatment version. This makes it easier to scan the different articles and therefore could have an impact on the test results. Has this been tested separately?

    And last but not least: the placement of the CTAs. In the treatment these have been placed below the article. It is mentioned that another test was done with the “enhanced” CTAs and that the conclusion was that these did not cause the difference. But was it also tested without the CTAs below the article? Because the default has the CTA above the article (how strange).

    All in all, the double-lined titles, the CTAs below the article and the hairlines give me enough doubt that the conclusions of this test are up for debate.
    Please prove me wrong 🙂

  4. Jon Powell says

    Yeah, without a very strict single factorial test with statistical significance, there’s always room for debate. Unfortunately, most marketers don’t have the traffic or buy-in to start with those.

    I can say, though, that based on this finding combined with the hundreds of tests I’ve done and seen collectively here, these takeaways can be taken seriously. I’ve also discovered that some changes can’t be separated if you want a certain “effect” (and the result that comes with it).

    On titles attracting attention: did the research you read suggest which aspect of the headline variable drove it? (i.e., the content of the headline versus its format)

    In my testing, I have discovered that headlines attract more from a content perspective. When comparing the two screenshots above, only one of the headlines has changed in terms of content. I could be wrong; I just haven’t seen it yet.

    On hairlines: we’ve tested this separately here, and the only time we ever saw a significant difference was in e-commerce carts, where hairlines produced a significant decrease in response. Again, same concept as above. I could be wrong… I just haven’t seen it yet.

    On additional CTAs: I’m going to trust the narrative provided in the article about the single factorial test they did to confirm it wasn’t that element.

    I’ve actually conducted some first-hand testing in content marketing that concluded the format and display of the content had significantly less impact (not statistically detectable within the reasonable duration set for the test) than the actual content itself. It helped the team know how to allocate their resources to find the next gains.
