What “ugly duckling” landing pages can teach us


Ever tested an optimized landing page that followed all the right tactics, dramatically improved the page’s look and feel, got kudos on all the internal previews – and still got trounced by the ugly duckling control page?

All Web marketers have been there. That’s why we test in the first place. But each time this situation pops up, it seems to throw us for a loop.

Today the MarketingExperiments team reviewed a test that fit this scenario precisely. The control page was a bare-bones, ultra-vanilla layout: white background, plain black text, bulleted copy and simple name and e-mail signup form. There was even a prominent typo.

The tested treatments were far more polished and professional: masthead logos, a few testimonials, client-logo images, copy with select phrases bolded, and different text on the “submit” button. An “Anti-Spam” medallion was added next to the short form to reassure registrants.

Remarkably, the conversion rate was nearly 14% for the bare-bones control, compared to a 7-9% range for the two treatments. We can’t show the pages, but here are a few takeaway lessons:
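For readers wondering whether a gap like 14% vs. 8% could be chance, a standard two-proportion z-test offers a quick sanity check. The visitor counts below are purely hypothetical — the article doesn’t report sample sizes — so treat this as a minimal sketch of the method, not a re-analysis of the actual test:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the difference between two
    conversion rates likely real, or just noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical traffic: 1,000 visitors per page (assumption, not from the article)
z, p = two_proportion_z(conv_a=140, n_a=1000,   # control: ~14%
                        conv_b=80,  n_b=1000)   # treatment: ~8%
print(f"z = {z:.2f}, p = {p:.4f}")
```

At those (assumed) volumes the gap would be highly significant; with only a few dozen visitors per page, the same rates could easily be noise — which is why sample size matters as much as the headline percentages.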

• Beware of copy revisions that significantly alter the offer. From headlines to “submit” buttons, it’s easy to underestimate the power of two or three words to move the needle the wrong way. Example: “Free Access” vs. “Send me Tips”.

• Not every hero shot will save the day. Images aren’t a slam dunk, whether they are of people or product. Using a relevant photo generally attracts the eye, and the right one can add a little sizzle to the page, but portraits and logos run the risk of being divisive. Example: Logos of your Fortune 500 clients could turn off some small companies and single-shingle prospects.

• Don’t count on an incentive to lift response. Thinking of adding a freebie to make your opt-in offer more appealing? Tread carefully. It might not hurt response, but it might not help either. In this test, the treatment offering a free report finished dead last. (Keep that in mind before you produce that next whitepaper you’re certain every prospect will love.)

While these pages are going back to the lab for additional testing, the early results helped isolate several elements that should make a big impact on subsequent rounds. It’s a good reminder that even tests that fail can improve our optimization efforts, and that we still benefit … even when the ugly ducklings win.

  1. John-Scott Dixon says

    The ugly duckling has always driven me crazy, but results are results. I have two questions:

    1. Are the landing pages you created connected to an advertising or email campaign?

    2. Are you using A/B split or multivariate testing software to analyze the results? If so, from what company?

  2. Hunter Boyle says

    Hi John-Scott,

    Thanks for the post and good questions. You’re spot on: results are results, aesthetics be damned.

    The landing pages were tied to a PPC campaign, with a “radical redesign” A/B test to uncover the prime movers on the page.

    There’s actually much more to write about with this test — relevance of the PPC ads to landing page copy, the target audience and quality of leads being generated, how those leads are handled later in the funnel, etc. When we have more results, we’ll surely cover this case in more depth in an upcoming Web Clinic.

    If you’re receiving our MarketingExperiments Journal, we’ll let you know when we cover these tests in more detail.

  3. Scott Clark says

    In my experience the ugly duckling page is often replaced by the ‘designers’ without any thought to what will happen to performance. But that’s not the worst thing. When you tell some clients about the ugly duckling’s superiority in performance, many STILL will not let you put it back in place.

    This is the bane of my consultancy. How can someone stand in front of a group of people with an ugly webpage and a large conversion number and get ego-designers (who often HAVE NO measurements) to be quiet?

  4. Rob says

    Very good post.

    When we started testing, we thought we had a good eye and a good idea about what would work and why people would do certain actions.

    Boy, were we proved wrong! Now we don’t even try to guess … we just upload it and test against the control. And yes, because of this, we now test everything!

