Small PPC Search Engines Revisited

Can online marketers achieve a worthwhile return on investment with smaller PPC engines like Kanoodle, Miva and others?

We recently released the audio recording of our clinic on this topic. You can listen to a recording of this clinic here:

Small PPC Engines Revisited

Two years ago we conducted a study of small PPC engines to find out whether they offered online marketers a profitable opportunity to generate more sales.

At that time our research indicated that these smaller engines could indeed generate significant additional revenues.

Today, two years later, and with even more intense bidding for keywords on the major PPC engines – Google, Yahoo! and MSN – we wondered whether these smaller engines still offered a profitable source of additional traffic and income.

If so, the use of these smaller engines, in addition to the big three, could take a great deal of pressure off marketers who are facing diminishing returns with their current campaigns.

We tested seven small PPC engines with four different research partners.

While there were some interesting differences in performance between different partners and engines, the final results may surprise you.

In order to truly understand what the smaller PPC engines can and cannot achieve for online marketers we tested no fewer than seven engines with four different research partners.

The PPC engines included in our test were:

  • Enhance
  • Miva (Results may include Findology, ABC Search, 7 Search, Search Feed, FindWhat, and Others)
  • Kanoodle
  • Mamma
  • GoClick
  • AdBrite
  • Ask

The partners we tested represent the following business areas:

  • A Home Décor Site (Online retailer)
  • A Specialty Job Site
  • A Newspaper Site
  • A Child Safety Site

By working with multiple engines across a number of different industries our intention was to avoid drawing conclusions based upon industry or product market-specific attributes.

From the outset, we expected that the volume of traffic would be far below that of Google, Yahoo! or MSN.

This proved to be the case.

In addition, there were significant differences in reach among the small engines themselves. As a result, our findings are based principally on percentage differences in conversion rates and cost per sale.

Observation 1 – Conversion rates among the different search engines can vary enormously.

One of the first questions we wanted to find an answer to was, “Which of the smaller PPC engines delivers the highest conversion rate?”

As we will see later, the performance of the engines appears to vary according to the industry or company.

For the purpose of comparing conversion rates, we ran the same text advertisement across all seven engines, and used identical keywords.

We began with the specialty job posting site.

One difference between the campaigns, which lay outside of our control, was the minimum bid allowed by each engine.

We had no control over the number of exposures each ad received, as the reach of the various engines varies enormously, though we did collect enough data to compare conversion rates among the engines.

Related:  Selling the Click vs. Selling the Product: Which Strategy is More Effective for a Text-Based PPC Ad?

Here are the results:

Conversion rates compared between PPC engines

Engine     Clicks   Sales   Conversion
Enhance       412      13        3.16%
Miva        1,061       5        0.47%
Kanoodle    4,000       6        0.15%
Mamma          14       0        0.00%
GoClick       244       2        0.82%
AdBrite        29       0        0.00%
Ask           536      11        2.05%


What You Need To UNDERSTAND:  For this company, in the specialty job search market, the measured conversion rate of the best performing engine was more than 50% higher than that of the second best, and almost four times that of the third best.
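The conversion-rate comparisons above are simple arithmetic on the clicks and sales columns. As a quick sketch, using the figures reported for the specialty job site:

```python
# Conversion-rate arithmetic behind the table above, using the
# clicks and sales reported for the specialty job site.
results = {
    "Enhance": (412, 13),
    "Ask": (536, 11),
    "GoClick": (244, 2),
}

def conversion_rate(clicks, sales):
    """Percentage of clicks that became sales."""
    return 100.0 * sales / clicks

rates = {engine: conversion_rate(c, s) for engine, (c, s) in results.items()}
# Enhance: 13/412 = 3.16%; Ask: 11/536 = 2.05%; GoClick: 2/244 = 0.82%.
# Enhance's rate exceeds Ask's by roughly 54% — "more than 50% higher."
```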

Observation 2 – Using a single PPC engine, the nature of your business can have a major impact on conversion rates.

The next question we asked ourselves, working with the same set of test results, was “Will a single PPC engine perform as well for one company as it does for any other?”

We isolated the data from the highest performing engine from the first test, Enhance.com, and compared its performance across all four companies.  Results are for the same number of keywords, using four different text advertisements.

Here are the results:

Conversion rates compared between companies

Site                  Clicks   Sales   Conversion
Home Décor Site          276       0        0.00%
Specialty Job Site       412      13        3.16%
Newspaper Site         5,684       2        0.04%
Child Safety Site          —       0        0.00%


What You Need To UNDERSTAND: Small PPC performance can vary dramatically across industries.

As with so many tests, these results answer one question, and then raise others.

  • Could we have any real impact on these figures by testing a variety of different versions of the ad itself?
  • Is there a “best match” between a particular type of company and one, individual PPC engine?

Observation 3 – Cost per action can vary widely among the different PPC engines.

In this case we isolated data from a single company, the publisher of a very large national newspaper.

Our purpose was to compare the final cost per action among the different engines. After all, ROI is the single most important measure of campaign success for most companies.

Again, we ran the same ad, with the same keywords and copy, simultaneously across all of the different engines.

The amount we spent on each engine was determined both by reach and minimum bids.

Here are the results:

Cost per action compared between PPC engines

Engine     Clicks      Cost   Sales   Conversion       CPA
Enhance     5,684   $584.76       2        0.04%   $292.38
Miva        5,847   $332.88       5        0.09%    $66.58
Kanoodle    3,582   $123.55       0        0.00%       N/A
Mamma         419    $17.00       0        0.00%       N/A
GoClick       977    $76.81       0        0.00%       N/A
AdBrite     2,614   $128.92       0        0.00%       N/A
Ask           898    $15.00       0        0.00%       N/A


What You Need To UNDERSTAND: While the results, particularly the conversion rates, are disappointing in any context, the $66.58 CPA for Miva was 43.4% lower than that for Google over the same period and 59.2% lower than that for Overture.
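Cost per action is simply total spend divided by the number of sales, and it is undefined when an engine produces no sales at all. A minimal sketch, using the newspaper site's figures from the table above:

```python
# Cost-per-action (CPA) arithmetic for the newspaper site's campaign.
# CPA is undefined (reported as N/A) when an engine delivers no sales.
def cost_per_action(cost, sales):
    """Total spend divided by sales; None when there are no sales."""
    return cost / sales if sales else None

miva_cpa = cost_per_action(332.88, 5)      # $66.58 per sale
enhance_cpa = cost_per_action(584.76, 2)   # $292.38 per sale
kanoodle_cpa = cost_per_action(123.55, 0)  # None — no sales, so N/A
```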


While the performance of the small engines was disappointing overall, this testing provided a number of worthwhile insights.  We were able to establish profitable campaigns for two of these companies, though the results were tenuous due to low volume.

Because of the limited sample size, it was not possible to achieve statistical validity for the tests within a practical time period.  Consequently, any site- or business-related decisions you make using your own data would come with a correspondingly higher risk of error.  This is a persistent problem that comes with the territory of working with such low-volume traffic sources, and it makes them less attractive as primary marketing channels.

In our last brief on this topic, from March of 2004, we found the smaller engines at that time to be potentially viable and profitable.

Despite increasingly intense bidding on the major PPC engines, it appears that the smaller engines have failed to capitalize on the opportunity and do not offer sufficient traffic quality to build profitable campaigns.

Here is what we recommend:

  • Approach your campaigns on the smaller PPC engines with caution.
  • Test your ads across several engines to determine which, if any, can be used profitably.
  • If an engine comes close to profitability, test different ads and see if you can increase your clickthrough and reduce your cost per action.
  • If you do isolate one or more engines that can deliver a positive ROI, make sure they have sufficient reach. If you are achieving only a handful of sales per month, the profits earned may not make up for the cost of setting up, testing and managing the campaign.

In assessing the viability of each engine, you may want to use our free Maximum Bid Analysis tool for calculating the highest bid you can make and still break even on the campaign.

You can download the tool here:
http://www.meclabs.com/MaxBidAnalysis.xls
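The break-even logic behind such an analysis is straightforward: at break-even, the cost of the clicks needed to produce one sale equals the profit earned on that sale. The sketch below illustrates that arithmetic only; it is not the spreadsheet tool itself, and the figures are hypothetical.

```python
# Illustrative break-even bid arithmetic (not the actual spreadsheet
# tool): a campaign breaks even when click cost per sale equals
# profit per sale, i.e. max bid = profit per sale x conversion rate.
def max_break_even_bid(profit_per_sale, conversion_rate):
    """Highest cost per click at which the campaign breaks even.

    conversion_rate is a fraction, e.g. 0.02 for 2%.
    """
    return profit_per_sale * conversion_rate

# Hypothetical figures: $40 profit per sale at a 2% conversion rate
# supports a maximum bid of $0.80 per click.
bid = max_break_even_bid(40.0, 0.02)
```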

As part of our research, we have prepared a review of the best Internet resources on this topic.

Rating System

These sites were rated for usefulness and clarity, but alas, the rating is purely subjective.

* = Decent | ** = Good | *** = Excellent | **** = Indispensable

Credits:

Editor — Flint McGlaughlin

Writer — Nick Usborne

Contributors — Jimmy Ellis
Aaron Rosenthal

HTML Designer — Cliff Rainer
