Success beyond ad testing

Goals

Support a client presentation by showing examples of award-winning consumer advertising and healthcare campaigns that didn’t test well or failed traditional research measures but went on to show massive impact on the marketplace. Each example should be supported by a description of the project, the methodology, and the campaign outcome.

Early Findings

  • The overall trend is for companies to rely on more data, not less. P&G has been slashing its advertising budget for three years running, mainly by cutting mass marketing and relying more on one-to-one targeting based on its database of more than 1 million consumer IDs. According to CEO David Taylor, this helps address the classic problem of broad media targeting: half of the spend is considered wasted, but there is no way to explain why or how.
  • Unilever is doing the same. According to Keith Weed, Unilever’s Chief Marketing and Communications Officer, “I’ve certainly got enough evidence, real hard evidence, showing that ads we’ve pretested perform better in the marketplace than ads we don’t. It’s inarguable proof.”
  • However, this does not mean that creativity should be weeded out. Ipsos has shown that creative quality is responsible for 75% of impact on brand and ad recall. This is backed up by Clancy and Dyson, who have shown that good creative is four times more important than media efficiency when it comes to profit. HBR quantified the bottom-line effect as well, showing that a euro spent on an ad rated as highly creative had nearly double the impact of a euro spent on one that wasn't. Importantly, this effect can be missed in testing because the initial impact was often small and only gained importance over time.
  • The perceived opposition between data and creativity may come down to which dimensions of creative work are given weight. In HBR's study, elaboration (small details that nevertheless extended or deepened a simple story) was far more important than originality or artistic value, yet many advertising agencies emphasize the latter when searching for a good idea. Used in combination, elaboration plus originality pushed sales most, while one of the most commonly used combinations, originality plus flexibility (linking the product to many ideas or contexts), scored among the lowest.
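The HBR per-euro finding above, a small early effect that compounds into roughly double the total impact, can be sketched numerically. The weekly figures below are invented purely to illustrate that shape; they are not data from the study:

```python
# Illustrative sketch (hypothetical numbers): cumulative sales impact per euro
# for a highly creative ad vs. a non-creative one. Per the source, the creative
# ad starts slow but ends with roughly double the total impact.

creative_weekly = [0.05, 0.10, 0.20, 0.35, 0.45, 0.45]     # slow start, grows
non_creative_weekly = [0.30, 0.25, 0.15, 0.10, 0.05, 0.00]  # front-loaded

creative_total = sum(creative_weekly)          # ~1.60
non_creative_total = sum(non_creative_weekly)  # ~0.85

# Early on the creative ad looks weaker in testing...
print(sum(creative_weekly[:2]) < sum(non_creative_weekly[:2]))  # True
# ...but its total impact ends up close to double.
print(round(creative_total / non_creative_total, 2))  # 1.88
```

This is why a short pre-test window can rank the creative ad below the non-creative one even when its lifetime impact is far larger.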


The initial research has shown that data is not publicly available for campaigns that tested badly but performed well in the marketplace. This is likely because campaign testing results are kept confidential by the companies running the ads. While testing companies like Millward Brown do keep a database of ads and their scores, that database is likewise confidential and proprietary.

In general, the trend is for companies to rely even more heavily on data and testing. The changing media and consumer landscape allows far more linkage between campaigns and sales impact, and more granular data enables one-to-one targeting. The effectiveness of testing has also been demonstrated, and both Unilever and P&G, companies that often set the trend, are leaning even more heavily in this direction. However, while the "why" of testing is settled, the "how" and "what is measured" are still up for debate.
