So, your company wants to increase revenue and adoption by making some marketing-site tweaks. They want more conversions, more clicks, more shares, and more users. What do they tell you to do first? A/B test, of course! Compare two versions of a page, define a key goal (e.g., clicks), and see which version gets more of them. But does this actually work? Is it really the approach you should take? Let’s look at the data.
By nature, an A/B test is an experiment that compares two or more versions of a feature or page against a defined metric.
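The usual way to decide whether one version actually beat the other on that metric is a two-proportion z-test on the conversion counts. Here is a minimal sketch using only the standard library; the visitor and click counts in the example are entirely hypothetical:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of variants A and B.

    Returns (z, p) where p is the two-sided p-value under the
    null hypothesis that both variants convert at the same rate.
    """
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 10,000 visitors per variant,
# 500 clicks on A (5.0%) vs. 560 clicks on B (5.6%)
z, p = two_proportion_z_test(500, 10_000, 560, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Note that in this made-up example a 12% relative lift on 20,000 visitors still comes out just shy of the conventional p < 0.05 bar, which is exactly why small cosmetic tests so often end inconclusive.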
This article focuses on superficial A/B tests — the testing of cosmetic changes that distract teams from delivering meaningful customer value.
During rough economic times, it’s easy for those who control the budget to say that if response rates are down, they don’t want to invest in testing: “You can’t spend money if you’re not making money.” To certain executives, this actually makes sense. But others, the wise ones, know that the time to spend more marketing dollars is when sales are down.
It’s true that there is a risk involved in testing new ideas in an effort to “beat the control” and increase response rates. Testing takes an investment in time and resources, often including additional funds. However, the outcome is often worth the risk.
You want to test not only to increase your ROI but also to learn. The more you know about what works best, the better you can market to the segments that emerge as your marketing programs evolve.
Website optimization testing is becoming an increasingly common practice. It provides quantitative data that shows how a change to your website will affect order conversion and revenue. I run a lot of tests across all my clients’ sites; my current average is around 400 a year. Looking at data from the past three years of testing, only 30% of those experiments were “wins” that positively affected revenue. Flip that around, and 70% of all tests run don’t provide a lift. That seems like a lot of wasted effort.
In the Golden Age of Catalogs (the ’80s and ’90s), Catalog Age magazine commissioned Tracey Emerick and John Miglautsch to build the Catalog Management Institute. The most popular part of that curriculum lives again!
According to the Direct Marketing Association (DMA), response rates took quite a jump in 2016, with a 5.3% response rate to house lists and 2.9% to prospect lists. These are the highest levels the DMA has seen since 2003.
Introduction to the WDMA – starts with the greatest case study for Direct Marketing EVER, plus some discussion of what Direct Marketing means and why the WDMA is your best source for Education, Conversation, and Congregation.
David Ogilvy is one of my heroes. He explains the power of Direct Marketing. He also encourages general ad agencies to use DM to train their people. Much of what he hoped for has happened – now everyone measures and everyone generates responses (at least clicks and likes) – but much of the art of testing (and with it the power) has been forgotten.
The WDMA is committed to showing how Direct Marketing applies to the media and methods of today’s marketers – without leaving the proven fundamental principles of scientific marketing.
“President Barack Obama transformed modern day campaigning by elevating the importance and use of data. Since then campaigns have prioritized it. Hillary Clinton has been building her data operation since she launched her campaign, but Trump has largely dismissed its importance.” – NBC News, May 31, 2016