I still get a lot of sneers if I bring up the idea of direct mail to a client.
Comments range from “Does anybody still read that stuff?” to “Aren’t postage rates outrageous?” The simple answers are “yes” and “no.”
Let’s start with a few statistical facts (courtesy of the USPS Household Diary Study):
42 percent of recipients read or scan direct mail pieces: That means nearly HALF of your target audience is actually stopping, at least for a few seconds, to read your message. If you've designed the piece properly, with a strong and relevant offer and call to action, you might achieve a 1 percent, 2 percent or even 14 percent response rate (yes, I've achieved that!). Digital ads, in comparison, are lucky to get a 0.14 percent click-through rate, and once visitors reach the landing page, you'll be lucky to convert 2.35 percent of them.
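That comparison is simple arithmetic, and it's worth seeing side by side. A quick sketch per 10,000 contacts, using the rates quoted above (the audience size is arbitrary):

```python
# Back-of-the-envelope comparison: expected responses per 10,000
# contacts for direct mail vs. a digital ad, using the rates above.
AUDIENCE = 10_000

# Direct mail: a conservative 1% response rate against the full mailing.
mail_responses = AUDIENCE * 0.01                   # 100 responses

# Digital: 0.14% click-through, then 2.35% landing-page conversion.
digital_conversions = AUDIENCE * 0.0014 * 0.0235   # ~0.3 conversions

print(f"Direct mail responses per {AUDIENCE:,}: {mail_responses:.0f}")
print(f"Digital conversions per {AUDIENCE:,}: {digital_conversions:.2f}")
```

Even at the low end of the mail response range, the per-contact gap is two orders of magnitude.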
So, your company wants to increase revenue and adoption by making some marketing site tweaks. They want more conversions, more clicks, more shares, and more users. What do they tell you to do first? Well, A/B test! Compare two versions of a page, define a key goal (e.g., clicks), and see whether one version wins. But does this actually work? Is it really the approach you should take? Let's look at the data.
By nature, an A/B test is an experiment that compares multiple (often two) versions of a feature or page against a defined metric.
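Part of running that experiment honestly is checking whether the difference you see is more than noise. A minimal sketch of a two-proportion z-test, using only the standard library and hypothetical click counts (the visitor and click numbers are made up for illustration):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: does variant B's conversion rate differ
    from variant A's? Returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: 10,000 visitors per arm, 200 vs. 260 clicks.
z, p = two_proportion_z(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With those made-up numbers the lift is real (p well under 0.05); halve the sample and the same rates stop being significant, which is why sample size matters as much as the metric.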
This article focuses on superficial A/B tests — the testing of cosmetic changes that distract teams from delivering meaningful customer value.
During rough economic times, it’s easy for those who control the budget to say that if response rates are down, they don’t want to invest in testing—”You can’t spend money if you’re not making money.” To certain executives, this actually makes sense. But others, the wise ones, know that the time to spend more marketing dollars is when sales are down.
It’s true that there is a risk involved in testing new ideas in an effort to “beat the control” and increase response rates. Testing takes an investment in time and resources, often including additional funds. However, the outcome is often worth the risk.
You want to test not only to increase your ROI but also to learn. The more you know about what works best, the better you can market to the segments that emerge as your marketing programs evolve.
Website optimization testing is becoming an increasingly common practice. It provides quantitative data that shows how a change to your website affects order conversion and revenue. I run a lot of tests across all my clients' sites; my current average is around 400 a year. Looking at the data from my past three years of testing, only 30% of those experiments were "wins" that positively affected revenue. Flip that around, and 70% of all tests run don't provide a lift. That seems like a lot of wasted effort.
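Whether that effort is actually wasted depends on what the wins are worth. A quick expected-value sketch using the 30% win rate above; the lift and cost figures are hypothetical placeholders, not data from my tests:

```python
# Expected value of a year of testing at the win rate cited above.
# AVG_LIFT_PER_WIN and COST_PER_TEST are hypothetical placeholders.
TESTS_PER_YEAR = 400
WIN_RATE = 0.30
AVG_LIFT_PER_WIN = 0.02   # hypothetical 2% revenue lift per winning test
COST_PER_TEST = 1.0       # hypothetical unit cost of one test

wins = TESTS_PER_YEAR * WIN_RATE          # 120 winning tests
losses = TESTS_PER_YEAR - wins            # 280 tests with no lift
cumulative_lift = wins * AVG_LIFT_PER_WIN # lift accumulated by the wins
total_cost = TESTS_PER_YEAR * COST_PER_TEST

print(f"{wins:.0f} wins, {losses:.0f} non-wins, "
      f"{cumulative_lift:.0%} total lift for {total_cost:.0f} units of cost")
```

The point of the sketch: a 70% "failure" rate can still pay for the whole program if the winning 30% compound, which is the question to ask before cutting the testing budget.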
In the Golden Age of Catalogs (the '80s and '90s), Catalog Age magazine commissioned Tracey Emerick and John Miglautsch to build the Catalog Management Institute. The most popular part of that curriculum lives again!
According to the Data & Marketing Association (DMA), response rates took quite the jump in 2016, with a 5.3% response rate to house lists and 2.9% to prospect lists. These are the highest levels the DMA has seen since 2003.