5 fallacies of testing direct mail

Brainstorm, create, execute, test, refine, repeat: It's the basic formula marketers can apply to virtually every aspect of their job. Yet for various reasons, companies often forgo the all-important testing step when it comes to direct mail marketing.

In fact, when we start talking to a prospective client, there's at least a 50 percent chance the company has no testing practices in place. And of the companies that do test in the direct mail channel, many struggle to see merit in the effort.

Without smart testing, there's a gaping hole in the formula. Testing reveals crucial information like:

  • Which audience responds best to direct mail?
  • Does Offer 1 perform better or worse than Offer 2?
  • Which creative elements actually grab attention?

The right tests lead brands to winning combinations that most effectively move the needle.
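
Take the offer question above as an example. Here's a minimal sketch, in Python, of how a two-proportion significance test can tell you whether Offer 2 genuinely outperforms Offer 1 or whether the gap is just noise; the mail quantities and response counts are invented for illustration.

```python
# Minimal sketch: is Offer 2's response rate really better than Offer 1's?
# The mail volumes and response counts below are hypothetical.
from math import sqrt
from statistics import NormalDist

def compare_offers(mailed_1, responses_1, mailed_2, responses_2):
    """Return Offer 2's lift over Offer 1 and a two-sided p-value."""
    rate_1 = responses_1 / mailed_1
    rate_2 = responses_2 / mailed_2
    pooled = (responses_1 + responses_2) / (mailed_1 + mailed_2)
    se = sqrt(pooled * (1 - pooled) * (1 / mailed_1 + 1 / mailed_2))
    z = (rate_2 - rate_1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_2 / rate_1 - 1, p_value

# Hypothetical test: 25,000 pieces per offer.
lift, p = compare_offers(25_000, 300, 25_000, 345)
print(f"Offer 2 lift: {lift:.1%}, p-value: {p:.3f}")  # ~15% lift, p ≈ 0.075
```

With these made-up numbers, a 15 percent lift still isn't statistically significant at the usual 95 percent confidence level, which is exactly the kind of nuance that never shows up when you simply eyeball a report.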

Misconceptions can lead marketers to shun testing, and doing so can put a major damper on their direct mail campaigns. We're here to save you from that fate with a roundup of the five most common fallacies and the reasons you shouldn't fall prey to them.

Fallacy #1: Direct mail is dead
Before we even get into the testing elements, let's debunk the most crucial myth: Despite living in a world ruled primarily by digital content, direct mail is still very much alive and thriving.

Your potential customers can't absentmindedly scroll past a physical piece of mail the way they might a promotional email. Direct mail grabs their attention from the get-go, so when your email pops up in their inbox, they're less likely to bypass the message. It resonates with people who don't engage with social media, as well as consumers who don't own smartphones - yes, 23 percent of Americans still don't have one, according to the Pew Research Center. There's also enough evidence to suggest that even millennials respond well to direct mail. With response like that, direct mail helps brands gain new customers and significantly increase annual growth.

Fallacy #2: Testing isn't necessary
Without testing, you could be repeating failures without even knowing it. The reports may show solid performance, but you can't be sure you're actually maximizing the return on investment. It's inevitable that direct mail content - or any marketing collateral, for that matter - will eventually experience fatigue, and without testing, brands may either keep running the tired content or fail to choose the best replacement. With this in mind, the better move is to avoid assumptions and invest in testing efforts that bring these factors to light.

Fallacy #3: The risk isn't worth the long-awaited reward
We agree that testing carries risk, and it's easy to become disenchanted when the first few attempts reap little reward. Plus, getting approval from company decision makers can be a challenge when the ROI isn't clear or immediate. Why deal with the pushback if you don't have to? Fair point, but the real solution is to test smarter and make a compelling case for the effort.

Remember that learning what doesn't work is just as valuable as finding the winning combination. Once you know what isn't working, you can stop wasting budget and effort on combinations that don't yield positive results. Testing is an effective way to gain that insight.

Fallacy #4: Creative is the most important element
Of course you want your direct mail to be thoughtfully written and designed, but looks won't matter if the piece doesn't reach the right people or prompt the desired action.

The 40-40-20 rule suggests that 40 percent of a campaign's success comes from the quality of the list, 40 percent from the offer and 20 percent from the creative. While there is some truth to this, our data scientists see it from a 60-30-10 perspective. Either way, there's ultimately more value in testing multiple lists and offers than in experimenting with several creative concepts.

Fallacy #5: A/B or "split" testing is the only way
Many companies use the A/B method, mainly because it's straightforward and inexpensive. However, testing one variable at a time takes far too long to yield results, and those results often lead to only minor improvements in the direct mail strategy. Some may say the pricier multivariate test is a plausible alternative, but we at SeQuel Response beg to differ.
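
To put a rough number on that, here's a back-of-the-envelope sample-size sketch in Python. Assuming a 1 percent baseline response rate and a 10 percent relative lift you'd like to detect (both figures are assumptions for illustration), a single split needs roughly 160,000 pieces per cell before the winner is statistically trustworthy.

```python
# Rough sample-size estimate for a two-cell split test.
# The 1.0% baseline response rate and 10% relative lift are assumptions.
from math import sqrt
from statistics import NormalDist

def pieces_per_cell(baseline_rate, relative_lift, alpha=0.05, power=0.80):
    """Approximate mail quantity per cell for a two-proportion comparison."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return numerator / (p2 - p1) ** 2

print(f"{pieces_per_cell(0.01, 0.10):,.0f} pieces per cell")  # about 163,000
```

Double that for the two cells, then multiply again for every additional element you want to pit head to head, and the mail plan balloons quickly.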

There's a testing strategy that combines the best of the A/B and multivariate methods while eliminating their downsides, making for a more cost-effective way to test at scale. We call it FaQtor Test, and it can give your company a 400 percent better chance of finding the winning combination. The method allows you to strategically test multiple direct mail options at once, and it can even scientifically predict the performance of the combinations you don't test.
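
To be clear, FaQtor Test is SeQuel's own methodology; the Python sketch below only illustrates the general fractional-factorial idea behind testing many combinations at once. It mails half of the possible list/offer/creative combinations, fits a simple additive model, and estimates the response rates of the cells that were never mailed. All response rates are invented.

```python
# Illustration of the general fractional-factorial idea (not FaQtor Test):
# mail a half-fraction of the list/offer/creative combinations, fit an
# additive main-effects model, and predict the cells you never mailed.
import itertools
import numpy as np

factors = {"list": ["A", "B"], "offer": ["1", "2"], "creative": ["X", "Y"]}
all_cells = list(itertools.product(*factors.values()))  # 8 full-factorial cells

# Half-fraction design: only 4 mailed cells, with invented response rates.
mailed = {
    ("A", "1", "X"): 0.012,
    ("A", "2", "Y"): 0.016,
    ("B", "1", "Y"): 0.010,
    ("B", "2", "X"): 0.014,
}

def encode(cell):
    # Intercept plus one indicator per factor (1.0 if the second level is used).
    return [1.0] + [1.0 if level == options[1] else 0.0
                    for level, options in zip(cell, factors.values())]

X = np.array([encode(c) for c in mailed])
y = np.array(list(mailed.values()))
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # fit the additive model

for cell in all_cells:
    estimate = float(np.array(encode(cell)) @ coef)
    note = "" if cell in mailed else "  (predicted, never mailed)"
    print(cell, f"{estimate:.4f}{note}")
```

The trade-off is the usual one for fractional designs: you give up the ability to measure interactions between elements in exchange for covering far more combinations with the same mail quantity.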

Are we starting to ease your doubts about testing direct mail?