Five Direct Mail Testing Fallacies

Brainstorm, create, execute, test, refine, repeat: It is the basic formula marketers can apply to virtually every aspect of their job. Yet for various reasons, companies often forgo the all-important testing step when it comes to direct mail marketing.

In fact, when we start talking to prospective clients, the odds are at least even that the company has no testing practices in place. Of those who are testing in the direct mail channel, many struggle to see merit in their efforts.

Without smart testing, there is a gaping hole in the formula. Testing reveals crucial information like:

  • Which audience will respond better to direct mail?
  • Does Offer 1 perform better or worse than Offer 2?
  • Which creative elements are most likely to grab attention?

The right tests lead brands to winning combinations that most effectively move the needle.

Misconceptions can lead marketers to shun testing, but doing so can put a major damper on their direct mail campaigns. We are here to save you from that fate with a roundup of the five most common direct mail testing fallacies and the reasons you should avoid them.

Fallacy #1: Direct mail is dead

Before we even get into the testing elements, let’s debunk the most crucial myth: Despite living in a world ruled primarily by digital content, direct mail is still very much alive and thriving.

Your potential customers cannot absentmindedly scroll past a physical piece of paper the way they might with promotional emails. Direct mail's effectiveness lies in our subconscious: it grabs consumer attention from the get-go, so that when your corresponding email offer pops up in their inbox, they are less likely to ignore the message. It resonates with those who are not digitally active, as well as consumers who don't own smartphones — yes, nearly 20 percent of Americans still do not have one, according to the Pew Research Center. There is also enough evidence to suggest that even millennials and Gen Z respond well to direct mail. With such positive response rates, many marketers still rely on direct mail to achieve their top marketing objectives, including customer acquisition and overall growth.

Fallacy #2: Testing is not necessary

Without testing, you could be unknowingly repeating the same failures. The reports may show solid performance, but you cannot be sure that you are actually maximizing the return on investment without challenging your results. It is inevitable that direct mail content — or any marketing collateral for that matter — will eventually experience fatigue, and testing errors can lead brands to either continue using stale content or fail to choose the best replacement. With this in mind, it is better to avoid assumptions, protect your ROI, and invest in testing efforts that can bring such factors to light.

Fallacy #3: The risk is not worth the long-awaited reward

We agree that testing can be risky, and it is easy to become indifferent when the first few attempts reap little reward. Plus, getting approval from company decision makers can be a challenge when the ROI is not clear or immediate. Why deal with the backlash if you don’t have to? Fair point, but the real solutions lie in performing better testing and making a compelling case for the efforts.

Remember that learning what does not work is just as valuable as learning what does. When you know what is not working, you can stop wasting funds and effort on combinations that are not yielding positive results. Testing is an effective way to gain this insight.

Fallacy #4: Creative is the most important element

Of course you want your direct mail to be thoughtfully written and beautifully designed, but the appearance is trivial if the content does not reach the right audience and encourage the desired action.

The 40-40-20 rule suggests that 40 percent of a test’s success is in the quality of the list, 40 percent is in the offer, and 20 percent is in the creative. While there is some truth to this, our data scientists see it from a 60-30-10 perspective. Since your creative is built on your audience insights and offer strategy, the ultimate value lies in testing multiple lists and offers.

Fallacy #5: A/B or “split” testing is the only way

Many companies use the A/B testing method, mainly because it is straightforward and inexpensive. However, it takes a long time to yield results — and because you test one element at a time, those outcomes often lead to only minor improvements in the direct mail strategy. Pricey multivariate testing is a viable alternative, but why limit yourself to two average options when there is a third method that has proven to be both effective and affordable?

With FaQtor Test, you have a 400 percent better chance of finding the winning combination by leveraging the best of the A/B and multivariate methods while eliminating their downsides. The FaQtor Test methodology allows you to strategically test multiple direct mail options at once — scientifically predicting the performance of the combinations you do not test — for a more cost-effective way to evaluate your data, offer, and creative components at scale. Check out this video to learn just how it works.

Want to learn more? Here are a few more of our favorite direct mail testing resources to get you started: