Testing, 1 2 3: Going Against Digital Marketing Best Practices
As my first few weeks of PPC training here at Inflow come to a close, I’d like to share my thoughts on how best practices can vary, and even be contradicted, across a litany of paid advertising topics. Here I’ll focus on two key areas.
Now, in general, best practices are in place for a reason—because they work! They have been developed through a time-tested review of what is beneficial. Sometimes, however, the best results come from an approach that seems counterintuitive. This is why it is important to A/B test.
Take for example a tale of two landing pages.
Landing Pages – A
When interacting with potential customers, the best interactions come from sending them to pages with content relevant to their search queries. With this in mind, it is best practice for an eCommerce campaign to make product interaction as easy as possible once a user has hit your site. For specific queries, it makes sense to send the customer to the product page most relevant to their search. When you do, you decrease the number of clicks to purchase.
Landing Pages – B
However, as Chris Kostecki from Keurig Inc. shows, the best results may not come from best practices. A test was performed for one of his clients comparing the conversion rate of product purchases across two different landing pages. The first seemed the obvious choice: it led straight to the product page and required the fewest choices from the user to reach the overall goal.
The second required the user to do a bit more digging to reach the add-to-cart page and therefore wasn’t quite as intuitive. Surprisingly enough, Chris found that the version with the extra clicks turned out to be the better choice: purchase conversions were 18 percent higher for the second ad, and average order value actually increased. Going against best practices worked in this scenario. I must stress again that best practices are fantastic guidelines for a campaign, but A/B testing can sometimes yield surprising results.
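A practical question a test like this raises is whether a conversion lift is real or just noise. A minimal sketch of a two-proportion z-test, using only Python’s standard library, might look like the following; the visitor and conversion counts are entirely hypothetical and are not from Chris’s actual test:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?

    conv_* = number of conversions, n_* = number of visitors.
    Returns (z statistic, two-sided p-value)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 5,000 visitors per landing page,
# 200 conversions (4.0%) vs. 236 conversions (4.72%, an ~18% lift)
z, p = two_proportion_z(200, 5000, 236, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Notably, for these made-up numbers an 18 percent relative lift falls just short of significance at the conventional 5% level—a reminder that even a promising A/B result needs enough traffic behind it before you retire the losing variant.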
Ad Copy – A
Creating ad text can be one of the most creative parts of a digital campaign. Crafting a message that appeals to a consumer enough to pique their interest and get them to check out your site can be hard to do. And as always, there is a set of best practices in place to help you figure out what works most often. A standard best practice for text advertisements is a first description line that highlights a product benefit or feature—why the product will be valuable to the viewer; a second description line with a call to action or an appealing offer that moves the viewer forward; and a title that clearly conveys the product brand and relates to the keywords. And for optimizing these advertisements? Simple. Create a second ad closely related to the first but rephrased, and test for the better performer. Delete one, keep the other, rinse and repeat.
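The “test, keep the winner, repeat” loop above can be sketched as a small routine. This is an illustrative sketch only—the variant names, field names, and minimum-impressions threshold are all assumptions, not part of any ad platform’s API:

```python
def pick_winner(ads, min_impressions=1000):
    """Return the ad variant with the higher CTR, but only once every
    variant has accumulated enough impressions to compare fairly.

    `ads` is a list of dicts with 'impressions' and 'clicks' keys
    (a stand-in for whatever your reporting export provides).
    Returns None while any variant still lacks data."""
    if any(a["impressions"] < min_impressions for a in ads):
        return None
    return max(ads, key=lambda a: a["clicks"] / a["impressions"])

# Hypothetical rotation of two ad variants:
ads = [
    {"name": "A", "impressions": 1200, "clicks": 12},  # 1.0% CTR
    {"name": "B", "impressions": 1150, "clicks": 29},  # ~2.5% CTR
]
winner = pick_winner(ads)  # variant B, on these numbers
```

After each round you would pause the loser, write a new challenger against the winner, and run the comparison again—the “rinse and repeat” of the paragraph above.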
Ad Copy – B
However, the smallest of tweaks can have a drastic impact on a campaign. For Perry Marshall, this insight came from an A/B test that simply swapped the two description lines. Perry found the switch increased his CTR from 0.1% to 3.6%, turning an ad group with marginal user interaction into a star performer. As marketers, we are lucky that every campaign has a set of critical metrics that inform every optimization decision.
A/B testing in digital marketing can produce surprising results that contradict best practices. Don’t forget to think outside the box every now and then to get the results you’re hoping for in the end.