Why A/B Testing Is More Than a Battle of A Versus B

MoreVisibility

Last month you may have read an article about A/B testing by my colleague, Chuck Forbes. In his article "Why You Cannot be Tired of A/B Testing," he lays out a great case for why A/B testing is not a flash-in-the-pan tactic, nor is it a tactic marketers should use just to prove person A's opinion is more correct than person B's.

As marketers, especially in the ultra-fast-changing digital landscape, the only thing we can be sure of is that we can never be 100% certain of anything. And as Chuck points out, this is for a very good reason: people are unpredictable. Therefore, we must constantly challenge everything we know and continually test every assumption.

But testing, whether A/B or MVT, is not where you start when you are looking to learn more about how your audience will react to a certain message or perform a task on your site. Long before the test is created, YOU have to walk in your customers’ shoes and seek to see things from their perspective so you know WHAT to test.

So often I see people testing landing page A vs. landing page B with no clear rationale beyond "they're different." Okay, so they're different…and when landing page B wins, will you know why?

Sometimes people will test a change of a photo on the landing page or the color of a button with no theory formed behind the change. Again, this leaves you with an outcome but no real intelligence.

For example, a sub-optimal test by Joe Marketer starts with him saying, "Let's test our current blue button against a green button because, as everyone knows, green signifies action." Joe is making a mistake here: his assumption is a) not an established fact and b) may not hold at all given the other elements on his site.

Jane Optimizer knows better and frames a testable hypothesis: "Let's test our current blue button against a color that stands out from the surrounding elements on the page. Our theory is that people are not 'seeing' the button because it's blending in."

Okay, the above is very simplistic, but it illustrates the point: there should be a reason why you are testing one thing against another, and that reason should come out of the data, both quantitative and qualitative, that you have gathered about your audience.
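Quantitative data also tells you whether a "winner" actually won or just got lucky. As a minimal sketch, here is how you might check whether variation B's conversion rate beat variation A's with statistical significance, using a standard pooled two-proportion z-test. The visitor and conversion counts below are hypothetical, purely for illustration:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (lift, two-sided p-value) for an A/B conversion test.

    conv_*: number of conversions; n_*: number of visitors.
    Uses the pooled two-proportion z-test (normal approximation),
    which is reasonable when each variation has ample traffic.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Hypothetical result: 1,000 visitors per variation.
lift, p = two_proportion_z_test(conv_a=80, n_a=1000, conv_b=110, n_b=1000)
print(f"lift: {lift:.1%}, p-value: {p:.3f}")
```

A p-value below your chosen threshold (commonly 0.05) suggests the difference is unlikely to be random noise; it still does not tell you *why* B won, which is exactly why the hypothesis has to come first.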

Here’s another example: You run a site that asks visitors to register for free, but you can't understand why more people aren't registering. Many marketers will jump to the assumption that the registration form is too long. That’s a reasonable assumption in this day of short attention spans. So you could make that your first test…but why not start well before that test and ASK people why they aren’t registering?

A simple exit survey with one question and a few choices may give you insight not only into why people aren't registering, but also into what else, besides form length, you could test to improve your registration rate.

Maybe it’s not the number of fields, but the sense that the questions are more intrusive than the user feels the site warrants. Or maybe the user isn’t seeing a clear value in registering. Remember, it's how the user perceives things that matters: you may think you are asking for valuable information to improve their site experience and are clearly explaining the benefits, but what you think and what your audience “gets” could be very different.

Since we are all guilty of "tunnel" vision at some point or another, it's often helpful to engage a third party, like your digital agency, to look at your site, sales funnel, or landing page with fresh eyes. While it's true they won't know your business as well as you do, it's also true that neither does your target audience.

And finally, one more thing to consider before pitting A against B: can you get complete buy-in from your organization that testing is a priority? Will your website team be able to implement the winning changes?

So A/B testing is not only a tactic you should be using; it should permeate the entire organization as an ongoing process, with the expectation that whatever the outcome of a test, your organization wins by gaining more intelligence about what works (or doesn't) for your audience.

© 2018 MoreVisibility. All rights reserved.