Please Chuck, not another post about A/B Testing tips. We get it, it’s important!
Well, you haven’t convinced me.
The problem with A/B testing today is that it is almost too easy and too common, and marketers like to be on the cutting edge of trying new things. There is nothing flashy about A/B testing. You are not going to run into your boss’s office and say, “Guess what I just did? Started an A/B test!” When it comes to A/B testing, though, you have to put your ego aside and remember that a marketer’s job is not to do what no one else is doing, but to find what is most effective at reaching your goals.

Over the past six months I have noticed a growing sentiment that A/B testing deserves less respect when marketers brainstorm ways to improve user experience or gain more conversions. More and more, when discussing optimizations, I hear statements like “What else can we do besides an A/B test?” You shouldn’t be thinking past an A/B test when you haven’t yet conducted one to gather data against your current goal.
So I’m not going to write another article that tells you how to conduct an A/B test; you’re off the hook there. But I am going to explain why it is foolish to dismiss the A/B test as an old tactic you’re no longer excited to run.
We are faced with many decisions every day. Our choices in these micro-moments lay out the path for the next event in our lives. If you leave your house and take a different route, you may get stopped by a train and end up late for work. You may wear a white shirt on the day your office caters BBQ, and now you’re sitting in a client meeting wishing that sauce stain were invisible. In these examples, I am less likely to take the route with train tracks to work or to wear a white shirt the next time my office serves BBQ, because I have tried those things and the outcomes were not desirable.
I was once asked, “Explain why there are no certainties in marketing.” Nothing like a vague, philosophical question during an interview you’re already nervous about, because you’re 23 years old and trying to land a job out of college. Nevertheless, I knew I had seconds to think of an answer before this interview got awkward fast. But I couldn’t. Not one word. The question had rendered my mind inoperable, and I just sat in the chair looking at the man across from me, who was counting on some type of intelligent answer. Instead he got a blank face. Now I could hear the seconds tick by as we sat in what seemed to be evolving into a staring contest. Five seconds went by, ten seconds, fifteen – I swear at one point it was so awkward I could hear myself blink.
Finally, after about twenty seconds he said, “So…are you going to answer that?” At this point, I had to be all in on not answering the question, because I couldn’t say, “Oh yes, sorry…” and begin explaining why marketing has no certainties as if the last twenty seconds never happened. So I did what I had to do and said, “Nope.” Of course he asked, “Well, why not?”
And that is when it clicked. Like a volcano of lightbulbs erupting in my mind, I suddenly felt like a born again genius.
I quickly responded, “Because that is exactly what you thought I was going to do. Answer that question. Marketers try their best to predict what people will do or what they want, but people are unpredictable. That is why there are no certainties in marketing.”
I didn't get that job, but I did get a chuckle out of him after I said that. He was probably looking for a more tailored answer, like: there are no certainties in marketing because the digital landscape is ever-changing and new technology allows us to build greater user experiences…blah blah, you get the point. But I still believe my answer is more valuable than the cookie-cutter one he probably got from the candidate he hired. And that is the point – A/B testing is valuable precisely because humans are unpredictable, and any one of them might prefer one experience over another. In a way, testing comes naturally to all of us.
The statement ‘there are no certainties in marketing’ also speaks to how marketers feel after an A/B test. Just as ego can get in the way of conducting an A/B test, a misunderstanding of the word “test” can get in the way of conducting further ones. Why are marketers still afraid to run a test and have the outcome differ from what they anticipated? They consider the test a “fail” – when in fact it is a success, because you have ruled out a theory that doesn't work and you're one step closer to the one that does. Yet, over and over, when an A/B test doesn't yield the results a marketer expected, that marketer may be turned off by testing completely. They feel they have wasted time, and it is hard to admit that their audience didn't do what they thought it would. I get it: passionate marketers like to believe they have an absolute pulse on their target audience. But this way of thinking about A/B test results is completely backwards.
Remember, if you were certain about something, you wouldn't need a test. The fact that you conducted a test in the first place is an admission that the results could differ from what you hypothesized – a hypothesis is an educated guess. Second, I encourage you not to treat unexpected A/B test results as a failure, or as proof that you aren't as good at your job as you thought, because neither is true. You gain more from results that surprise you than from results that merely confirm what you already believed.
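To make "educated guess" concrete: before you can call either version the winner, you have to ask whether the difference you observed could just be random noise. Here is a minimal sketch of that check – a standard two-proportion z-test in plain Python. The visitor and conversion numbers are entirely made up for illustration:

```python
from math import erf, sqrt

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns the probability that a difference
    at least this large would appear by chance if A and B truly perform
    the same (the two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # combined conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # normal CDF via the error function, stdlib only
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical campaign: version A converted 200 of 5,000 visitors,
# version B converted 260 of 5,000.
p = ab_test_p_value(200, 5000, 260, 5000)
print(f"p-value: {p:.4f}")  # well under 0.05 -> the lift is likely real
```

If the p-value came back large instead, that would not be a "failed" test – it would be the test telling you the two experiences perform about the same, which is itself something you did not know before.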
In 2000, Google ran its first-ever A/B test to figure out how many search results should populate for each query. That test displayed different data sets and had to be conducted a second time because of slow page load times. Even the mighty Google learned something from its first A/B test! But that certainly wasn't a failure. A decade later, Google was running thousands of A/B tests each year to gain further insight into how users think, react and navigate on the web.
Last week I watched my 7-year-old cousin take two chips and dip each one into a different sauce to see which he liked better. Buffalo dip was more appealing to him than guacamole – not my style, but essentially that is an A/B test! If I were his parent, I would save money on the guacamole by purchasing only Buffalo dip next time.
What is the point? A/B testing is natural to us. We are always looking for the experience that better suits our lifestyle or current objective. You know this. Google knows this. My 7-year-old cousin may not, but soon he will know it too. I encourage every marketer to keep A/B testing and keep pushing the limits of how well they know their audiences or target customers, and of how much more efficient they can make a user experience in reaching their goals. So there you go, I kept my word – I didn't give you an article that recommends platforms for A/B testing, lists the best elements on your site to test, or tells you how long to collect data. I know you've heard those tips before. Instead, I hope a fresh way of looking at and thinking about A/B testing sparks your desire to keep doing it, or even to do it more. Now, wasn't this article more fun than the last one you read on A/B testing?
This was an A/B test in itself.