How to Conduct A/B Testing Using Google Analytics

- October 22, 2014

Where should I put my call-to-action? What’s the right messaging for this campaign? What color works better for this button – blue or green?

If you’ve ever gotten lost in the minutiae of a campaign, you’ve likely asked yourself similar questions. The truth is, you can’t really know what will work until you’ve tested it. So today, we’re talking about A/B testing.

What You Need to Get Started

The first thing you need to conduct A/B testing is a hypothesis; putting a hypothesis to the test is exactly what A/B testing is for. For example, if you hypothesized that a blue button would convert better than a green one, you could create two versions of a page: one with a blue button and one with a green button.

Then, you’d log into your Google Analytics account, and navigate to the Content Experiments section under Behavior:

[Screenshot: the Experiments section in Google Analytics]

Then, name your experiment (for example, Blue Button vs. Green Button) and select the metric you want to test. Google Analytics only lets you test one metric at a time. You can also test a custom metric by creating a custom goal, such as an Event.
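Event hits are usually fired from the page by the Google Analytics JavaScript snippet, but Universal Analytics also accepts hits sent server-side through the Measurement Protocol. A minimal sketch in Python of building such an event hit (the property ID `UA-XXXXX-Y` and the category/action values are placeholders, not from this article):

```python
from urllib.parse import urlencode

def build_event_hit(tracking_id, client_id, category, action, label=None):
    """Build a Universal Analytics Measurement Protocol event payload."""
    params = {
        "v": "1",            # protocol version
        "tid": tracking_id,  # property ID, e.g. "UA-XXXXX-Y" (placeholder)
        "cid": client_id,    # anonymous client ID
        "t": "event",        # hit type
        "ec": category,      # event category
        "ea": action,        # event action
    }
    if label:
        params["el"] = label  # optional event label
    return urlencode(params)

payload = build_event_hit("UA-XXXXX-Y", "555", "button", "click", "blue")
# To record the hit, POST this payload to
# https://www.google-analytics.com/collect
```

Once the event is being recorded, you can point a custom goal at it and use that goal as the experiment's objective metric.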

[Screenshot: selecting the experiment’s metric and goal]

Once you’ve set up the parameters, configure the experiment by adding the two URLs you’re testing:

[Screenshot: configuring the experiment URLs]

Then, add the experiment’s GA tracking code to the pages you’re testing. You can access the code by clicking “Manually insert the code,” or have it emailed to your webmaster.

[Screenshot: the experiment’s GA tracking code]

Once your code has been implemented, you can review and start your experiment.

Best Practices for Running A/B Tests

When creating your experiment, make sure to target enough of your web traffic to ensure a statistically meaningful sample. You don’t have to include 100% of your traffic in the experiment, but you should include enough visitors that the test can reach a conclusive result in a reasonable amount of time.
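How much traffic is “enough” depends on your baseline conversion rate and the size of the lift you hope to detect. One common way to estimate it up front is the standard two-proportion sample-size formula; the sketch below assumes 95% confidence and 80% power, and the 10%-vs-12% conversion rates are illustrative numbers, not figures from this article:

```python
import math

def sample_size_per_variant(p_base, p_variant, z_alpha=1.96, z_beta=0.8416):
    """Approximate visitors needed per variant for a two-proportion test
    (defaults: 95% confidence, 80% power)."""
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    effect = p_variant - p_base
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# e.g. detecting a lift from a 10% to a 12% conversion rate:
visitors_needed = sample_size_per_variant(0.10, 0.12)
```

In this example each variant needs a few thousand visitors, which is why very small differences on low-traffic pages can take a long time to test.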

And whatever you do, don’t forget about your test once it’s running. You’re experimenting in real time, after all, and if half of your traffic is landing on the “losing” page, you’ll want to act quickly: either shift more of your traffic to the clear winner, or abort the test altogether.
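Before declaring a winner mid-test, it’s worth checking that the gap between the two pages is more than noise. A quick sketch of a two-proportion z-test (the conversion counts below are made-up examples, not data from this article):

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for the difference
    between two conversion rates (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# e.g. 120 conversions out of 1,000 visits on page A
# vs. 158 out of 1,000 on page B:
z, p = z_test_two_proportions(120, 1000, 158, 1000)
```

If the p-value is still large, the “winner” you see in the report may just be random variation, and ending the test early would be premature.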

At the end of the day, if there’s no clear winner, test again. Try a different color, move your button around, test, test, and test some more. That’s what analytics is for, after all!


© 2017 MoreVisibility. All rights reserved