A/B testing is about KPIs

When you are looking at making something better on your site, the first step should be understanding the context. Your goal is improving an element, a flow, a process, yes, but in which way? Optimization begins with gathering the data we need, narrowing it down to a meaningful indicator, and only then comparing it against a control group to see if we did indeed improve.

There are plenty of tools online to run an experiment, some of them free, and setting up an A/B testing tool is not a complex task. The key to successful A/B testing is establishing the exact metric we are going to use to judge our challengers and fully understanding the context of that metric.

There is no perfect A/B experiment, and even with a really well-thought-out analysis, your collected data might be skewed one way or another. Maybe a specific AdWords campaign targeting long-tail keywords brought in traffic where conversions were expected to be low. Maybe a group of desktop visitors is landing on a deep page that is not what they expected, and now you have low conversions. That does not mean we cannot gather enough data to make a good, educated decision on what works best.

Case study: IBPromotions

IBPromotions wanted to improve their wedding DJ page, hoping to increase conversions. They produced a short video to illustrate the product and open the sales pitch, intended to replace the main hero image, hoping to boost their sales for the season.

So the hypothesis was formulated:

Will a video, which requires an extra interaction to play, convince more potential buyers to engage in asking for a quote?

Once we agreed that the KPI would be a correctly submitted form with all inputs filled in and valid, we set up a Google Tag Manager event for analytics to receive.
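
To make that concrete, here is a minimal sketch of wiring such an event, assuming a quote form with the id "quote-form" and an event named "quote_form_submit"; both names are illustrative, not IBPromotions' actual markup:

    // Hypothetical form id and event name, for illustration only.
    interface DataLayerWindow extends Window {
      dataLayer?: Record<string, unknown>[];
    }
    const w = window as DataLayerWindow;

    const form = document.querySelector<HTMLFormElement>('#quote-form');

    form?.addEventListener('submit', (event) => {
      // checkValidity() runs the native HTML5 constraint validation,
      // so the KPI only counts fully and validly filled-in submissions.
      if (!form.checkValidity()) {
        event.preventDefault();
        return;
      }
      // Push the conversion into the dataLayer; a GTM trigger listening
      // for this event forwards it to Analytics.
      w.dataLayer = w.dataLayer || [];
      w.dataLayer.push({ event: 'quote_form_submit' });
    });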

After that, we set up a Google A/B experiment with the two variants: the control hero image (50%) against the new video challenger (50%). The container was given a skeleton preload structure to avoid layout shifts and let the rest of the content column render independently.
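
As a rough sketch of that container, assuming a 16:9 hero slot (the ratio and styling are illustrative): reserving the final dimensions up front means neither variant shifts the column below it when it loads.

    // Reserve the hero's final footprint before either variant arrives.
    function createHeroSkeleton(): HTMLElement {
      const slot = document.createElement('div');
      slot.style.width = '100%';
      slot.style.aspectRatio = '16 / 9';  // assumed final hero ratio
      slot.style.background = '#e0e0e0';  // grey placeholder while loading
      return slot;
    }

    // Swap the placeholder for the chosen variant without moving anything.
    function fillHero(slot: HTMLElement, content: HTMLElement): void {
      slot.style.background = 'none';
      slot.replaceChildren(content);
    }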

Once the page was complete, the deployed frontend Google A/B experiment JavaScript would do its magic and deliver one of the versions, also pinning via a cookie which version this specific user was bound to see.
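
The experiment script handles that pinning internally, but the logic amounts to something like this sketch (the cookie name, values, and 90-day lifetime are assumptions, not what the Google script actually writes):

    // Simplified stand-in for the assignment the experiment script does.
    function getVariant(): 'control' | 'video' {
      // Returning visitor: honour the variant pinned in the cookie.
      const match = document.cookie.match(/(?:^|; )ab_hero=(control|video)/);
      if (match) {
        return match[1] as 'control' | 'video';
      }
      // New visitor: assign 50/50 and pin the choice for 90 days.
      const variant = Math.random() < 0.5 ? 'control' : 'video';
      document.cookie = `ab_hero=${variant}; max-age=${60 * 60 * 24 * 90}; path=/`;
      return variant;
    }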

The experiment ran for 3 months, and the results were clear and the hypothesis correct: the new video outperformed the old hero banner by a whopping 30% more conversions.
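
For readers who want to check whether an uplift like this is more than noise, a two-proportion z-test is the usual tool. The visitor and conversion counts below are invented for illustration; the real figures are not published here:

    // Two-proportion z-test; |z| > 1.96 means significant at the 95% level.
    function zTest(convA: number, nA: number, convB: number, nB: number): number {
      const pA = convA / nA;
      const pB = convB / nB;
      const pooled = (convA + convB) / (nA + nB);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
      return (pB - pA) / se;
    }

    // Hypothetical counts: 2.0% control vs 2.6% video, a 30% relative uplift.
    console.log(zTest(200, 10_000, 260, 10_000).toFixed(2)); // ≈ 2.83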
