
Why Traditional A/B Testing is Being Consigned to the History Books

Lauren Burgess  |  April 19, 2018


As a marketer myself, I can understand the appeal of A/B testing. It’s relatively simple, it promises to increase conversions and it’s data-led, meaning we know our optimization efforts are being led by pure, objective fact. It remains a favored method for web optimizers because tests are easy to set up and analyze, and, with enough time and painstaking experiments, it works.

So, what went wrong?


How Optimizers Stopped Worrying and Learned to Love the A/B Test

Back in the late 90s when brands were just excited to have a functioning website, competition was much lower and there wasn’t really a need to be constantly optimizing for the best possible customer experience. Take a look at the example below that was designed in 1995 for Trek Bikes. I think it’s safe to say that we’ve come a long way.

[Image: the Trek Bikes website from the mid-90s]


As time went on, the race heated up, and companies had to step back and look at how they could create an experience that would attract and delight customers. With no guidelines in place, the experiments could begin. As you know, there’s an overwhelming number of elements and factors on a website that can potentially affect conversions, and many webmasters found themselves completely stumped. Debating the merits of blue buttons over red and Comic Sans over Times New Roman (have a look at some older websites; these fonts were incomprehensibly popular at the time) ate up hours every day.

After this, is it any wonder that A/B testing was hailed as a heaven-sent solution to their optimization woes? Suddenly they could test every aspect of the site, two variations at a time, and get concrete answers as to which version was the most effective. From here began an over-reliance on A/B testing. Whenever two or more suggestions were in the ring, the default response was to start A/B testing them. Indiscriminate testing is a huge waste of time and without proper research, context and purpose, it only holds you back from achieving your goals.

Moving on from Traditional A/B Testing

Testing just one small aspect of your digital property at a time is massively time-consuming, and it takes a while to gather enough data for a result to be considered statistically significant; VentureBeat puts the figure at 25,000 visitors for each test. This may not be a huge number for enterprises that can achieve that traffic in a relatively short period, but it still means that the “losing” variant could be costing them thousands of conversions for the duration of the test.

VentureBeat adds:

“Things often go wrong in the preparation phase. Using classical statistical techniques requires setting a sample size and committing to a minimum detectable effect in advance, a crucial step that is frequently overlooked. Many testers also fail to realize that testing several goals and variations at once can increase errors. And, finally, data pollution may cause false positives.”
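To make that concrete, here is a minimal sketch of the sample-size arithmetic the quote describes. It is not from the article; it assumes a classical two-sided, two-proportion z-test, and the baseline conversion rate and minimum detectable effect below are illustrative numbers, not real figures.

# A minimal sketch, assuming a classical two-sided two-proportion z-test; the
# baseline rate and minimum detectable effect (MDE) are illustrative only.
from scipy.stats import norm

def visitors_per_variant(baseline_rate: float, mde: float,
                         alpha: float = 0.05, power: float = 0.8) -> int:
    """Visitors needed in EACH variant to detect an absolute lift of `mde`
    over `baseline_rate` at significance level `alpha` with power `power`."""
    p1, p2 = baseline_rate, baseline_rate + mde
    p_bar = (p1 + p2) / 2
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for a two-sided test
    z_beta = norm.ppf(power)            # z-score for the desired power
    n = ((z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2) / mde ** 2
    return int(n) + 1

# Detecting a 0.5-point lift on a 5% baseline needs roughly 31,000 visitors
# per variant, the same order of magnitude as the figure cited above.
print(visitors_per_variant(baseline_rate=0.05, mde=0.005))

Shrink the effect you want to detect or lower the baseline rate and the required traffic balloons, which is exactly why skipping this preparation step, as the quote warns, produces unreliable tests.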

Moving focus from conversion rate optimization to digital experience optimization allows optimizers to step back from obsessing over individual elements and consider the entire journey as a cohesive whole. With the right technology, it’s possible to automatically identify improvement opportunities and make significant changes faster than ever before.

What’s Needed for Digital Experience Optimization?

Don’t get me wrong, there is still a need for testing. But we must change the way we approach it. The amount of data we can collect and the insights we now have mean that any tests can be conducted based on solid hypotheses and large gains can be made with every change.

Traditional A/B testing implies that there is an “optimal” version of a web page that will appeal to everyone. With more nuanced data about customers, businesses are now in a position to deliver personalized experiences aimed at the individual as opposed to the masses.


The evolved version of A/B testing is much more sophisticated. Rather than asking whether a button should be blue or red, we’re comparing levels of personalization. The technology pioneered by adopters of A/B testing is now being applied to serve experiences catered to the needs of an individual. This is the real innovation that comes from having visibility into how customers feel: personalization at scale.

Undergoing the kind of digital transformation required to begin digital experience optimization based on high-level data science isn’t easy. But with digital experience intelligence platforms like Decibel, any enterprise can build a sophisticated optimization process that delivers a truly exceptional journey for its customers.

If you’re ready to stop A/B testing based on shallow insights, read our guide on how businesses are maturing beyond CRO and embracing digital transformation. It’s packed full of tips to help you find your place on the Digital Experience Maturity Curve and the steps you must take to improve.



Written on April 19, 2018 by:

Lauren Burgess

Lauren is a Digital Marketing Executive at Decibel.
