A/B testing

A/B testing compares two or more versions of a webpage, app screen or other digital experience to determine which one performs better. Conversion rates and user-engagement metrics reveal whether a specific version had a neutral, positive or negative effect. The results help you improve campaigns and customer experience, lift conversion rates and sharpen audience targeting.


Rise to the challenges of A/B testing.

To be successful with A/B testing, you must address system and governance challenges that can affect outcomes and campaign decisions:

- Insufficient traffic volume
- Lack of defined testing rules
- Undefined website goals
- No targeted metrics


Reap the benefits of A/B testing.


A/B testing helps you to optimise campaigns by providing measurable outcomes that can be compared to reveal preferences, interests, risks and more.
Compare results and take action.
By running tests that reveal comparative audience behaviours, you can generate actionable data that helps improve campaigns and strengthen audience segmentation and targeting.
Data-driven decisioning.
Testing allows you to rely on statistical confidence to refine targeting and messaging rather than leaning on executives or gut instinct.
Improved conversion rates.
By revealing better engagement rates through iterative testing of multiple content elements, offers and products, you can optimise campaigns to yield better results.
Reduce campaign risks.
A/B tests help reduce the risk of launching ill-informed campaigns by creating a testing culture founded on data.
Adobe can help.
You can use the automated testing and optimisation capabilities in Adobe Target to discover the experiences, messages and offers that best engage and convert visitors, without coding or set-up hassles.


A/B testing is just one piece of the analytics puzzle.

Learn how Adobe helps brands get a complete view of their analytics to deliver personalised experiences.
“We use Target to prove out - or disprove - hypotheses. This gives a little more power, a little more freedom.”
- LJ Jones, Director of Optimisation, Progrexion



“We didn’t do much A/B testing at all… now it’s front and centre in everything we do.”
- Ellen Lee, Senior Vice President of Global Digital, Hyatt


A/B Testing FAQ.

Is A/B testing foolproof?
No. Effective testing requires a strong hypothesis and a commitment to best practices related to design and traffic requirements.
Does A/B testing hurt SEO?
No, provided you follow best practice. Despite popular myths about duplicate-content penalties, search engines permit A/B testing as long as you avoid cloaking, use rel="canonical" on variant URLs, serve temporary (302) redirects and end tests once you have a result. Well-run tests improve the user experience of your site, which supports rather than harms SEO.

How many variations should I run?
Test as many variations as you need, but keep traffic requirements in mind: the more variations you run, the more traffic each one needs before its results are statistically significant. You can also compare variations across different audience segments.
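As a rough guide to those traffic requirements, the standard normal-approximation formula estimates how many visitors each variation needs before a given lift becomes detectable. This is an illustrative sketch using only Python's standard library; the baseline rate, expected lift, significance level and power below are hypothetical examples, not figures from any Adobe product:

```python
from math import ceil, sqrt
from statistics import NormalDist

def visitors_per_variation(baseline, expected, alpha=0.05, power=0.8):
    """Approximate sample size per variation for a two-sided
    two-proportion z-test (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for power = 0.8
    p_bar = (baseline + expected) / 2               # average rate under H0
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(baseline * (1 - baseline)
                                 + expected * (1 - expected))) ** 2
    return ceil(numerator / (baseline - expected) ** 2)

# Hypothetical example: detecting a lift from a 5% to a 6% conversion rate
# needs roughly 8,000 visitors per variation at 5% significance, 80% power.
n = visitors_per_variation(0.05, 0.06)
```

Note how quickly the requirement grows: halving the detectable lift roughly quadruples the traffic each variation needs, which is why adding variations without adding traffic undermines significance.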
Which design elements should be tested?
Any page element, such as shapes, colours, sizes and messaging, can be tested. You can test entire digital experiences, single pages or full customer journeys for their impact on metrics and conversion goals.
How is A/B testing different from multivariate testing?
An A/B test compares whole versions of an experience, typically changing one element or set of elements at a time. Multivariate tests examine multiple combinations of elements at once, which can reveal the relative contribution each element makes as they interact to drive engagement. Because every combination splits the traffic further, multivariate tests need considerably more visitors to reach significance.
What is a null hypothesis?
The null hypothesis is the default assumption that there is no real difference in engagement or conversion rates between the versions being tested, and that any observed difference is due to sampling or experimental error. A test result is statistically significant when the data lets you reject the null hypothesis.
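To make that concrete, a pooled two-proportion z-test is one common way to decide whether to reject the null hypothesis. The sketch below uses only Python's standard library, and the visitor and conversion counts are invented for illustration:

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates
    (pooled two-proportion z-test, normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)        # shared rate assumed under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts: version A converts 50 of 1,000 visitors,
# version B converts 80 of 1,000.
p = ab_test_p_value(50, 1000, 80, 1000)
reject_null = p < 0.05   # significant at the 5% level
```

If the p-value stays above your chosen significance level, you fail to reject the null hypothesis: the test has not shown a real difference between the versions.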

Want to know more?

Find out how Adobe can help you and your business.
Call +44 1628 590 300.