A/B Testing
Pritam Roy
Introduction
What if the secret to doubling your conversions was as simple as changing a headline or button color in your landing page, ad, or email?
It’s not magic. That’s the power of A/B testing!
Done right, A/B testing replaces guesswork with evidence. But with so many different elements on your website, where do you begin?
In this post, we’ve gathered 30 incredible A/B testing examples that showcase how small experiments can lead to big wins. We’ve picked these examples from real-world A/B testing case studies to inspire and guide you to make smarter, data-backed decisions.
Whether you’re optimizing landing pages, emails, or ads, these examples hold actionable insights you can apply right away.
Let’s get started.
Top 30 A/B Testing Examples You Need To Check Out
1. Campaign Monitor: How Dynamic Text on Landing Page Boosted Conversions by 31.4%
When optimizing PPC campaigns, relevance is key. ConversionLab, a Norwegian digital agency, executed an A/B testing project for Campaign Monitor that reveals how small changes can yield significant results.
Problem: Despite attracting traffic using PPC campaigns, Campaign Monitor was still struggling to drive conversions. ConversionLab hypothesized that aligning landing page text with user search queries could improve perceived relevance.
Solution: Using Unbounce’s dynamic text replacement (DTR), they customized landing page headlines and CTAs to mirror users’ search terms. For example, if someone searched “design emails,” the headline swapped the word “make” for “design.”
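If you want to experiment with this idea on your own pages, here is a minimal, hypothetical sketch of how dynamic text replacement can work in the browser. The query parameter name, the data attribute, and the word mapping are illustrative assumptions, not Unbounce’s actual implementation.

```typescript
// Minimal dynamic text replacement sketch (illustrative only, not Unbounce's code).
// Assumes the ad's destination URL carries the search term, e.g. ?keyword=design%20emails
function applyDynamicText(defaultVerb: string): void {
  const params = new URLSearchParams(window.location.search);
  const keyword = params.get("keyword"); // hypothetical parameter name
  const headline = document.querySelector<HTMLElement>("[data-dtr-verb]");
  if (!headline) return;

  // Swap the default verb ("make") for the verb the visitor actually searched for ("design").
  const verb = keyword?.toLowerCase().includes("design") ? "design" : defaultVerb;
  headline.textContent = headline.textContent?.replace(defaultVerb, verb) ?? "";
}

applyDynamicText("make");
```

The idea is simply that the headline a visitor sees echoes the words they just typed into the search engine, which makes the page feel more relevant.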
Results: When the experiment ended after 77 days, Campaign Monitor had seen a 31.4% increase in trial sign-ups.
2. HubSpot Academy: How Hero Image Changes Can Impact User Behavior and Conversions
Hero images on the homepage can shape user behavior and conversions, as this A/B testing case study from HubSpot Academy reveals.
Problem: Despite over 55,000 page views, only 0.9% of HubSpot Academy users watched the homepage video. Of the viewers, nearly half finished watching it.
A/B test method: HubSpot ran an A/B testing project with three variants, testing colorful imagery, animations, and headlines.
Variant A: The control
Variant B: With colorful text and shapes
Variant C: With animated images, color, and movement.
Results: Variant B, featuring vibrant visuals and a dynamic headline, boosted signups by 6%, projecting 375 more monthly signups. This A/B testing website example highlights how small tweaks can improve engagement and conversions.
3. Grene: How a Mini Cart Redesign Doubled Purchases
Grene, a top Polish eCommerce brand selling agricultural goods, used A/B testing to optimize its mini cart design and achieved a 2x boost in purchases.
Problem: Users misinterpreted the "Free Delivery" USP as a clickable button, struggled to find item totals, and had to scroll for the "Go To Cart" CTA. This was causing a lot of friction in the purchase process.
A/B testing method: Using VWO, Grene tested a redesigned mini cart against the original.
Variation A: The control
Variation B: With a prominent top CTA, visible item totals with “remove” buttons, and a larger “Go To Cart” button.
Results: Over the 36-day experiment, Grene improved cart visits by 3.05%, raised conversion rates by 1.96%, and doubled purchase quantities. This is another compelling A/B testing project worth emulating.
4. FSAstore.com: How a Simplified Website Navigation Increased Revenue Per Visitor by 53.8%
People prefer websites that are easy to use and navigate. According to a study, 94% of users ranked easy navigation as the most important website feature.
Thus, simplifying a website’s navigation can work wonders for eCommerce sites, as FSAstore.com discovered in this A/B testing website example.
Problem: FSAstore.com serves 35+ million Americans with FSAs, but its category pages overwhelmed users with so many options that purchases were being deterred.
A/B testing method: To address this, the team ran an A/B testing project by removing the subheader from site navigation and simplifying the design. They compared the original layout to a streamlined version.
Results: This A/B testing case study revealed a 53.8% increase in revenue per visitor, showcasing how small changes can lead to big wins.
5. Going: How Changing Three Words in CTA Led to a 104% Increase in Conversions
Going is a key player in the competitive online travel market, helping travel enthusiasts see the world without breaking the bank. However, Going struggled to convert visitors into premium subscribers. Despite great offers, its “Sign up for free” CTA wasn’t conveying the full value of the premium plan.
Problem: Their subscription model lacked clarity, so many visitors were hesitant to commit.
A/B testing method: Going tested a simple change, replacing “Sign up for free” with “Trial for free,” using Unbounce’s A/B testing solution to highlight the premium plan’s value.
Variant A: With “Sign Up For Free” CTA
Variant B: With a “Trial For Free” CTA
Results: This minor tweak led to a 104% increase in trial starts month-over-month. This is another inspiring A/B testing case study showing how small tweaks can significantly impact conversions.
6. Christopher Cloos: How Switching to a Conversational Popup Boosted Conversions by 15.38%
In a competitive sunglasses market, Christopher Cloos faced the challenge of converting first-time visitors into customers.
Problem: New users felt overwhelmed on their first visit because of poor messaging, which led to low engagement.
A/B testing method: The team tested a conversational popup instead of the classic one, engaging visitors by asking what products they were interested in while offering the same deal.
Results: The change led to a 15.38% increase in conversions. This A/B testing case study is a prime example of how small messaging tweaks can make a big difference in conversions.
7. Obvi: How Adding a Countdown Timer Increased Conversions by 25.17%
Obvi, a women’s health and wellness brand known for its collagen, superfood, and hormone products, struggled to boost conversions.
After examining their analytics and tracking customer behavior, Obvi noticed a gap: visitors added items to their cart but abandoned them before purchasing.
Problem: High cart abandonment rates despite strong initial interest.
A/B testing method: To address this, Obvi tested a cart abandonment popup offering a 10% discount with an auto-redeem feature, a dynamic coupon code, and a countdown timer to spark urgency.
Results: This popup led to a 25.17% boost in conversions, demonstrating the power of urgency and personalized offers.
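For readers curious how such a popup might be wired up, below is a rough, hypothetical sketch of a cart-abandonment popup with a countdown timer and an auto-applied coupon. The element IDs, discount code, timer length, and exit-intent trigger are assumptions for illustration, not Obvi’s actual setup.

```typescript
// Hypothetical cart-abandonment popup with a countdown timer (not Obvi's actual code).
// Assumes the page contains #abandon-popup, #coupon-code and #countdown elements.
const COUPON = "SAVE10";          // illustrative coupon code
const OFFER_SECONDS = 10 * 60;    // 10-minute urgency window (assumption)

function showAbandonmentPopup(): void {
  const popup = document.getElementById("abandon-popup");
  const codeEl = document.getElementById("coupon-code");
  const timerEl = document.getElementById("countdown");
  if (!popup || !codeEl || !timerEl) return;

  popup.style.display = "block";
  codeEl.textContent = COUPON;               // show the code the visitor will redeem
  let remaining = OFFER_SECONDS;

  const tick = setInterval(() => {
    const m = Math.floor(remaining / 60);
    const s = String(remaining % 60).padStart(2, "0");
    timerEl.textContent = `${m}:${s}`;       // e.g. "9:59"
    if (remaining-- <= 0) {
      clearInterval(tick);
      popup.style.display = "none";          // the offer expires with the timer
    }
  }, 1000);
}

// Trigger when the cursor leaves the viewport (a common exit-intent signal).
document.addEventListener("mouseout", (e) => {
  if (!e.relatedTarget && e.clientY <= 0) showAbandonmentPopup();
});
```

The timer itself is purely psychological: the offer visibly counts down to zero, nudging hesitant shoppers to finish checkout before it disappears.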
8. Expoze: How a Homepage Background Design Boosted CTA Clicks by 25%
A visually compelling website can guide user behavior and boost engagement. Expoze.io faced a challenge: their homepage had poor contrast, making navigation difficult and affecting CTA visibility.
Problem: Low contrast and poor navigation on the homepage hindered user interaction and kept the CTA from standing out.
A/B testing method: The team tested several designs to enhance visual appeal and highlight key sections.
They also used AI-generated eye-tracking and A/B heatmap analysis to optimize user focus.
Results: The new design increased CTA clicks by 25%, with version B attracting 40% more attention to crucial areas. This is an inspiring A/B testing case study for marketers looking to optimize their homepage to boost engagement and conversions.
9. First Midwest Bank: How A/B Testing Redefined Conversion Strategies
First Midwest Bank, a community bank offering personalized financial solutions for individuals, families, and businesses in Missouri, broke the norm of the traditionally conservative banking industry.
This A/B testing case study demonstrates how the bank explored innovative A/B testing strategies to boost conversions.
Problem: Banking sites are bound by strict brand guidelines and traditional aesthetics, which can stifle creativity and hinder customer trust, especially when asking for sensitive information.
A/B Testing Method: The bank tested human imagery tailored to specific state demographics and challenged the "above the fold" rule, placing forms below the fold.
Variation A: With the signup form on the right-hand side
Variation B: With the signup form below the fold
Results: These creative tests resulted in a 195% overall conversion increase, proving the power of innovative A/B testing in rigid industries.
10. WorkZone: How a Simple Testimonial Page Design Change Boosted Form Submissions by 34%
WorkZone, a US-based project management software company, sought to optimize its lead generation page, which featured a customer testimonial section next to the demo request form.
Problem: Including customer testimonials on important pages can increase conversions by as much as 34%. But WorkZone noticed that the customer testimonial logos placed next to the demo request form distracted visitors, reducing the likelihood of form submissions.
A/B testing method: WorkZone tested a new design by changing the testimonial logos from color to black and white to see if it would draw more attention to the form.
Variation A: Control with colored testimonial logos
Variation B: With black and white logos
Results: After 22 days, the test revealed a 34% increase in form submissions. This A/B testing case study also highlights the effectiveness of small design tweaks in driving conversions.
11. Thrive Themes: How Adding Testimonials to a Sales Landing Page Increased Sales by 13%
According to a Nifty study, 36% of the top-performing landing pages feature testimonials from satisfied customers.
Thrive Themes decided to rethink their sales landing page by adding a human touch: customer testimonials.
Problem: Their current banner focused on product features but lacked the emotional pull of real customer experiences.
A/B test method: The team launched a 6-week test, comparing the original page to a version featuring testimonials that highlighted customer satisfaction and success stories.
Results: The page with customer testimonials drove a 13% sales increase, raising the conversion rate from 2.2% to 2.75%. This A/B testing example proves that relatable stories can turn browsers into buyers.
12. Performable: How a Bold Red CTA Button Boosted CTR by 21%
Which CTA button color drives more click-through rate?
This A/B test example from Performable answers that question. The marketing team wanted to see whether button color could impact user actions on their homepage.
Problem: Amid debates over button colors, the team questioned whether the “safe” green button, symbolizing “go,” was truly optimal for their homepage conversions.
A/B test method: They tested two identical homepages, swapping the button color from green to red.
The red button aimed to stand out visually, even though it traditionally signals “stop.”
Results: The red button achieved a 21% higher click-through rate, proving bold, attention-grabbing designs can trump convention in driving user actions.
13. HubSpot’s Email A/B Testing Example
Does text alignment in emails influence CTA clicks?
Here is their A/B testing case study.
Granted, engaging email subscribers is no small feat. After all, the average person receives at least 120 emails per day—which include newsletters, personal communications, promotions, and more.
So HubSpot tested how text alignment affects click-through rates.
Problem: HubSpot aimed to optimize the user experience in their weekly emails, hoping improved readability would lead to more CTA clicks.
A/B test method: The control emails used centered text, while variant B featured left-aligned text for comparison.
Results: Surprisingly, left-aligned emails underperformed, with fewer clicks overall. Only 25% of left-aligned emails surpassed the control, proving that alignment can subtly influence user engagement and effectiveness in email marketing.
14. Avon: How Personalizing Product Category Page Increased Conversion Rate by 96%
Today’s customers want brands to personalize their online experiences.
Here is the proof: 72% of customers say they are more likely to buy from a brand if it can consistently provide them with a personalized experience.
Avon conducted an A/B testing project aimed at delivering a personalized touch for makeup buyers.
Problem: Avon sought to replicate the personal connection of in-store shopping online. The underperforming makeup category was a prime target for improvement, with surveys showing that “eye color” filters were key to engagement.
A/B Test Method: Avon introduced a widget asking users about their eye color and customizing product recommendations. They also used pop-ups, smart ribbons, and post-purchase thank-you messages to enhance the experience.
Results: This 10-day test achieved a 96.63% conversion rate increase and 73X more checkout views.
15. Wistia: How Modifying Their Pricing Page Doubled Their Sales
When looking for a video platform, you’ll likely choose between YouTube, Vimeo, and Wistia. Competition in this category is fierce, so Wistia had to be smart about increasing sales.
Problem: Visitors already familiar with Wistia were deterred by complex pricing and unclear value, which led to high bounce rates.
A/B test Method: The original pricing page tied features to payment tiers.
Wistia tested a simpler structure: all plans offered full features, with higher tiers based on video upload limits.
Results: After a few months of A/B testing, Wistia doubled sales and boosted revenue by 46%. Streamlined, value-focused pricing empowered customers to choose with confidence.
16. Underoutfit: Boosting Ad Returns with User-Generated Content
Paid advertising costs have surged in recent years.
As a result, Underoutfit decided to use A/B testing to optimize ad performance.
Problem: Underoutfit needed to enhance Facebook brand awareness while improving ROI.
A/B test method: They partnered with creators to develop branded user-generated content, pairing it with product ads. Through split testing, they compared traditional product ads to those combined with engaging, authentic videos. They also worked with Meta Creative Shop to ensure content aligned with social media best practices.
Results: The variant with branded content outperformed with a 47% higher click-through rate and a 28% increase in return on ad spend. This split testing example highlights the importance of using UGC in PPC campaigns.
17. Csek Creative: Redefining Homepage Tagline for Better Engagement
Your homepage tagline can make or break your efforts to reduce bounce rates on your website.
Csek’s homepage tagline was “Csek Creative is a Kelowna based digital agency that delivers the results that make business sense.”
Even someone who isn’t a professional copywriter can sense the vagueness in this tagline.
Problem: Csek Creative’s vague tagline left visitors puzzled, causing higher bounce rates. The goal? Test if a clearer, more specific tagline could reduce drop-offs.
A/B testing method: They replaced the original tagline with a sharper alternative: "Csek Creative is a digital agency that helps companies with their online and offline marketing needs."
This split testing example focused on clarity to increase engagement.
Results: After testing with 600 visitors, the new tagline boosted click-throughs to other pages by 8.2%.
Takeaway: This A/B testing case study shows the power of clarity in homepage messaging to improve click-through rates.
18. TechInsurance: How Having a Dedicated Landing Page Boosted PPC Conversions by 73%
Sometimes what you need isn’t boatloads of traffic but PPC campaigns optimized to convert the traffic you already get.
This is the challenge TechInsurance faced.
Problem: TechInsurance’s PPC traffic was directed to their general homepage, offering broad appeal but failing to address audience-specific needs. The team hypothesized that a tailored landing page could resonate better with PPC visitors.
Solution: They designed a dedicated landing page aligned with PPC ad messaging, creating a personalized experience for visitors.
This page was then A/B tested against the homepage.
Results: The specialized landing page drove a 73% increase in conversion rates, proving that aligning content with audience expectations is key. This A/B testing case study demonstrates how a thoughtful split test can optimize campaigns.
Pro tip: With the help of Fibr AI, you can create personalized landing pages that match every ad to boost conversions. See how Fibr AI’s ad personalization works here!
19. Beckett Simonon: How Incorporating Storytelling Increased Sales by 5%
Who doesn’t love stories? 92% of consumers want brands to employ storytelling in their ads.
This A/B testing example highlights the importance of creating ads that feel like a story.
The Problem: Beckett Simonon, a brand known for handcrafted leather shoes and sustainability, sought to improve paid acquisition effectiveness and boost conversions.
A/B testing method: They introduced a storytelling panel to their website, focusing on sustainability and craftsmanship.
By resonating with customer values, this A/B testing project delivered impactful results.
Results: Sales increased by 5%, and the campaign generated an annualized ROI of 237%. This A/B testing example highlights how emotional engagement through storytelling can outperform standard marketing tactics, making it a compelling case study for brands to emulate.
20. Mall.cz: Using Larger Product Images Increased Sales by 9.46%
A picture is worth a thousand words. It’s an adage the marketing world isn’t retiring anytime soon. But how big or small should that picture be?
Let’s find out from this A/B testing example by Mall.cz.
The Problem: Mall.cz, a leading Czech eCommerce platform, sought to enhance online sales by optimizing its product pages.
A/B Testing Method: In this A/B testing project, they tested two layouts. One featured moderately sized product images with descriptions below them, while the other showcased larger images with descriptions revealed on mouseover.
This split testing example aimed to improve user experience and drive conversions.
Results: The second variation boosted sales by 9.46%, proving that larger visuals with interactive elements resonate better with customers. This example of A/B testing highlights how thoughtful design can significantly impact eCommerce success.
21. Zalora: How Optimizing Product Pages Increased Checkout Rate by 12.3%
Zalora is a leading fashion eCommerce site in Asia-Pacific. Given the serious competition in the APAC online fashion market, brands must ensure competitive pricing while assuring quality, accessibility, and ease of transaction to stand out from the crowded space.
The Problem: Zalora suspected that customers were unaware of key features like free returns and delivery because these perks weren’t prominent on the original product page.
A/B Testing Method: To address this, Zalora redesigned their product pages, emphasizing these perks. They then ran a split testing project, comparing the original layout to one with improved visibility for the free return policy and standardized call-to-action buttons.
Results: The optimized product page increased checkout rates by 12.3%, demonstrating that clarity and consistency in design can boost customer confidence and drive sales. This A/B testing project highlights the value of strategic layout improvements.
22. L’Axelle: How Using Action Words in the Headline Boosted Conversions by 93%
There is power in using action words in your headlines. That’s what this A/B testing case study is about to prove.
The problem: L’Axelle, a brand offering sweat reduction products, struggled with low conversion rates on its landing page. Its original headline was the passive “Feel fresh without sweat marks.”
A/B testing method: They tested two headline variations: the passive original and one using action words (“Put an end to sweat marks!”). The goal was to see if a more direct, engaging message would boost conversions.
Results: The action-oriented headline led to a 93% increase in conversions, proving the power of strong, compelling copy in driving customer action. This A/B testing example underscores the importance of clear and motivating language in marketing.
23. Codecademy: How Optimizing Pricing Plans Increased Annual Plan Subscriptions by 28%
Codecademy is a renowned American online platform that offers free coding classes in 12 different programming languages including SQL, Java, and others.
Even though their free classes were successful, their annual subscription plan was not performing well.
The Problem: Codecademy struggled to convert users from their free plans to the annual pro subscription. Despite strong traffic to the pricing page, sign-ups remained low.
A/B testing method: The original pricing page promoted the 6-month plan as the most popular and expressed the savings as a percentage.
The team tested a new layout that reordered the plans so the annual subscription appeared first, on the left of the page, and carried the “popular” label.
They also showcased savings in dollar amounts rather than percentages, applying the “Rule of 100” to appeal to user psychology: above a $100 price point, an absolute dollar discount reads as larger than the equivalent percentage, while below $100 the percentage reads as larger.
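To make the “Rule of 100” concrete, here is a small, purely illustrative sketch of how a pricing page might decide whether to display savings as a percentage or in dollars. The logic and the example prices are assumptions, not Codecademy’s actual implementation.

```typescript
// Rule-of-100 sketch (illustrative, not Codecademy's code):
// below a $100 price point, show the discount as a percentage; above it, show dollars saved.
function formatSavings(fullPrice: number, discountedPrice: number): string {
  const saved = fullPrice - discountedPrice;
  const percent = Math.round((saved / fullPrice) * 100);
  return fullPrice < 100 ? `Save ${percent}%` : `Save $${saved.toFixed(2)}`;
}

// Hypothetical prices: a $239.88 plan billed at $179.88 reads better as "Save $60.00" than "Save 25%".
console.log(formatSavings(239.88, 179.88)); // "Save $60.00"
console.log(formatSavings(39.99, 29.99));   // "Save 25%"
```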
Results: This A/B testing project boosted annual plan subscriptions by 28% and overall page conversions.
24. NuFACE: How Including an Incentive Increased Orders by 90%
Statistics reveal that the average eCommerce store loses 75% of sales due to cart abandonment.
NuFACE, a popular anti-aging skincare brand, sought to improve its online presence and boost sales.
The Problem: Despite customers showing high interest in its products, the company faced a challenge with cart abandonment. Customers were not finalizing their purchases.
A/B testing method: The company tested offering a free shipping incentive for orders over $75 to encourage customers to complete their purchases.
They compared the original checkout page against a variation that prominently displayed the free shipping incentive.
Results: This A/B testing project resulted in a 90% increase in orders and a 7.32% boost in average order value.
25. PayU: How Minor Changes to the Checkout Page Increased Its Conversions by 5.8%
PayU, a popular Indian fintech firm providing a wide range of financial solutions for local and cross-border customers in emerging markets, understands the importance of a simple, intuitive, and convenient checkout process.
However, the company faced a problem that this A/B testing case study aims to highlight.
The Problem: PayU was facing a high cart abandonment rate, which led to lost sales. Analysis revealed that its checkout page was a major contributor to drop-offs.
A/B testing method: PayU hypothesized that simplifying the form would increase conversions.
The original checkout form included an email address field. In the variation, PayU removed the email field and asked only for a mobile number.
Results: This simple adjustment led to a 5.8% boost in conversions, demonstrating how small tweaks in the checkout process can have a significant impact on revenue.
26. Groove: How Adding a Video Testimonial Boosted Conversions by 100%
Did you know that adding video testimonials to your sales pages can boost conversion rates by 80%?
If not, here is an A/B testing example from Groove to inspire you.
The problem: Groove, a helpdesk software company, attracted substantial traffic to its blog, but its conversion rate was stagnant at just 2.3%. Analysis revealed that the website’s messaging failed to connect with users or inspire trust.
A/B testing method: Groove hypothesized that adding a customer testimonial video to the hero section would build trust and engage visitors.
The original hero section featured a static image. In the variation, Groove replaced it with an engaging video testimonial from one of its customers.
Results: The A/B test results showed a dramatic increase in conversions, jumping to 4.3%, effectively doubling the rate. This highlights how user-generated content can build trust and significantly boost conversions.
27. ShopClues: How Optimizing the Homepage Increased Visits-to-Order by 26%
ShopClues is a leading eCommerce company operating in the Indian market. As a relatively new entrant in the fashion category, it faces competition from established brands like Amazon, Flipkart, and many others.
The Problem: After running several A/B tests, ShopClues noticed that their homepage’s navigation wasn’t leading to meaningful conversions. Despite high clicks on the "Wholesale" link, users weren’t engaging with other categories.
A/B testing method: The team tested repositioning the "Wholesale" section and replacing it with other marketing categories like "Super Saver Bazaar." The goal was to guide customers toward more relevant pages to improve navigation and conversion.
Results: This change boosted click-through rates by 50% and improved visits-to-order by 26%.
28. Shaw Academy: How Addressing User Doubts and Concerns Increased Sales 5X
Many companies require you to provide payment details on the billing page even when you are only signing up for a free trial. This can leave users wondering when they will be charged once the trial period ends.
This is the challenge Shaw Academy, an online education platform for interactive classes, faced, and this A/B test example demonstrates how they worked around it.
The problem: Shaw Academy’s conversion rates were stagnant, particularly on the billing page, where many users dropped off. The lack of reassurance caused uncertainty and discouraged potential customers from proceeding.
A/B testing method: Shaw Academy added a clear message on the billing page: “We’ll email you 3 days before your trial ends.” This reassured users, addressing their concerns about hidden charges and trial expiration.
Results: The tweak led to a 2X increase in conversion rates and 5X growth in sales, proving the power of addressing customer concerns at key moments in the funnel.
29. Ben: How a Minor Change on the Product Page Increased Conversions by 17.63%
The problem: Ben, a Dutch telecom provider, noticed that visitors often overlooked the option to select their preferred phone color, which led to confusion and missed opportunities.
A/B testing method: By analyzing user behavior through VWO Insights, Ben found that the color palette lacked visibility and clarity.
The team redesigned the product page to make the color selection tool more prominent and then tested the new layout against the original.
Results: In just two weeks, this simple tweak boosted conversions by 17.63% and reduced customer calls for device color changes.
30. Tripsta: How Adding Reassurance Increased Conversion Rate by 25%
Tripsta, as the name suggests, is a travel website offering a wide range of transportation services, from trains and airlines to ferries.
The problem: In a competitive travel industry, Tripsta struggled with high bounce rates as users doubted the value of their bookings.
A/B testing method: Tripsta hypothesized that adding reassurance at the “Passenger Information” step could boost conversions. A message reading, “Congratulations! This is one of the cheapest flights for this route! Book today to secure this price!” was placed prominently above flight details, creating a variation for comparison.
Results: Placing this reassurance message prominently above the flight details increased Tripsta’s conversion rate by 25%.
How Fibr AI A/B Testing Can Help You
After exploring these 30 inspiring A/B testing examples and case studies, here is one last message for you: To do effective A/B testing, you need a simple tool designed for ease of use and efficiency.
That’s where Fibr AI shines. Trusted by thousands of brands, Fibr AI's A/B testing helps to enhance conversion rates by optimizing landing pages based on data-driven insights. This tool enables businesses to experiment with variations in headlines, CTAs, layouts, and other website elements to identify the most effective combinations for driving engagement and conversions.
Some of the prominent features of the tool include:
Smart automation: With the help of AI, the tool automatically tests and evaluates multiple website elements to determine what’s working and what’s not so you can optimize based on real insights.
Real-time adaptation: Fibr AI dynamically adjusts content based on user behavior and campaign performance.
No-code customization: The no-code editor allows users to refine landing pages without technical expertise.
Overall, Fibr AI helps brands create highly relevant experiences that resonate with audiences, ultimately boosting engagement and maximizing ROI.
Book a free demo here to see how the tool can help you optimize your website effectively!