- What is A/B testing?
- Why is A/B testing important?
- Examples of top of the funnel A/B testing
- Examples of middle of the funnel A/B testing
- Examples of bottom of the funnel A/B testing
- Examples of A/B testing
- Obama increased campaign contributions by 5% by testing the form
- Making menu items more obvious increased engagement 650%
- Reassuring visitors increased Highrise signups by 30%
- Performable increased clicks by 21% by changing button color
- Changing one word increased clicks by 161.66%
- A/B testing tools
- A/B testing tools for email marketing
- MailChimp
- Intercom
- GetResponse
- A/B testing tools for landing pages and blogs
- Optimizely
- Instapage
- Tracking and analyzing your A/B tests
- Learning from A/B tests: optimizing into the future
Aren't you curious when you read all about how a company you know managed to get thousands more leads overnight with a simple tweak?
... Or when one change boosts their revenue by thousands?
Don't you wish you could do it yourself and start to really tighten up your marketing?
In this article I'm going to explain the method that all of your competitors are using.
It's the method you need to master if you're going to succeed online — A/B testing.
Ready to start improving your business, lead generation, and revenue like never before?
Well, first of all you need to know exactly what A/B testing is.
What is A/B testing?
A/B testing is comparing two or more versions of a website, email, app, or other strategy against each other to find out which variation performs better.
For example, you might write two subject lines for your welcome email:
- Welcome to Reliablesoft.net
- Hi! We're so happy to have you here
You'd randomly send one subject line to half of your new subscribers and the other to the remaining half, then track which had the higher open rate after a set number of days.
Knowing which subject line got the email opened more often gives you the knowledge you need to update the email, and therefore improve your open rates.
Simple, right?
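To make the mechanics concrete, here's a minimal Python sketch of that random 50/50 split. Everything here is a placeholder: the subscriber list and open counts are made up, and send_email()/count_opens() stand in for whatever your email platform actually provides.

```python
import random

# Hypothetical subscriber list; in practice this comes from your
# email platform's API.
subscribers = [f"user{i}@example.com" for i in range(1000)]

subject_a = "Welcome to Reliablesoft.net"
subject_b = "Hi! We're so happy to have you here"

# Shuffle, then split 50/50 so each subject line gets a random,
# comparable sample of subscribers.
random.shuffle(subscribers)
half = len(subscribers) // 2
group_a, group_b = subscribers[:half], subscribers[half:]

# send_email() and count_opens() are hypothetical stand-ins for your
# email platform; track opens over a fixed window, e.g. 7 days.
# opens_a = count_opens(send_email(group_a, subject_a), days=7)
# opens_b = count_opens(send_email(group_b, subject_b), days=7)

opens_a, opens_b = 180, 240  # made-up counts, for illustration only
print(f"A: {opens_a / len(group_a):.1%} open rate")
print(f"B: {opens_b / len(group_b):.1%} open rate")
```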
Well, running one A/B test will help you improve your chosen metric a certain amount, but the more you run and the more knowledge you gain, the better your business will perform.
There are tons of things you can A/B test, including:
- Homepage headlines
- Call to action (CTA) copy
- Social sharing buttons
- Items offered in a contest
- Signup forms
And, thanks to the variety of A/B testing tools out there (more on that later), you can experiment with pretty much anything.
There's never an excuse to not be A/B testing at least something...
Here's why:
Why is A/B testing important?
It's a sad fact, but there's only so far you can take your business by reading up on best practices and taking shots in the dark.
Your audience and market are unique, so no one can definitively tell you which subject line is best for reducing cart abandonment; they can only tell you their experiences. And what if those don't work for you?
The only way to make smart, data-backed decisions about changes to your site, app, or marketing material is to look at the metrics, come up with a hypothesis, and test it yourself.
Think of A/B testing as tuning up a machine. There are certain parts of the machine that are more important than others.
If only 50% of people can manage to open the door of a car you built, that's a much bigger issue than if 5% complain of a strange humming noise.
That's why A/B testing is something you do at every stage of the funnel.
Let's look at the funnel for an example software company:
Examples of top of the funnel A/B testing
Let's say that leads mostly find out about your software company through PPC ads. You want as many qualified leads as possible to click your PPC ad, so you set up an A/B test to see which ad performs best.
You might change the copy, the URL and the keyword targeted, making incremental changes until your ad gets a higher clickthrough rate and lower cost-per-lead.
After clicking on your ad, visitors will be taken to a landing page. You could test the headline, CTA copy, or design of the page to see if you can bring more leads into the top of the funnel and move them through the process.
Examples of middle of the funnel A/B testing
It's no good bringing in leads if you're going to lose them before they get a chance to be sold on the benefits of your software.
After coming in at the top of the funnel, leads need to be educated about what your product can do for them.
That means getting them on your email list and a free trial, and sending them educational material that helps them decide that your product is their best option.
Here you'd A/B test your welcome email, your subscriber email sequence, the kinds of content that get clicked, and whether certain test groups convert better than the rest.
Examples of bottom of the funnel A/B testing
In this vital stage, the customer is close to making a decision about buying your software. You'd test the messaging of your sales outreach emails, sales scripts, demo tactics and discount amounts.
Putting all of this together, you will end up with a finely tuned sales machine that brings in the right leads, nurtures them with the right education, and then can effectively close the deal.
But here's the thing about A/B testing...
I'm not saying that every test will be groundbreaking. A lot of the time, tests do absolutely nothing. And the big wins you read about are likely the result of many tests, and nowhere near as simple as people make them out to be.
The important thing about running as many tests as you can is that, even when a test fails, you at least have evidence for why a direction isn't worth pursuing.
Examples of A/B testing
You never know how an A/B test will turn out until you run it...
Maybe you will actually strike gold, like these examples below:
Obama increased campaign contributions by 5% by testing the form
An A/B tester for the Obama campaign increased overall donations by 5% by making the donation form multi-step instead of putting everything on one page.
This worked because it didn't instantly overwhelm the donor with all kinds of fields, but instead got them to make an investment first through one easy decision: the donation amount.
Once they've clicked, it feels like a commitment, so they're more likely to follow through and enter their billing details.
It works in a similar way to the incremental commitment sales tactic. The deeper a visitor gets into your funnel, the less likely they are to back out at any given time.
You can emulate this by breaking your signup form into steps or removing some fields from your email opt-in box.
Making menu items more obvious increased engagement 650%
Most of the time, visitors aren't willing to interact with something if it isn't blindingly obvious. That was the case for Luke Wroblewski, who tested a nested menu in his app and found that engagement plummeted; exposing the menu items instead increased engagement by 650%.
The lessons you can learn from this?
If you want a visitor to click a link, you need to put it right out in the open. That might mean breaking down an element like a hamburger menu into its individual items, or moving a call to action to a more prominent place.
The proof behind this comes from eye-tracking heatmaps, which show that readers' eyes move in an F-shaped pattern down a page or screen.
So with that in mind, design your pages so they highlight the most important elements to get more clicks and higher engagement rates.
Reassuring visitors increased Highrise signups by 30%
Often, testing your landing page's headline can be the single best way to improve overall business performance. A simple test like the one Highrise carried out can give you a massive boost in leads.
Control: the original headline.
Winner (+30% signup rate): a headline built around the 30-day free trial.
By putting the 30-day free trial prominently in the main headline, Highrise reassured visitors that they had nothing to lose by signing up and giving it a go.
The strong call to action of 'Pick a plan to get started!' could also have had an effect, especially since it tells the visitor the exact next step to take instead of a vague instruction.
The power of guarantees has been proven time and time again; you can see it even on opt-in forms with copy like 'We promise never to spam you' or 'I hate spam as much as you do', like the one I use below my email subscription sign-up forms.
Performable increased clicks by 21% by changing button color
Colors can have a profound effect on your reader's response to your site. Think about it in the extreme and imagine a site that was made up of the world's least favorite colors.
Below is a simple but legendary A/B test carried out by Performable. By just tweaking the color of their main call to action button from green to red, they increased clicks by 21%.
HubSpot's analysis of the test was that the color didn't matter as much as the contrast. The green in Performable's branding meant a green call to action button was dampened and blended into the background.
Changing it to red, green's complementary color, made it stand out and enticed visitors to click.
Changing one word increased clicks by 161.66%
Never underestimate the power of one word A/B tests. Sometimes they can have insane results.
Software firm Veeam tested the wording of the call to action that put visitors in touch with the sales team. Here are the two versions:
Control: 'Request a quote'
Winner (+161.66% clicks): 'Request pricing'
'Request a quote' sounds like it will be slow, take time, and instantly connect you with a sales rep whose life will now be devoted to hounding you.
'Request pricing' sounds better because it implies the pricing is fixed, and all you have to do is ask to learn more. Evidently that's how Veeam's visitors felt, too.
A/B testing tools
Speaking generally, there are two main environments you can run A/B tests in: email and your website.
In this section, I'm going to go over some of the major platforms that let you run A/B tests in those two environments.
For email: MailChimp, Intercom, and GetResponse.
For web: Optimizely and Instapage.
A/B testing tools for email marketing
A great way to improve open rates, click-throughs, and sales is to run tests on your email marketing.
By tweaking the subject line, send time, 'from' name and body text, you can find what resonates with your audience and reach them more effectively.
MailChimp
Running tests with MailChimp is easy. You choose a variable (subject line, send time, 'from' name, or body text), choose what percentage of your list will receive the test content, and then view the report.
You can either use the data to make better decisions next time, or set it so that the winning combination is automatically sent to the rest of your audience after the smaller segment has been tested.
Intercom
For software companies sending transactional and marketing messages, Intercom is a solid choice. You get to test not only periodic newsletters and blog post announcements, but also messages like 'complete account registration' and 'invite your team'.
Similar to MailChimp, you can see the results in a report and choose the winning variant.
Where Intercom has the advantage is that it also covers transactional messages, and even in-app messages if you run a SaaS company.
GetResponse
While GetResponse isn't strictly a dedicated email marketing platform, it sneaks into this list for a simple reason: if you're already using it for your webinars, landing pages, or marketing automation, you may as well use it to test emails, too!
With over 350,000 customers, it's a fair bet that a few readers use it without taking advantage of its fantastic A/B testing functionality.
GetResponse lets you test content, subject line, from field, time of day and day of week.
A/B testing tools for landing pages and blogs
Businesses that make a lot of their money from direct response email marketing can get a huge lift in revenue from A/B testing communications...
But for most of us, testing landing pages and blogs is where the real money is.
Optimizely
With Optimizely, you can test anything. And I mean anything.
If you can see it, you can test it. Once installed on your site, Optimizely's interface lets you enter a URL, select an element on the page (a button, a title, a form), and manipulate it without any need for coding.
Create multiple versions with whatever goals you want to test (overall engagement, clicks on a certain link, custom events), then simply let the test run.
Optimizely will tell you how many visits and goals are needed to reach statistical significance and generate detailed reports that give you easy-to-read insight.
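Optimizely does this math for you, but if you want a feel for the numbers, here's the textbook approximation for the sample size needed per variant. This is a generic statistical formula with made-up conversion rates, not necessarily the exact method Optimizely uses:

```python
from statistics import NormalDist

def sample_size_per_variant(p_base, p_target, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a lift
    from p_base to p_target at the given significance and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)           # 0.84 for 80% power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    n = (z_alpha + z_beta) ** 2 * variance / (p_target - p_base) ** 2
    return int(n) + 1

# Made-up example: detecting a lift from a 3% to a 4% conversion rate
# requires roughly 5,300 visitors per variant.
print(sample_size_per_variant(0.03, 0.04))
```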
Instapage
Instapage isn't as advanced as Optimizely, because you can only test pages you build within the app. Since the app is essentially a landing page builder, if you need one of those too, Instapage is a sound bet.
Other features include code-free design, fully customizable landing page templates, and hundreds of presets to edit.
Tracking and analyzing your A/B tests
Assuming you're using different platforms to test different parts of your product and marketing, you need a central place to store it all.
What's the answer?
Well, as it is in 90% of cases: a spreadsheet.
Love 'em or hate 'em, they are simply the best places to store, filter, and analyze structured data.
Thankfully, you don't always have to start from scratch. In fact, Optimizely will let you export your A/B test data as a spreadsheet.
If you're running A/B tests in another app, too, you can just make sure the headers match up and then merge them together.
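As a rough sketch, assuming each tool exports a CSV with identical column headers (the file names below are hypothetical), merging them in Python takes only a few lines:

```python
import pandas as pd

# Hypothetical exports from two different testing tools, assumed
# to share the same column headers.
files = ["optimizely_tests.csv", "mailchimp_tests.csv"]

# Stack every row into a single master log for filtering and analysis.
master = pd.concat([pd.read_csv(f) for f in files], ignore_index=True)
master.to_csv("master_test_log.csv", index=False)
```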
If you opt to make your own from scratch, Sarah Hodges (co-founder of Intelligent.ly) suggests including:
- Start and end date
- Hypothesis
- Success metrics
- Confidence level
- Key takeaways
As a hypothetical example, here's how you might start such a log from scratch (the test entry below is invented purely for illustration):
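```python
import pandas as pd

# Columns follow Sarah Hodges' suggested structure.
columns = ["start_date", "end_date", "hypothesis", "success_metric",
           "confidence_level", "key_takeaways"]
log = pd.DataFrame(columns=columns)

# A made-up entry, purely for illustration.
log.loc[len(log)] = [
    "2024-03-01", "2024-03-14",
    "Leading with the free trial in the headline will lift signups",
    "Signup rate: 2.4% (control) vs 3.1% (variant)",
    "95%",
    "Variant won; test the CTA copy next",
]

log.to_csv("ab_test_log.csv", index=False)
```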
Learning from A/B tests: optimizing into the future
When the data is right in front of you, it's easy to be blinded by a false certainty about what it actually means.
Sure, you find that one headline did better than the other. But why? Without answering the 'why', you won't make better choices in the future or learn anything from your tests.
That's why it's best to set up tests that lend themselves to clear, statistically significant answers in the first place.
For example, let's say you make changes to a landing page and find that they boosted conversions by 43%. Nice!
The problem comes when you realize you changed the copy, the button, and the headline all at once. Then what? How do you know which change made the impact?
If there's any one solid best practice to be learned before you start A/B testing, it's that you need to test one element at a time or you're going to pollute the data.
Regardless of whether a polluted test 'wins' or 'loses', you lose out on the knowledge and won't be able to make informed choices in the future.
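A clean, single-variable test also makes the statistics straightforward. As a sketch with made-up visitor numbers, here's a standard two-proportion z-test for checking whether a difference in conversion rates is likely real rather than noise:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

# Made-up numbers: 5,000 visitors per variant.
z, p = two_proportion_z_test(conv_a=150, n_a=5000, conv_b=195, n_b=5000)
print(f"z = {z:.2f}, p-value = {p:.4f}")  # p < 0.05 suggests a real winner
```

If the p-value comes in above your threshold, the honest conclusion is 'no detectable difference', which is still knowledge worth logging.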
Have you run any particularly groundbreaking A/B tests? Let me know in the comments.