Tuesday, January 15, 2013

11 Obvious A/B Tests You Should Try

If you are looking to squeeze more dollars out of your existing traffic, you need to start running A/B tests. If you have at least 10,000 monthly visitors, you should consider running 1 new A/B test every other month, if not once a month.

At my business we typically run 1 A/B test every 2 weeks, and although many of the tests fail, we usually find a winner in 1 out of every 4 tests that boosts our conversion rate by at least 20 percent.
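
If you want to run tests on a similar cadence, the first mechanical piece is splitting traffic consistently, so a returning visitor always sees the same variation. Here is a minimal sketch in Python; the hashing scheme, function name, and experiment names are illustrative assumptions, not taken from any particular testing tool:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a visitor into a variant.

    Hashing the visitor ID together with the experiment name gives a
    stable assignment: the same visitor always sees the same variation,
    and different experiments split traffic independently of each other.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A returning visitor lands in the same bucket every time.
variant = assign_variant("visitor-42", "free-trial-badge")
```

Any stable hash works here; the point is that assignment must be deterministic and scoped per experiment, or your variations will bleed into each other.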

One of the main ways I’ve been able to have great success is by learning from other entrepreneurs. Each week, a group of entrepreneurs, including me, discusses the A/B tests we’ve had successes or failures with. We share data with each other, which helps all of us come up with new A/B tests to try.

Here are 11 obvious A/B tests you should try:

Test #1: Add the word FREE in your ads

Eric Siu from TreeHouse manages thousands of dollars in ad buys each week. One of his main acquisition channels is remarketing. He tested a lot of different ad types, but found his cost per acquisition (CPA) hovered around $60. He changed the color of the ads, the calls to action and many other elements within the ad, but none of them had a major impact on the CPA.

He then tested adding the word “FREE” within his ads.

[Image: TreeHouse remarketing ad]

That one word caused his CPA to drop from $60 to $43 per signup.

Test #2: Create an explainer video

I’ve created a handful of explainer videos, but they were all done wrong. Once I learned what elements needed to be in an explainer video to help boost conversions, I instantly saw an increase in our conversions.

By adding a video that had the same exact message as our homepage copy on CrazyEgg.com, we were able to increase homepage conversions by 64%. The big lesson I learned there was that people don’t always like reading text, but they are open to listening to a short video that explains a product or service.

Test #3: Have your signup button scroll with the visitor

On TreeHouse’s library page, they noticed that people were reading the content and scrolling down, but they weren’t clicking on the signup button. So at first they tested changing the color of the signup button from grey to green.

The change in color had some impact, but not a large one. So they tested a concept similar to what Facebook does, in which the main navigation bar scrolls with the reader. Because the signup button is in the navigation, people would keep noticing it as they scrolled.

[Image: TreeHouse scrolling navigation bar]

This simple change increased conversions on this one page by 138%.

Test #4: Remove form fields

On NeilPatel.com I collect leads from individuals and companies who are interested in increasing their online traffic and more importantly online revenue. My submission form contained 4 fields:

  • Name
  • Email
  • URL
  • Revenue

I didn’t think that having 4 form fields would affect my conversion rate because it doesn’t take long to fill them all out. Still, I ran a quick test to see if replacing the revenue field with an open field asking “What can I help you with?” would affect conversions, as some people may not want to share their revenue.

That test didn’t have an impact on my conversion rate. I then decided to remove the “revenue” field altogether and keep only 3 form fields.

[Image: NeilPatel.com lead form]

That boosted the conversion rate by 26%.

Test #5: Create a three-step checkout process

I was a big believer that reducing the number of page loads and steps people have to go through would increase conversions. Because of this, Crazy Egg had a simple two-step checkout process: you would first select your plan, then create your account and enter your payment information on the second page.

[Image: Crazy Egg checkout process]

Conversion Rate Experts wanted me to test a 3-step checkout process, in which you would first select your plan, then be taken to a page where you create your account, and then be taken to another page where you enter your payment information. The total number of form fields was the same as in the 2-step checkout process; we were just breaking the second page out into 2 separate pages.

After a total of 817 conversions, we had a winner: the 3-step checkout process increased conversions by 10%.
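
Calling a winner like this comes down to a two-proportion significance test. Here is a minimal sketch in Python; the visitor counts below are hypothetical (the post reports only the 817 total conversions, not the traffic behind them), so treat the numbers purely as an illustration:

```python
import math

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test for an A/B test.

    Returns (relative_lift, two_sided_p_value) for variant B over A.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return (p_b - p_a) / p_a, p_value

# Hypothetical split: 389 vs. 428 conversions (~817 total, roughly a 10% lift),
# assuming 10,000 visitors saw each checkout flow.
lift, p = ab_significance(389, 10_000, 428, 10_000)
```

If the p-value comes out above your threshold (0.05 is typical), keep the test running; a roughly 10% lift needs a lot of traffic before you can trust it.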

Test #6: Show a live version of your product instead of using screenshots

Most software companies have a tendency to show screenshots of their application rather than letting people play with the real thing before they even sign up.

Qualaroo used to show screenshots of their application on their homepage, but through surveying they found that people didn’t fully understand what the product did. So they decided to put their own product on their homepage and let people play around with it.

[Image: Qualaroo live demo embedded on homepage]

By embedding a live version of their product on their homepage that people could interact with, they boosted their conversion rate by 38%.

Test #7: Free trial versus money back guarantee

I used to think that there was no difference between a 30-day money back guarantee and a free trial that required you to enter your credit card upfront. Why? Because either way, if you weren’t happy with the product within the first 30 days, you wouldn’t end up paying for it.

Boy was I wrong!

We tested a 30-day money back guarantee versus a 30-day free trial and the results were huge.

[Image: Crazy Egg free trial badge]

By replacing all of our money back guarantee badges with free trial badges, and by placing “30-day free trial” verbiage on every page of the Crazy Egg website, we were able to boost signups by 116%.

Test #8: Trial length

The longer your free trial the better, right? That’s at least what I thought until my co-founder wanted to test a 30-day free trial versus a 14-day free trial on KISSmetrics.

[Image: KISSmetrics 14-day vs. 30-day trial]

When he tested the 14-day free trial against the original 30-day free trial, there was no difference in front-end conversions; the same number of people signed up for each trial length. The big difference was usage: people who signed up for the 14-day trial used the product 102% more than those on the 30-day trial.

We quickly learned that reducing the trial length made people feel that they had to use our product as soon as possible, while with the 30-day trial people felt that they had plenty of time and forgot about using it, even though we sent them email reminders.

The extra usage helped boost revenue as more customers experienced the power of KISSmetrics.

Test #9: Offer time based bonuses

I used to sell the QuickSprout Traffic System for $197, in which you would get an Internet marketing course delivered to your inbox that would teach you everything you needed to know about digital marketing.

At first I didn’t offer any bonuses, but then I decided to include a few for free. The main bonus was a video course and a free software plugin. Those 2 bonuses only boosted my conversion rate by 11%.

Michael Williams gave me the idea of running time-based bonuses in which the first 50 signups and the first 100 signups would get something that others didn’t receive.

[Image: Quick Sprout time-based bonus offer]

The end result was a 47% increase in conversions from offering time-based bonuses that encouraged people to sign up now rather than later.

Test #10: Add a dollar value to your free offers

Not everyone is ready to buy right away. Some people want to learn more and get to know you or your company, and once they trust you, they are open to buying whatever you may be selling.

Due to this, it is important for you to collect the email address of each individual who is interested in buying your product or service, but isn’t ready to pull the trigger yet.

Even though I am not really selling anything on Quick Sprout, I still collect emails so I can notify you when I write a new blog post. I used to just ask you for your email address without offering you anything in exchange. I then tested offering you a free eBook and 30-day course, which only boosted conversions by 6%.

But once I placed a dollar value on that free information, such as how I offer you a free $300 course in my sidebar, my conversions went up.

[Image: Quick Sprout $300 course offer]

By placing a dollar value on the same free information I was offering you before, I was able to boost my email opt-in rate by 22%.

Test #11: Button colors

Around 4 years ago, I was speaking at a conference, and the person on my panel ran A/B testing at Gmail. He told me how they tested over 50 shades of blue and found the one shade that converted best for them. Now, you probably can’t test 50 shades of a color on your website, as you won’t have the traffic volume that Gmail has, but you can test a few variations of button color.

One button color that I never would have guessed could boost conversions is red. Performable ran a test in which they pitted a red call-to-action button on their homepage against a green one.

[Image: Performable red vs. green button test]

Surprisingly, the red button had a 21% higher click-through rate. This just shows that you can’t assume a specific color is not worth testing. Red typically carries a negative connotation; you usually see it associated with stop signs and error messages. For that reason, I thought the color red wasn’t worth testing.

Test #12: Tell people to come up and talk to you

What would a list post be without a bonus? I know I said I have 11 A/B tests for you, but I actually have 12 ;-) .

My buddy Leo was in a coffee shop when he saw something he’d never seen before. Someone had a cover on their laptop that said this:

[Image: laptop cover message]

He asked the gentleman how many people had approached him because of his laptop message, and he said “a few dozen within 3 weeks.” If you are looking to increase how many people come up to you, buy a cover for your laptop that tells people what you do and invites them to come talk to you.

I haven’t tried this out yet, but I will, as it is a great way to potentially get new customers for your business.

Conclusion

As I mentioned earlier, I run a lot of A/B tests that fail. Just because the ones above showed great success doesn’t mean that every test you run will be successful. Rather, it shows that you can get a lot out of A/B testing; you just have to put the time and energy into it.

Have you run any other A/B tests that have done well or failed miserably? If so, leave a comment explaining the test and results.

Source: http://feedproxy.google.com/~r/Quicksprout/~3/9yyYoo_E7p8/
