
Case Study: How Using Color Psychology Improved Our Website

This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.

Changing the color of a button holding call-to-action (CTA) text from green to yellow resulted in a 187.4% increase in conversions on our website. This year, our company decided to start using A/B testing as part of our efforts to improve the conversion rates for one of our landing pages. We are a leading affiliate site in the online slots niche, with over 10 years of experience in the industry. Our main business goal is to deliver a Vegas slots experience to our users, but with additional benefits.

Our online experience has taught us to always stay abreast of new developments in online marketing, from UX and SEO to CRM and PR. Most of all, though, we love the things you can learn about people and the industry through tweaking and testing. Our user metrics are among the best in our niche, which shows that users find our site very helpful and informative.

In starting this project, we were under no illusions that we would be generating any revolutionary ideas in the field of split testing, and we realize that it is something we should have started doing a lot sooner. Having read about the success of many split testing campaigns in the past (Moz included), it still annoys us that we waited so long before introducing this kind of testing to our continuous processes. We hope that readers of this post appreciate how simple, yet effective, implementing basic split testing can be.

[Image: "room for improvement" cartoon. Source: motivationalcartoons.wordpress.com]

Identifying room for improvement

The online slots niche attracts users who like the excitement and dynamism of slot machines. In a casino, the atmosphere is intended to resonate with this excitement, and colors play a big role in this: gold, red and orange are colors traditionally used by casinos to create a thrilling experience.

Our goal at VegasSlotsOnline.com is to replicate this thrilling slots experience online. However, as is the case with many online offshoots, replicating an experience on the web is not without its challenges. We aim to identify the elements that have the greatest impact on our users, encouraging them, in turn, to sign up in exchange for access to the free online slots.

Up to this point, our focus had been on creating a unique experience, but in the process we realized that we had overlooked the importance of optimization. In particular, we had failed to see how optimization could help us establish learning points that would improve our ability to get users to sign up from our site.

The conversion performance of the site was not bad. We were accustomed to approximate conversion rates of 10–15%, but we knew there was room for improvement. Initially, and incorrectly, we thought we needed to put a ton of cash into our site for that to happen. It was only when we became aware that small changes could increase conversion rates by 5%, 10%, or more that we decided to divert our attention to discovering the little changes we could make to help our company increase its signups.

We decided to focus on the “Sign Up Here!” button on the free games pages. The button was no particular cause for concern, but it seemed like an obvious place to start.

We chose one of our highest trafficked pages for the experiment.

When designing the “Sign Up Here!” button, we followed well-known call-to-action best practices, such as using high-contrast colors and paying attention to placement and size. Yet we found out quite quickly that the color of the button was by no means a settled choice.

The button we started with was plain green, without gradients or shades. According to color psychology, green is associated with natural environments on one hand, and with “Go” or moving forward on the other. The next obvious choice using this logic was yellow, which connotes happiness and cheerfulness and is used in many brick-and-mortar slot machines.

Both choices had their merits. Green was a good option, because we wanted our users to see our CTA and move forward by signing up. Yellow was also a good choice because we knew that its associations with happiness and excitement definitely worked well in our niche.

Curiously enough, there were also arguments against each color. The same psychology of color states that green can lack dynamism, especially with darker or bluish shades, conveying coolness and stability. At the same time, yellow is said to be more effective with a younger audience who quickly respond to it.

So, neither color was a bad choice, but neither was an evident best choice, either. For us, the fact that the issue could not be settled, at least in theory, provided the perfect opportunity for A/B testing.

Designing the experiment

When designing the experiment, we decided to create three variations to test the following:

  • The effect of a change in the color of the “Sign Up Here!” button
  • The effect of a change in design of the “Sign Up Here!” button
  • The effect of a change in the CTA of the button

We wanted the A/B split testing to capture substantial changes. We also wanted statistically significant results within a period of no longer than five weeks. Given the number of monthly visitors we received (around 200,000), we knew that within that timeframe the sample size, even split across the baseline and three variations, would be more than robust enough to allow effects larger than two percentage points to be captured with 95% statistical confidence.
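
For readers who want to sanity-check this kind of planning themselves, here is a minimal sketch in Python using statsmodels. It is only an approximation of the reasoning above: the 80% power level and one-tailed setup are assumptions (the post only fixes 95% confidence here), and "effects larger than 2%" is read as two percentage points on the 11% baseline.

```python
# A rough sanity check of the sample size needed per variation, using
# statsmodels. Assumptions (not stated in the post): 80% power, one-tailed
# test, and "2%" read as two percentage points on the 11% baseline.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.11                      # baseline conversion rate
target = baseline + 0.02             # smallest lift we want to detect

effect = proportion_effectsize(target, baseline)   # Cohen's h
n_per_arm = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.80, alternative="larger"
)
print(f"roughly {n_per_arm:,.0f} visitors needed per variation")
# A few thousand per variation -- easily reached in five weeks with
# ~200,000 monthly visitors split across four versions of the page.
```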

The baseline conversion rate was 11% on the designated page. Our main goal was to be able to evaluate whether a change in the color of a button containing a CTA would have a clear and measurable impact on the conversion rate (for that button).

We had two secondary goals, which is why we tested two additional variations. First, we wanted to see whether changes in the button design would have a higher impact on the same conversion rate — in a way, we wanted to know whether the problem was the button itself rather than its color. And second, we wanted to know whether our current CTA text, rather than the button itself, could be the real issue.

Original: The original green button


Variation #1: From green to yellow


Variation #2: From original to Shoemoney-style button


Variation #3: From ‘sign up’ to ‘play now’ CTA text


Results

The results showed a clear response of the conversion rate to the changes made.

[Image: results table]

The minimum detectable effect (MDE) was 13%–14%, given the size of our samples in each variation (between 5,532 and 5,973); in other words, the test could only reliably detect conversion rates falling outside the range BCR ± (MDE × BCR), where BCR is the baseline conversion rate. The test was one-tailed, because we were not interested in capturing negative effects.

The measured improvements were well above the 14% MDE we needed to establish statistical confidence.
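
As a rough cross-check, the 13%–14% figure can be reproduced with the standard normal-approximation formula for the minimum detectable effect. The sketch below assumes 80% power, which the post does not state, so treat it as an illustration rather than the exact calculation used.

```python
# A minimal sketch of the relative MDE implied by a fixed per-variation
# sample size, via the usual normal approximation. The 80% power level
# is an assumption; the post only specifies a one-tailed test at 95%.
from math import sqrt
from scipy.stats import norm

def relative_mde(baseline, n_per_arm, alpha=0.05, power=0.80):
    z_alpha = norm.ppf(1 - alpha)    # one-tailed critical value
    z_beta = norm.ppf(power)
    absolute = (z_alpha + z_beta) * sqrt(2 * baseline * (1 - baseline) / n_per_arm)
    return absolute / baseline       # expressed as a fraction of the baseline

for n in (5532, 5973):               # per-variation sample sizes from the test
    print(f"n = {n}: relative MDE of about {relative_mde(0.11, n):.0%}")
# Prints roughly 13% for both sample sizes, in line with the 13%-14% above.
```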

As we hoped, testing different variations provided us with clear insights into the impact of color on the effectiveness of the “Sign Up Here!” button. Interestingly enough, changing the button design did not achieve higher conversion rates than changing the color alone. The improvement in the conversion rate from changing the button from green to yellow was a staggering 175%.

To our surprise, using the CTA "Play Now!" did not result in improved conversions against the baseline CTA. However, when looking closer at the results for variation #3, we realized the sample size for that variation fell short of the numbers required for the test to be statistically significant.

Given the sample size for our variation #3 (5,532), our test was not sensitive enough to detect changes in the conversion rate smaller than the 13%–14% MDE. (The conversion rate for variation #3 was 10.71%.) Such a small movement in the conversion rate meant we could only draw limited conclusions from that result. However, thanks to the results of the first two variations, we could at least be certain that changes in button color had a clear effect on our conversion rate.
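
For anyone who wants to rerun the numbers, this is how a single variation can be checked against the control with a two-sample proportions z-test. The control sample size below is an assumption, since the post only says each variation received between 5,532 and 5,973 visitors.

```python
# A minimal sketch of testing variation #3 against the control with a
# two-sample proportions z-test. The control sample size is an assumption;
# the post only gives a 5,532-5,973 range per variation.
from statsmodels.stats.proportion import proportions_ztest

control_n, control_rate = 5973, 0.11      # assumed control sample size
variant_n, variant_rate = 5532, 0.1071    # variation #3 figures from the post

successes = [round(variant_rate * variant_n), round(control_rate * control_n)]
observations = [variant_n, control_n]

z, p = proportions_ztest(successes, observations, alternative="larger")
print(f"z = {z:.2f}, one-tailed p = {p:.3f}")
# p is nowhere near 0.05, matching the conclusion that "Play Now!" did not
# produce a measurable improvement.
```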

[Image: conversion rate graph]

The above graph shows the conversion rates for the different buttons: blue, original; orange, variation #1; green, variation #2; and red, variation #3.

Lessons learned

What could we have done differently?

We should have spent more time brainstorming about the goals of the test and about the components to test.

It is worth discussing the relevance of each component and keeping in mind how it will affect your test; the more variables you include, the more combinations you will need to test.

If you are keen on testing many of your page’s components, it is worth bearing in mind that there are other experimental designs that work similarly to A/B testing, such as Plackett–Burman and fractional factorial designs. They are not as simple to set up and run as an A/B test, but they may give you results that are more reliable and accurate.

Lastly, it is crucial to resist the temptation to stop your experiments as soon as you see positive results. Stopping early inflates the false-positive rate and increases the chances that your results are wrong.

Benefiting from your A/B test

The first step of an A/B test consists of breaking your page into a series of basic components. Make sure you separate each of the elements you want to test accordingly. Multivariable matrices are the best way to keep track of your testing elements (see below).

Also, avoid confusing objectives. You may want to test click-through rates, response rates, registration rates, etc. The key is to never try to do more than one thing at a time and never to test the wrong objective (e.g., testing your bounce rate instead of your conversion rate).

[Image: two-element test matrix]

The multivariable matrix allows you to keep track of your control and test groups, and the different combinations (or recipes) that you need to test for. In this example, suppose you want to test two elements on your page: 1) your current price against a new price and 2) your current 10% discount against offering a free gift. These two elements can be tested as two separate A/B tests.

For each test, you must create four combinations or recipes. For the first test these are:

  1. Original price without an offer (control)
  2. New price without an offer (test)
  3. Original price with an offer (control)
  4. New price with an offer (test)

Using the matrix, you can label your control as (-) and your new component as (+). The A/B column is used only to capture the interaction effects, which in this case is the impact you get from changing both elements at once.

Structuring your test in this way helps ensure it is conducted properly and that your results will be statistically valid.
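
As a concrete illustration of the matrix above, here is a small Python sketch that codes the control level as -1 and the new level as +1, then reads off the two main effects and the A/B interaction. The per-recipe conversion rates are invented purely for illustration; only the structure mirrors the matrix described here.

```python
# A toy version of the two-element test matrix: price and offer are each
# coded -1 (control) or +1 (new). The conversion rates below are made up
# purely to show how the main effects and the A/B interaction are read off.
recipes = {
    # (price, offer): conversion rate
    (-1, -1): 0.110,   # original price, current offer  (recipe 1)
    (+1, -1): 0.118,   # new price, current offer       (recipe 2)
    (-1, +1): 0.121,   # original price, new offer      (recipe 3)
    (+1, +1): 0.135,   # new price, new offer           (recipe 4)
}

def main_effect(column):
    """Average conversion at the + level minus the average at the - level."""
    plus = [rate for levels, rate in recipes.items() if levels[column] == +1]
    minus = [rate for levels, rate in recipes.items() if levels[column] == -1]
    return sum(plus) / len(plus) - sum(minus) / len(minus)

def interaction():
    """The A/B column: its sign is the product of the two element codes."""
    plus = [rate for (a, b), rate in recipes.items() if a * b == +1]
    minus = [rate for (a, b), rate in recipes.items() if a * b == -1]
    return sum(plus) / len(plus) - sum(minus) / len(minus)

print(f"price effect:       {main_effect(0):+.3f}")
print(f"offer effect:       {main_effect(1):+.3f}")
print(f"interaction (A/B):  {interaction():+.3f}")
```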

What problems can you expect to encounter?

Sample size is perhaps the most common problem one encounters when performing an A/B test. Given that traffic determines the time needed to reach the required sample size, it can be hard to remain patient and stay focused on the reliability of the test’s results.

Next to sample size, measuring the wrong or less relevant components is another typical problem, which is why I recommended brainstorming. There is no right or wrong answer in terms of which components you want to test; yet it is important to understand it's not practical to test a large number of variables. The more variables you want to test, the larger the number of combinations you will have to consider. The number of combinations grows exponentially: with just two levels (control and new) per variable, testing only four variables already gives you 16 possible combinations, and with five you have 32.
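
For illustration, a short sketch of how quickly the recipe count grows when each variable has two levels (control vs. new):

```python
# How quickly the recipe count grows when every variable has two levels.
from itertools import product

for n_variables in (2, 3, 4, 5):
    n_recipes = len(list(product(("-", "+"), repeat=n_variables)))
    print(f"{n_variables} variables -> {n_recipes} recipes")
# 2 -> 4, 3 -> 8, 4 -> 16, 5 -> 32
```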

Therefore, don’t fall for the temptation to test everything. Use a multivariable matrix to rule out combinations that you know from the beginning will probably not have an impact on your conversion rates. There is no rule of thumb for this, but experience with past testing will always serve as a precedent that should inform future testing practices.


Measuring success

The first thing to keep in mind is that testing pages with a high bounce/exit rate will increase your chances of success; as our own experience showed, substantial changes are much easier to capture than marginal ones.

The success of your test depends entirely on properly setting up the test’s basic parameters, such as sample sizes and control vs. treatment groups. In this regard, making sure you always account for control and treatment levels for each of the identified components is the safest way to ensure the validity of your test results.

Also, to ensure that your success goes beyond higher conversion rates, you'll need to continually test. By doing so, your company will learn not only about its page elements, but also about its consumer base.

Philosophical takeaways

Rethinking colors and design

  • A simple color change can have a greater impact on conversion rates than bigger design changes
  • If buttons have different colors, chances are they have different conversion rates, even when both contain the same CTA text
  • Keep experiments congruent to ensure an easy-to-digest data sheet at the end

Coherence and user experience

  • Psychological effects of color do matter. In our case, we chose two colors, both of which produced convincing arguments for their use. Our test likely would not have been fruitful if we had used white or black buttons. Our niche’s characteristics were paramount to our color selection.
  • You will find all sorts of arguments for the use and psychology of every color, and the positives and negatives for each. However, there are no all-encompassing rules for the use of color psychology during such testing.

Zero-cost changes

  • It's best to begin testing smaller design changes, then move up to testing bigger changes if the evidence justifies it. We find this to be less risky and more manageable for a small company.

Try it yourself

Below is a brief guide to getting started. There is enough in this area alone for another blog post, so I will try to keep it concise.

  1. Sign up for a free trial at Optimizely. (You won't need a credit card.) We also suggest reading the starter guide.
  2. Identify what action you'd like visitors to take for the purposes of the test; this will help you decide what to split test. There are several elements that can be tested, but your goals should dictate what's ultimately tested. The important thing to consider is that your page must receive enough traffic to make the experiment statistically relevant in a timely manner.
  3. Once the element to be tested is identified, you can then look at how it can be changed. Optimizely has a cool interface that allows for some basic positional and aesthetic changes. I'm sure that if you have some basic developer skills, you can use much more advanced options than we did by using the "Edit HTML" button.
  4. Don't be afraid to ask for help. Optimizely has a very responsive support team. We have had several 30-minute phone calls and countless emails with them when issues arose relating to geotargeting content and testing variations.

[Image: Optimizely element options]

Are you ready to start testing? If we did it, anyone can.

The Bearodactyl
I own and run an Internet Marketing company from Malta. Focused predominantly on the online gaming niche(s), I enjoy the challenge of every outreach email ever sent being treated with utter contempt and rejection. I have a huge interest in all aspects of online marketing and am usually found grinding through Majestic SEO.
