7 data-backed steps for higher conversions

Posted on 05 Aug

Business growth is defined differently from company to company, whether it’s measured in user sign-ups, active users, free trials, or revenue. Defining your CRO strategy is a critical first step. Then it’s a matter of setting up your analytics tools to identify problems and opportunities for higher conversions.

The problem is that it can be overwhelming to digest all the latest tactics and growth hacking strategies, not to mention choosing the best analytics tools to make the right decisions for your company.

While there are no magic templates that work for every business, there are proven tactics and strategies that can be tested and used as a general roadmap to follow.

So, here’s the question.

How do you get started? How do you determine your initial message, layout, offer, and so on, and then how do you decide what to measure and test?

Make a game plan.

A good starting point is to develop a site design that represents your best guess, informed by other successful companies in your industry. Keep in mind, no one gets it right the first time. You will make mistakes, but those mistakes will provide valuable lessons along the way.

When we launched our private beta, we had a decent conversion rate with regular daily sign ups. But when we launched public beta with a new website and sign up process, suddenly our conversion rate tanked.

So, what went wrong?

At first, we weren’t sure what to fix. But it didn’t take long to realize that we implemented too many changes at once. That was our biggest mistake. We did a complete makeover without having a solid process in place to manage and measure our results properly.

It was time to go back to the drawing board. We needed to build a structure for our CRO strategy so that we could implement our testing with a more scientific approach.

Coming up with a new UI and message at random is not the right approach. A/B testing without thinking about your online goals and gathering insights from your user behavior can be an expensive waste of time.

A structured approach to increasing your conversion goals is an ongoing process of improvement. You need to implement each change deliberately. Find out what works and what doesn’t with the goal of getting to know your audience. When done correctly, you can achieve good, measurable results with less time and frustration.

So what should you test?

It’s a long list to consider:

  • The overall design and layout
  • The offer
  • The copy – headlines, taglines, paragraphs
  • Social proof, testimonials
  • CTA text
  • CTA button and forms
  • Links
  • Images

Advanced tests can include but are not limited to:

  • Pricing structures
  • Sales promotions
  • Free trial lengths
  • Navigation  
  • User experiences
  • Free or paid delivery

With so many things to test, it makes sense to formulate a well thought out game plan before getting started.

According to Econsultancy, companies with a structured approach to improving their conversion rate were twice as likely to report large increases in sales from their optimization program.

You gotta have a plan.

You want to identify and prioritize your real business objectives, gather current data, understand your value proposition and personas, and make sure the right tools are in place.

This effort will pay off, every time.

With all that said, here are 7 actionable steps to build a solid foundation for achieving higher conversions…and the tools to get it done.

Step 1: Establish a baseline

Before you can make comparisons on what works and what doesn’t, you need to set your expectations. How is your site performing now? Gather the analytics for your site or campaign as it exists today, before you focus on a testing strategy.

If your site, campaign, etc., is new, you’ll need to let it run, as is, for a while to make sure you have enough data to paint a picture of how it is doing. You have to know how well version A is doing on its own before you analyze how it’s performing alongside version B. This is sometimes referred to as A/A testing. You are testing what you have against itself.
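As a concrete starting point, here’s a minimal sketch of pulling a baseline conversion rate (with a rough 95% interval) out of an analytics export. The CSV file name and column names are hypothetical placeholders; adjust them to whatever your tool actually exports.

```python
# Baseline conversion rate with a simple confidence interval.
# Assumes a CSV export from your analytics tool with hypothetical
# columns: date, visits, conversions.
import csv
import math

def baseline(csv_path):
    visits = conversions = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            visits += int(row["visits"])
            conversions += int(row["conversions"])
    rate = conversions / visits
    # Normal-approximation 95% interval; good enough for a baseline read.
    margin = 1.96 * math.sqrt(rate * (1 - rate) / visits)
    return rate, rate - margin, rate + margin

if __name__ == "__main__":
    rate, low, high = baseline("site_traffic.csv")
    print(f"Baseline conversion: {rate:.2%} (95% CI {low:.2%} - {high:.2%})")
```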

Suggested Tools:

  1. Google Analytics
  2. KissMetrics
  3. Mixpanel

Step 2: Find the leaks

Before you can run a good A/B test, you need to have a good idea about what to test. The first place to start is by finding out where the leaks are. At what point are you losing people? Take a good look at your funnel and locate where your traffic is dropping off. The fastest way to find out is to dig into your analytics.

Which page in your funnel is getting the most views and how is it currently converting compared to other pages? A simple formula to figure out where you should devote your attention first is:

Page with lots of traffic + minimal conversions = TEST

Identify as many pages as you can that get a lot of traffic but have high bounce rates, or that are not driving enough leads relative to the amount of traffic they receive.

For example, when we launched our public beta we had a three-step funnel: landing page – pricing page – account creation page. Our visitors dropped off on the pricing page. This became our starting point.
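If you want to see the leak in numbers rather than eyeballing reports, here’s a minimal sketch that compares step-to-step continuation rates and flags the worst drop-off. The step names mirror the three-step funnel above; the view counts are made-up placeholders.

```python
# Rough funnel leak finder: compare step-to-step continuation rates and
# flag the biggest drop-off. The view counts are illustrative only.
funnel = [
    ("landing page", 10_000),
    ("pricing page", 3_200),
    ("account creation", 400),
]

worst_step, worst_rate = None, 1.0
for (prev_name, prev_views), (name, views) in zip(funnel, funnel[1:]):
    rate = views / prev_views
    print(f"{prev_name} -> {name}: {rate:.1%} continue")
    if rate < worst_rate:
        worst_step, worst_rate = f"{prev_name} -> {name}", rate

print(f"Biggest leak: {worst_step} ({worst_rate:.1%} continue)")
```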

Step 3: Set goals

This is one of the most important steps in the process. Keep your eye on the prize. A common mistake is setting element- or page-specific goals, like increasing overall CTA button clicks by 50%. While this seems like a fantastic target to reach, it might not actually accomplish the end business goal: to increase revenue, signups, leads, etc.

Conversions won’t grow your business. Paying Customers will.

Focus on goals that are specific to the overall success of the company when you are deciding what to split test. If the company’s end goal is to increase trial signups by 20% this quarter, then you need to focus on what it will take on every page of your funnel to support that goal – not just increasing clicks to one CTA button.

To accomplish this, it’s typically easier to work backward. Start with the end goal for the company and then analyze each step the visitor needs to take to support that goal.

So, in our case, if our end goal was to increase trial signup conversions on our homepage by 20%, we would look at the form on the account creation page, evaluate the pricing page and then take a good look at the main page.
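Working backward can be made concrete with a little arithmetic: multiply the per-step rates to get your end-to-end conversion, then see how much any single step would have to improve if the others stayed flat. The rates below are hypothetical placeholders, not real numbers from our funnel.

```python
# Working backward from the end goal: if trial signups need to rise 20%,
# how much would one funnel step have to improve if the others stay flat?
# All rates below are hypothetical placeholders.
current_rates = {
    "homepage -> pricing": 0.32,
    "pricing -> signup form": 0.125,
    "signup form -> account created": 0.40,
}
target_lift = 1.20  # +20% trial signups

overall = 1.0
for rate in current_rates.values():
    overall *= rate
print(f"Current end-to-end conversion: {overall:.2%}")

for step, rate in current_rates.items():
    needed = rate * target_lift  # the whole lift loaded onto this one step
    print(f"To hit the goal via {step} alone: {rate:.1%} -> {needed:.1%}")
```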

Step 4: Collect the data

Once you identify which pages need improvement, you need to look at the specifics. How many visits are you getting? What’s your bounce rate? Which message resonates best with your audience? What are your customers telling you? What are they actually doing on your site?

This is where analytics, survey and heat map tools help you gather information and give you the insights you need to structure useful tests.

When studying your analytics, try measuring conversions against overall page views as opposed to unique visitors. Of course, this depends on your funnel type. If you have a landing page to get people to opt in to a webinar, then unique visitors might be more relevant. But for others, many visitors will come back more than once before converting on the main page – like signing up for a free trial.

Another metric to pay attention to is the device type your visitors are using. According to Statista, mobile internet usage has grown by over 1650% and is expected to grow another 727% by 2020. This metric alone may be a big part of your conversion problem.

Mobile users behave much differently, and they expect their experience to be different from the desktop one. Watch the technology reports in Google Analytics closely: you could be converting just fine on desktop while your overall conversion rate is horribly low, because a majority of your visitors are coming from a mobile device.
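One way to catch this early is to segment your conversion rate by device instead of looking only at the blended number. A minimal sketch, assuming you can export rows of (device, sessions, conversions) from your analytics tool; the numbers here are illustrative:

```python
# Segment conversion rate by device type so a strong desktop number
# doesn't hide a weak mobile one. The rows stand in for a hypothetical
# analytics export of (device, sessions, conversions).
rows = [
    ("desktop", 4_000, 200),
    ("mobile", 9_000, 90),
    ("tablet", 1_000, 20),
]

total_sessions = sum(sessions for _, sessions, _ in rows)
total_conversions = sum(conversions for _, _, conversions in rows)
print(f"Blended rate: {total_conversions / total_sessions:.2%}")

for device, sessions, conversions in rows:
    print(f"{device}: {conversions / sessions:.2%} across {sessions} sessions")
```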

Next, find out what your visitors are doing when they land on your page by measuring user behavior. Where do they go? What do they click on? Are they scrolling? What exactly are they doing?

Heat-mapping tools like Hotjar or Crazyegg not only show you what your visitors are actually doing; they can also help guide and influence the design of your page based on real data.

What are your customers telling you? Not quite sure? Perhaps it’s time to ask. Collecting survey data can help you get to know your customers better. Tools such as Typeform allow you to email customers and get actionable feedback. Valuable insights straight from your customers cut down on hours of guesswork.

Step 5: Hypothesize

Once you’ve examined your data, you can form your initial hypothesis. For example, if you can see that visitors are starting to fill out your form but not completing it, try shortening the form from 6 fields to 3.

Or, let’s say the form is located in the bottom half of the page but you find that your visitors aren’t reaching the form because they generally don’t scroll more than 45% down the page. In this case, the color of the button and headline copy aren’t a priority to test. Instead, try rearranging the placement of your elements so your form is easier to locate.
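To turn the form-shortening hypothesis above into something testable, it helps to know which fields people actually abandon. Here’s a minimal sketch; the field names and counts are hypothetical stand-ins for a form-analytics export.

```python
# Spot where visitors abandon a form: for each field, how many people
# who reached it went on to the next one. Field names and counts are
# hypothetical placeholders.
fields = [
    ("email", 1_000),
    ("company name", 720),
    ("phone number", 310),
    ("job title", 280),
    ("submit", 260),
]

for (name, reached), (next_name, next_reached) in zip(fields, fields[1:]):
    drop = 1 - next_reached / reached
    flag = "  <-- candidate to cut or move" if drop > 0.4 else ""
    print(f"{name} -> {next_name}: {drop:.0%} drop{flag}")
```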

Word of caution: Don’t test too many hypotheses at once! If you are testing too many at the same time, it will be impossible to know which change resulted in a positive outcome. Take it slow. If you want to do it right, think of it as a process of gradually fine tuning your user experience. Every test should be weighed and either built upon or thrown out.

Bottom line, data is useless unless you act on it. Gather your actionable insights from your analytics, surveys, and heat maps and start putting your hypotheses to the test.

Step 6: Create a variation

You now need a B to compete with your A.

Early on, test against a variation that represents big, sweeping changes. As you move more toward fine-tuning a campaign (remember, you are always testing), the variance between A and B should get smaller.

An example of a smaller variation could be simply testing a different adjective for the headline. Or maybe you’re testing a green CTA button against a red one.

Be careful not to make assumptions like ‘button color doesn’t matter’. HubSpot reported that an A/B test on button color showed the red button scored a 21 percent increase in CTR over the green button. Test. Don’t assume.

Get help if you need it. Split testing is easy, but it may not be something you want to try on your own. Conversion XL recently had a lot to say on this very topic. So do the research and if you need help, find a service that will run the tests for you while you manage other important tasks, which I’m sure you have.

Here are some of the tools we recommend:

  1. UnDelay, an adaptive landing page builder with an intuitive user interface for creating and optimizing pages and funnels, fast.
  2. Optimizely, a great tool if you are testing a full website.

Whatever tool you choose, make sure you run the test long enough. Just as you need enough data to establish a baseline, you need to test data that truly represents traffic over a substantial period of time.

How long it takes to get a quality test depends on several variables, such as your traffic volume and the number of elements being tested. In general, most CRO experts agree that unless a test runs for a minimum of 30 days, it’s difficult to determine an accurate ‘winner’. So, the longer the better, though some companies make exceptions to this rule depending on their traffic volume and the stage of their business.

So, even though opinions and testing tools differ, a rule of thumb is to aim for at least 100 conversions per variation. The statistical confidence, which your testing tool will report, should be 95 percent or better. If you have 100 conversions per variation but only 85 percent confidence, keep testing until the winning variation reaches 95 percent.
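If you want to sanity-check what your testing tool is reporting, the standard calculation behind that confidence number is a two-proportion z-test. Here’s a minimal sketch with illustrative counts; plug in your own.

```python
# Sanity-check an A/B result: two-proportion z-test on conversions per
# variation. The counts below are illustrative placeholders. Most testing
# tools report this for you, but it's worth knowing what the 95% means.
import math

def significance(conv_a, visitors_a, conv_b, visitors_b):
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided confidence from the normal distribution.
    confidence = math.erf(abs(z) / math.sqrt(2))
    return p_a, p_b, confidence

p_a, p_b, conf = significance(conv_a=120, visitors_a=4_000,
                              conv_b=150, visitors_b=4_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  confidence: {conf:.1%}")
print("Keep testing" if conf < 0.95 else "Call it")
```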

Step 7: Measure results and keep testing

Think outside the box. Consider that what is converting on a desktop might be failing miserably on mobile. So make sure to measure results by device type.

Once you implement a test and let it run, it’s time to gather the data, find the winner and do it again. Never assume that you’ve nailed it. Quite often, additional tests will prove that you haven’t.

 
