

How to build CRO into your website redesign

Post updated April 2024.

The website redesign is not what it was. ‘Radical’ website redesign – where you carry out a ‘big-bang’ overhaul of your site – is becoming less common. It is increasingly being replaced by what’s known as Evolutionary Site Redesign, or ESR for short.

This is a good thing, and there are a number of reasons why.

Firstly, it is just less risky to iterate on an existing website than to start from scratch.

It’s also generally cheaper, faster, and more measurable. With ESR, companies are able to make educated, incremental changes to their existing websites that can be validated.

Perhaps most importantly, the ESR approach tends to be rooted in data insight. Radical website redesigns traditionally aren’t.

I’ve lost count of the number of times I have worked with companies who have launched slick-looking brand new websites, only for their sales conversion rates to plummet at launch.

Not to mention what happens to search engine rankings and traffic volumes.

However, this doesn’t mean that radical website redesigns don’t still happen.

Sometimes there isn’t the option to do ESR, as the decision has already been made by senior executives, or the CEO.

Radical website redesigns are often the result of:

– ‘digital transformation’ or re-platforming

– an existing website not being mobile-friendly, or

– an outdated look and feel.

All is not lost if you find yourself in this situation, though. There are things you can do to minimise the potential impact on performance that a radical site migration might have.

Devise a conversion rate optimisation strategy in the website planning process

You can think of ESR as conversion optimisation on a large scale. But even a radical design approach can incorporate CRO.

By adopting conversion rate optimisation methodologies you can plan how to iterate on your new designs as soon as you launch – and not lose control over the performance of your new site.

Just as you need to consider SEO during the scoping and planning phases, you should also lay the foundations for CRO at the same time.

And there’s a great opportunity to use insights from an existing live site to feed the experiments that you can run as soon as the new site goes live.

Too many times I’ve seen companies address conversion only after the website has gone live. By then it’s too late, and the result is usually a need for unwieldy design changes further down the track that could easily have been avoided.

A 10-step process

You shouldn’t underestimate the importance of planning and prioritisation in the overall CRO process.

In our methodology at Acquitain Digital, steps 1-8 are all about planning and prioritising: gathering data insights, user surveys, user testing results, site drop-off reports etc. The actual testing and analysis of results constitute only the last two steps.

Scrimp on the data gathering at your peril.

In my experience, the data gathering phase for CRO can sit nicely alongside keyword research for SEO and content in the pre-site build timeline.

In fact, many of the insights that this research yields are important for the site scoping process – for instance:

– Customer survey data

– UX reports

– User testing results

– Heuristic studies and research

– Page and form field abandonment data

– Live chat transcripts

– Heat mapping data

– Session replay recordings

This is all invaluable information that can feed straight into the wireframing/prototyping phase of the website redesign project.

How this approach fits into the overall redesign process

You can see our recommended process in the two diagrams below. The first shows the various steps within the scoping phase of the project, broken down by discipline (e.g. Search, CRO, content, UX).

The second outlines the development phase of the website redesign project, with the CRO disciplines included throughout the timeline. This shows how experiment ideation and prioritisation, tool set-up, QA and implementation can happen at the same time as the development sprints.

Agree and prioritise your areas of focus

With so many options open to you, you need to decide which areas you’re going to focus on, and then prioritise them.

Considerations, and questions you should ask yourself, are:

– What’s the primary objective of this website redesign?

– What are the pages on the site with the most traffic?

– What are the highest value pages on the site? (these are often checkout pages and product details pages)

– What are the areas we already know that visitors have difficulties with?

Once you’ve identified the answers to these questions, you’ll be in a much better position to plan your attack.

De-risking a re-design launch

I have worked on website redesign projects where we essentially ran an A/B test between the old site and new, rather than releasing a beta version.

This is an effective way of de-risking the site launch, since you can serve the site to a small proportion of the total audience, and rapidly identify and fix issues.

An example of this approach was with the UK general insurance company Direct Line.

If the issues on the new site are major, there is always the option of ‘throttling back’ the new site to 0% (effectively switching it off).

Most enterprise-level A/B testing tools (we used Adobe Target) give you the ability to scale up the size of the audience that you want to expose the new site to. So you can start with, say, 5% and throttle up on a weekly basis as you fix outstanding bugs.
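Your testing tool handles this allocation for you, but as a minimal sketch of how this kind of throttling can work under the hood (the hashing scheme here is purely illustrative, not how Adobe Target is implemented):

```python
import hashlib

def bucket_visitor(visitor_id: str, new_site_pct: float) -> str:
    """Deterministically assign a visitor to the old or new site.

    Hashing the visitor ID gives a stable number in [0, 1], so the same
    visitor always sees the same version, while the rollout percentage
    can be throttled up each week (or back to 0%) between releases.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    position = int(digest[:8], 16) / 0xFFFFFFFF  # stable value in [0, 1]
    return "new" if position < new_site_pct else "old"

# Week 1: expose roughly 5% of visitors to the new site.
assignments = [bucket_visitor(f"visitor-{i}", 0.05) for i in range(10_000)]
share = assignments.count("new") / len(assignments)
```

Because the assignment is deterministic, increasing the percentage only moves visitors from the old site to the new, never back again.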

The biggest challenge is establishing which success metrics you want to track – and not having too many.

Direct Line saw a 0.4% increase in car insurance quotes and a 0.1% increase in home and pet insurance quotes.

While this doesn’t sound like a resounding success, it was actually a great result.

Remember that this exercise has two purposes. The first is to check that the site is functionally sound.

The second is to guard against any significant drop in your primary metrics (such as sales conversion). Be aware that overall sales conversion may well drop at first, as customers get used to the new design and layout.

Your A/B testing tool and/or analytics tool should also be able to show you how the site is performing by segment. So if you wanted to view the performance metrics for all mobile visitors, for example, you can.
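Your analytics tool will give you these segment views out of the box, but the underlying calculation is simple. As an illustration, here is how segment-level conversion rates fall out of raw visit records (the field names and figures are hypothetical):

```python
from collections import defaultdict

# Hypothetical raw records, as an analytics export might provide them.
visits = [
    {"device": "mobile",  "converted": True},
    {"device": "mobile",  "converted": False},
    {"device": "mobile",  "converted": False},
    {"device": "desktop", "converted": True},
    {"device": "desktop", "converted": True},
]

def conversion_by_segment(records):
    """Compute the conversion rate per segment (here, device type)."""
    totals = defaultdict(lambda: [0, 0])  # segment -> [conversions, visits]
    for r in records:
        totals[r["device"]][0] += r["converted"]
        totals[r["device"]][1] += 1
    return {seg: conv / n for seg, (conv, n) in totals.items()}

rates = conversion_by_segment(visits)
```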

What are the success criteria?

Your website’s objectives, whether for the existing site or the new one, should be underpinned by a measurement framework. This framework is made up of the KPIs, or Key Performance Indicators, that give you a clear idea of how well you are doing.

When you are testing the success of the old site against the new, it’s important that you don’t track too many KPIs at once.

I learnt the hard way.

A few years ago I worked on a website redesign project where we were asked to report on all metrics within our measurement framework. The idea was to provide as detailed a report as possible of how we were tracking.

It quickly became an impossible exercise, though, as subtle differences in site design meant that in a number of instances we weren’t able to compare like with like. And the results were meaningless.

Focus on the primary metrics you usually report on. For an ecommerce business these might be:

  • Sales conversion rate
  • Newsletter subscription rate
  • Average order value

Pre-launch experiments

Once you are comfortable that the new website is performing at an acceptable level, it’s worth already having some initial experiments up your sleeve to test against the new control. This effectively means you will be running A/B/C tests, with:

– A being the old site

– B being the new site

– C being the new site including a slight variation (thoroughly researched, ideated, hypothesised & prioritised).

Isolate areas that are known to be high impact and high importance, and focus on them first. These are usually the pages that have the most traffic, such as product pages or the homepage.

Again, focus solely on one or two primary metrics, since differences in page design and the user journey will make it hard to compare more granular KPIs.

By this stage you should have a new website that is:

– Bug free and lightning quick to load.

– Performing at least as well as the old site from a primary metric perspective.

– Already enhanced by some quick and easy improvements.

You’ll then be in a position to switch off the old website and focus solely on the new. The sooner you can do this the better, as it removes the need for two separate sites and/or content management systems to be maintained.

Post-launch experiments

It’s a good idea to have a shortlist of A/B test candidates at the ready for launch.

These should be tests that you feel will make the biggest difference in terms of conversion uplift.

Most companies will have far more experiment ideas than time to run the experiments themselves – so it pays to be choosy, and to prioritise.

You should have a fairly good idea of initial test candidates already. Score them by ease of implementation and potential revenue uplift. (Or, if you want to be more thorough, use ConversionXL’s PXL prioritisation methodology).
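As a rough sketch of this kind of two-factor scoring (the experiment names and ratings below are made up for illustration; a fuller framework such as PXL adds more criteria):

```python
def score_idea(ease: int, potential_uplift: int) -> int:
    """Simple two-factor score: each factor rated 1 (low) to 5 (high).

    Higher scores get prioritised first.
    """
    return ease * potential_uplift

# Hypothetical experiment backlog with illustrative ratings.
ideas = {
    "Simplify checkout form": score_idea(ease=4, potential_uplift=5),
    "New homepage hero image": score_idea(ease=5, potential_uplift=2),
    "Rework product page layout": score_idea(ease=2, potential_uplift=4),
}
ranked = sorted(ideas, key=ideas.get, reverse=True)
```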

Generally, the further down the sales funnel, the more valuable the test will be in terms of revenue.

So an experiment run on a product details page on an ecommerce website, for example, will usually be prioritised ahead of one on the homepage.

And one testing the checkout process will be of higher value than, and therefore prioritised above, one anywhere else on the site.

Often an effective way to estimate the potential value of an experiment is to take into account the data you already have relating to the profitability of your sales funnel.

If you know the value of each part of the purchase funnel in terms of average sales revenue, and are able to estimate the number of visitors who would be bucketed into the experiment based on analytics data, you could get a decent estimate of the possible uplift.

A good question to ask is: If we were to see a nominal conversion uplift of 1% from this experiment, what would that equate to in terms of additional revenue?
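The arithmetic behind that question might look like this (all figures below are illustrative placeholders, not benchmarks):

```python
def estimated_annual_uplift(monthly_visitors: int,
                            baseline_conversion: float,
                            avg_order_value: float,
                            relative_uplift: float = 0.01) -> float:
    """Back-of-envelope extra annual revenue from a nominal conversion uplift.

    Plug in your own analytics figures; a 1% relative uplift is the
    default, matching the question above.
    """
    extra_orders_per_month = monthly_visitors * baseline_conversion * relative_uplift
    return extra_orders_per_month * avg_order_value * 12

# e.g. 50,000 visits/month, 2% conversion rate, £80 average order value
uplift = estimated_annual_uplift(50_000, 0.02, 80.0)
```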

Baby steps

You may find it useful to follow these ‘rules of thumb’ when you launch your first experiments:

– Consider running just one test at a time to start off with. You don’t want too many variables, and you want to see clearly which changes are having an impact.

– You might want to ring-fence a small proportion of traffic for this experiment too (e.g. 10%). This helps keep risk to a minimum, although you need to be sure to send enough traffic to the variant to make the experiment worthwhile.

– Think very carefully about which changes are going to result in the biggest upswing in conversions, and focus on them. You want to minimise wastage as much as you can, of time, cost and effort, so it’s critical that your approach is as lean as it can be.

– Be very clear about how long you will run your initial experiments for. If your website doesn’t get large volumes of traffic, you’ll need to make sure you run them for long enough to get a statistically significant, valid result.
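If you want a quick sense of what ‘long enough’ means for your traffic levels, a common rule-of-thumb sample size calculation looks like this (the baseline conversion rate and traffic figures are illustrative; a proper power calculation in your testing tool should be the final word):

```python
import math

def sample_size_per_variant(baseline_rate: float, relative_mde: float) -> int:
    """Rough sample size per variant for ~95% confidence and 80% power.

    Uses the common rule of thumb n ~= 16 * p(1-p) / delta^2, where delta
    is the absolute difference you want to be able to detect.
    """
    delta = baseline_rate * relative_mde
    return math.ceil(16 * baseline_rate * (1 - baseline_rate) / delta ** 2)

def weeks_to_run(n_per_variant: int, weekly_visitors: int,
                 traffic_share: float = 1.0) -> int:
    """Whole weeks until both variants have enough visitors."""
    visitors_needed = 2 * n_per_variant
    return math.ceil(visitors_needed / (weekly_visitors * traffic_share))

# e.g. 2% baseline conversion, aiming to detect a 10% relative lift,
# with 10,000 visitors/week and all traffic in the test
n = sample_size_per_variant(0.02, 0.10)
duration = weeks_to_run(n, 10_000)
```

Note how quickly the required duration grows for low-traffic sites and small expected uplifts; this is why ring-fencing only 10% of traffic can make an experiment impractically slow.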

Track and monitor success as you go

I mentioned above that you need to be selective with the metrics you use to track the success of your redesign project when you first launch your new website. You also need to do this on an ongoing basis, as you launch your initial A/B tests and incrementally improve the content and functionality.

However, depending on the type of tests you run, there will be different success metrics that you will need to measure.

It’s important to trace the objectives of your experiments back to the core business objectives of your website – and of your organization. If ancillary product penetration is a primary KPI for your company (as it was for us at Direct Line Group), then you need to make sure your experiments focus on this outcome.

It sounds obvious, but you would be amazed how often this gets missed.

My recommendation would be to use a test results document to log the results of each of the experiments you run.

This can be a simple spreadsheet that lays out, in chronological order, all the experiments you carry out in your testing program, with their results. It’s generally a good idea to include comments about the experiment, as well as agreed next steps.

It’s amazing how easy it is to forget these details down the track if they aren’t documented somewhere.

We would recommend that you include at least the following fields in the spreadsheet:

• Test ID

• Test description

• Success criteria (& maybe hypothesis)

• Experiment start and end dates

• Winning variant

• Comments, including uplift/downturn %

• Agreed next steps

This has served us as a great way to present results to stakeholders and the senior management team. However, you might also want to consider some data visualisation to support it, as the spreadsheet format can come across a little dry on its own.
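If you prefer to maintain the log programmatically rather than by hand, a sketch of a helper that appends results to a CSV with those fields might look like this (the field names mirror the list above; the example values are made up):

```python
import csv

FIELDS = ["test_id", "description", "success_criteria", "start_date",
          "end_date", "winning_variant", "uplift_pct", "next_steps"]

def log_experiment(path: str, row: dict) -> None:
    """Append one experiment's result to a simple CSV test log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # new file: write the header row first
            writer.writeheader()
        writer.writerow(row)

log_experiment("test_log.csv", {
    "test_id": "EXP-001",
    "description": "Simplified checkout form",
    "success_criteria": "Sales conversion rate",
    "start_date": "2024-03-01",
    "end_date": "2024-03-21",
    "winning_variant": "B",
    "uplift_pct": "2.4",
    "next_steps": "Roll out to 100% of traffic",
})
```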

Cultural benefits

As well as the performance benefits that a structured approach provides, there are clear cultural benefits too. Organisations that have adopted a test and learn culture have generally been more successful than those that haven’t, and including CRO strategies in your radical redesign can be a good way to introduce a culture of continuous improvement and experimentation.

Embedding a test and learn culture takes time, and needs to come from the top. You can build interest among senior managers by regularly reporting progress against benchmarks, and communicating the winners from your testing programme.

The C-suite respond to figures, so be sure to share tangible results.

Conclusion

Wherever possible we would advocate taking an existing website design and improving it iteratively. Push back on the urge to start from scratch, as performance will suffer as a result.

If you are left with no other option than a radical website redesign, though, do your best to devise a conversion plan that can be adopted as part of the overall project schedule. This will help you to align your redesign to business goals, giving you the best chance of success.

Have you redesigned your website recently? Let me know in the comments below how it went.