
The Softer Side of CRO: Avoiding People Problems

Posted on 6.30.2015

By Tim Ash, CEO of SiteTuners

Conversion rate optimization (CRO) is a diverse discipline. What we are really talking about is persuading human beings to act - a very complicated and nuanced task under the best of circumstances.

CRO is an unusually broad field that touches many areas: psychology, behavioral economics, neuromarketing, user experience, visual design, Web development, testing and statistics, direct-response copywriting, content management, system administration, quality assurance, analytics and reporting. The success of a CRO program, however, hinges on a company's ability to get diverse types of people to work effectively together. Here are some common "people problems" a Web professional will likely face on the CRO journey, and some helpful tips to avoid them.

Overall test project - Without someone who is specifically responsible for moving a project forward, it will fail. Typically, the person who owns the project is either the product manager in charge of the website section (or landing page) being tested, or the head of the optimization team. Product managers should try to let go of their need to control everything - remembering that the optimization team is made up of temporary assistants who are only there to help. Alternatively, the head of the optimization team needs to go out of their way to reassure the product manager that they are there to make them look good, and are not laying permanent claim to their real estate.

Identification of website problems - Often, the very people who created the site are asked to diagnose the problems with it. Coming up with ideas about what's broken should involve a wide array of people and techniques. This often includes user-experience specialists (conducting user research and observing people completing key tasks on a site), customer service reps (who deal with complaints and problems), Web analytics specialists (who can spot underperforming pages), sales people (who deal with prospect objections) and traffic acquisition managers (who decide how to message the upstream ads and content).

It is important to not only be inclusive, but also to encourage all of these people to contribute. Stakeholders should not reject or qualify the ideas at this stage, or many people will shut down and not contribute in the future.

Brainstorming of alternative content to test - Just like in the problem identification stage addressed above, it is important to give everyone a voice. This includes copywriters, visual designers, user-experience specialists and IT staff members who will support the implementation of any technical ideas. An important nuance at this stage is to make sure that the contributors understand that all ideas are welcome, but that they are being taken in an advisory capacity only. In other words, most of them will not survive into the final test plan. This helps to manage expectations upfront and minimizes future disappointment (and negativity that may follow).

Drafting of test plan - Optimization team members' roles up to this point should have been purely supportive - gathering ideas and facilitating contributions by others. Now the optimization team should meet and prioritize the most promising alternative content versions. This will involve considerations like the amount of steady traffic available, the complexity of implementation and other resource requirements, as well as the politics of the content and people involved. They need to proactively include the IT people at this stage to make sure that there are no surprises when it comes to implementation later. The optimization team also needs to make sure that the personal opinions of whoever is in charge do not heavily sway the test content. Their ideas should be given due consideration, and not much more - optimizers need to insist on their editorial independence.
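The prioritization step above can be sketched as a simple weighted scoring model. This is only an illustration of the idea - the field names, 1-5 scales, and weights are assumptions, not anything prescribed in this article:

```python
# Minimal sketch: rank candidate test ideas by traffic, expected
# impact, and implementation complexity (all rated 1-5). Weights
# here are illustrative assumptions.
def priority_score(idea):
    # Higher traffic and impact raise the score;
    # implementation complexity lowers it.
    return (idea["traffic"] * 2
            + idea["expected_impact"] * 3
            - idea["complexity"] * 2)

ideas = [
    {"name": "new headline", "traffic": 5,
     "expected_impact": 3, "complexity": 1},
    {"name": "checkout redesign", "traffic": 4,
     "expected_impact": 5, "complexity": 5},
]

# Sort candidates from most to least promising.
ranked = sorted(ideas, key=priority_score, reverse=True)
```

A shared, explicit formula like this also helps with the political problem the article raises: the ranking comes from agreed-upon criteria rather than from whoever is in charge.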

Reviewing and approving the test plan - Team members should not get too attached to their test plan, but at the same time, they need to fight for the fidelity of their ideas during the approval process. This is often the time that conservative branding and legal compliance people will enter the picture. Their main function and orientation is to minimize risk. In other words, their default position is to say "no" to most change.

Many optimizers will settle for the radically weaker test alternatives that can often emerge from such reviews. Therefore, optimizers have to decide which fights are worth fighting, and when to just scrap the test because the remaining ideas are too incremental and feeble.

Approving the budget and allocation of staff resources - To run a proper optimization program, a well-defined prioritization process for tests must be in place. An important part of this is to build the return on investment (ROI) case for each project, enlisting help to quantify the long-term value of a successful test, and understanding why it justifies the required resource investment. At the same time, optimization teams need to make sure that the expectation is properly set regarding the success of any individual test; there needs to be support for the notion that over time the testing program will produce results.
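The ROI case described above can be reduced to a short back-of-the-envelope calculation. All of the figures below are hypothetical placeholders for illustration - a real case would use the site's own traffic, conversion rate, and cost numbers:

```python
# Rough sketch of an ROI case for a single test.
# Every number here is an illustrative assumption.
monthly_visitors = 50_000
baseline_cr = 0.02           # current conversion rate (2%)
expected_lift = 0.10         # hoped-for relative lift (10%)
value_per_conversion = 40.0  # average value of one conversion
test_cost = 8_000.0          # staff time and tooling for this test

added_conversions_per_month = monthly_visitors * baseline_cr * expected_lift
annual_gain = added_conversions_per_month * value_per_conversion * 12
roi = (annual_gain - test_cost) / test_cost
```

Even a rough model like this makes the resource conversation concrete: the question shifts from "can we afford a test?" to "does the projected gain justify this specific cost?"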


Implementation - As optimizers implement their test ideas, it is critical that the test plan is followed faithfully. In other words, do not let a bunch of little deviations creep in during implementation by visual designers, Web developers and IT support staff. This can lead to "death by a thousand cuts," and undermine the clarity of test ideas.

Quality assurance - Brands should also make sure they have someone independent of the implementation actually view and validate all test versions for correctness on the appropriate devices. Also confirm that everything (including analytics and test tracking) works properly after the move from "staging" to the "live" environment. It is critical to make sure that the data in analytics and testing software agree. Significantly different numbers may be a sign of incorrect tracking or test implementation, and optimizers want to find this out and restart the test as soon as possible. They should work with Web analytics specialists to "watch the pot boil" until they are confident in the numbers coming through.
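The "do the numbers agree?" check above can be automated once visitor counts can be exported from both tools. This is a minimal sketch under assumptions - the function names and the 10 percent tolerance are illustrative, not a standard:

```python
# Sketch: compare visitor counts reported by the analytics tool
# and the testing tool. The 10% tolerance is an assumption;
# pick a threshold that fits your traffic volume.
def tracking_discrepancy(analytics_count, test_tool_count):
    """Relative difference between the two systems' visitor counts."""
    return abs(analytics_count - test_tool_count) / max(analytics_count, 1)

def tracking_looks_healthy(analytics_count, test_tool_count, tolerance=0.10):
    """True when the two tools roughly agree; a large gap suggests
    broken tracking and a test that should be fixed and restarted."""
    return tracking_discrepancy(analytics_count, test_tool_count) <= tolerance
```

Running a check like this daily during the first days of a live test is one way to "watch the pot boil" without relying on someone remembering to eyeball two dashboards.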

Data collection & analysis - Optimization teams should review their data on a weekly basis. Regardless of what the testing software dashboard indicates, they should resist the urge of some team members to peek at the results early. Until a test has collected a large number of visitors, the data can be very misleading and put a team on an unnecessary emotional rollercoaster.
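One lightweight way to enforce the no-peeking discipline above is to gate result reviews behind a minimum sample size per variation. The threshold of 1,000 visitors here is an illustrative assumption, not a universal rule - the right number depends on the baseline conversion rate and the lift you hope to detect:

```python
# Sketch: only unlock the results review once every variation
# has collected a minimum amount of traffic. The threshold is
# an assumption for illustration.
MIN_VISITORS_PER_VARIATION = 1000

def ready_to_review(visitors_by_variation):
    """True only when every variation has enough traffic for the
    numbers to be worth looking at."""
    return all(count >= MIN_VISITORS_PER_VARIATION
               for count in visitors_by_variation.values())
```

A simple gate like this turns "please don't peek" from a plea into a process: the weekly review simply reports "not enough data yet" until the test has earned a look.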

The softer, human side of CRO projects can be a source of great advantage if handled properly. Optimizers should start paying attention to the communications and relationships among team members - then they will truly be an optimizer in the full sense of the word.

Tim Ash is the CEO of SiteTuners, Chair of Conversion Conference and bestselling author of "Landing Page Optimization."
