A Conversion Conversation with UserConversion’s David Mannheim
During a recent chat with David, the hands-on founder of a company that specializes in conversion rate optimization, we covered a lot of ground, ranging from his perspective on how to build a kick-ass optimization team to why personalization hasn’t taken off yet and some of the misconceptions around optimization.
Rommil: Hi David — thanks for chatting with me today! How has the new year been treating you thus far?
David: I have to say, the first day of the new decade has been outstanding! Lots of optimism for what 2020 has to bring, and I’m looking forward to hitting the ground running.
Could you share with us what you do over at UserConversion?
I’m the founder at User Conversion and my day-to-day position within my business is primarily as an optimization consultant. Practice what you preach and all that — but I’m best placed as a man on the ground in the trenches, less so the leader of an organization. So I support and help orchestrate a select number of clients, such as Sports Direct, Mamas and Papas and Bravissimo, to continually improve their online experience through optimization, experimentation and personalization.
“We always have operated in a pod-based structure, collaboratively bringing together a series of specialist skill sets that best represent optimization for the client; usually in the form of UX research, UX design, engineering, and a web analyst, circulating around an optimization consultant.”
There is a growing number of CRO agencies springing up these days — what makes User Conversion stand out?
Aren’t there just! A varying number of agencies are appearing on the optimization scene, which highlights its importance in the digital landscape. We are a specialized agency, not an integrated one. We have a firm belief that when you focus on something, such as optimization, you are inherently better at it.
That aside, we built the business on the belief that no one person should own, or can be spread across, the wide variety of skill sets that conversion rate optimization demands. We have always operated in a pod-based structure, collaboratively bringing together a series of specialist skill sets that best represent optimization for the client; usually in the form of UX research, UX design, engineering, and a web analyst, circulating around an optimization consultant. The way we’re set up is inherently based on the belief that optimization is not a specialism, but a series of specialisms.
Our experience, too, is one that is primarily focussed on e-commerce based optimization — because of our broad exposure to clients such as Travis Perkins, Avon and Papa Johns, we have the ability to advise on re-platforming, behavioural software, e-commerce ops etc. This is demonstrated in our hiring policy, too, hiring those with varied e-commerce experience.
There are a few other standout approaches we champion, too, but we haven’t got very long together: our objective-based approach, designed in sprints, circulating around collaborative pods, all with a Northern (UK), humble and pragmatic attitude.
“Optimization should be a concept that is widely adopted and understood, not one that is seen as a project; those that see it as the latter, fail. Period.”
As you select clients to work with — what are some of the flags that you watch out for that tell you that a client may not be ready to work with you?
There are quite a few golden rules to look out for; some immediate red flags and others are more deep-rooted and therefore harder to spot.
The communication, education and maturity of optimization as a concept is the most important.
We need to ensure that we’re speaking, not just with the ‘right’ people, but with all those involved and interested in the process as a whole. Just because a stakeholder isn’t involved doesn’t mean they’re not interested, and vice versa. The more eyes you have on you and the wider your reach is, the better you’re able to educate. As a result, prospective clients who invert that structure — e.g. communicating with just the CEO in smaller organizations, or a specific head of department in a larger organization — tend to be those that we avoid. Optimization should be a concept that is widely adopted and understood, not one that is seen as a project; those that see it as the latter, fail. Period.
We do a number of things to address this, like stakeholder identification surveys, a series of stakeholder interviews, and workshops with various stakeholders. These all help in understanding the current levels of maturity of optimization and user behaviour, as well as helping to understand what the actual problems are, or why you were hired. On the latter in particular, there are various answers to look out for: baseline conversion rate improvements, being a white knight for revenue increases, and so on.
Truly understanding what an effective conversion rate optimization process can bring, and what it is not intended to do, creates the best relationships and therefore results.
You have a multidisciplinary team, how do you fit experimentation in with UX research and analytics?
Experimentation is naturally a combination of proving or disproving hypotheses identified by user behaviour; how users behave, and why, is the cornerstone of testing.
The “how” (or what) is identified through either quantitative or qualitative means. What are users doing? How do we know it’s an insight or story to evolve? Can it be quantified? Does it have a sample size that deems this behaviour as a priority or, indeed, worthy of testing? These are types of questions that help us understand whether a piece of behaviour is of interest to experiment with.
The “why” helps us understand that behaviour more. Is there specific user anxiety behind that behaviour? Is there a core motivation behind that behaviour? This helps us create solutions more easily, targeted at that specific user problem, therefore enabling us to create a hypothesis or test.
Ultimately, experimentation is just another form of insight or data collection. It’s designed to learn more. So analytics (what happened) and research (why did it happen) feed into a hypothesis for testing, but an AB test is a practical means of what and why — if analyzed effectively.
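To make the “analyzed effectively” part concrete, here is a minimal sketch of one common way an AB test’s “what” gets quantified — a frequentist two-proportion z-test on conversion counts. This is an illustrative example with hypothetical numbers, not a description of User Conversion’s actual tooling or process.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an AB test.

    conv_*: number of conversions; n_*: number of visitors.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    # Standard error of the difference in proportions
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant B converts at 5.5% vs control's 5.0%
z, p = two_proportion_z_test(500, 10_000, 550, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the difference is not significant at the conventional 5% level — a reminder that a “winning” variant on raw numbers can still be noise, which is exactly why the qualitative “why” matters alongside the quantitative “what”.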
“…analytics (what happened) and research (why did it happen) feed into a hypothesis for testing…”
As more and more companies start to build out Experimentation practices in-house, what advice do you have for them in terms of building a great optimization team?
Similarly to how we’ve built our teams, to reiterate, optimization is not a specialism, but a series of specialisms. It’s a concept, not a department. Therefore, it should percolate through every area of the business; theoretically speaking. It is interesting how teams are set up — the structure, the culture, the tech, the skill sets, the strategy — and ultimately it comes down to the maturity of the organization.
It is, therefore, completely contextual.
Personalization is on top of a lot of marketer’s minds — but very few can succinctly define it. Could you share with us how you define personalization and why you feel it is something we should invest in?
I often talk about “why is personalization talked about, but not adopted?” The truth is, it’s on everyone’s lips. Google it. 2014, 2015, 2016, 2017, and so on were all classified as “the year of personalization”, yet none have come to fruition. And, apart from tech companies where personalization at scale is a virtue within their business (Spotify, Netflix, Amazon etc.), can you name a commerce business that is really doing it well?
Personalization, like optimization, is a concept, not a department. It’s complex, and not a plug-and-play process. Why? There are two reasons why personalization is not adopted effectively enough to mature.
- Over-valuing. Many marketers over-value the impact of personalization or misinterpret its purpose. Many believe that other companies’ case studies and success stories can be easily transferred to their own business — but it’s contextual.
- Under-preparing. Just 8% of companies believe personalization is integrated into their tech stack. The technology, data and team (skills) all required for such a process (note: not a ‘project’) are contextually complex.
It’s because of this cognitive dissonance that Gartner predicted that “80% of marketers will abandon personalization efforts by 2025”. Crazy.
As a result, my advice is that personalization, like optimization, is a concept. It needs defining and simplifying for your business, because it is contextual. How can you better personalize your user experiences?
“I often talk about ‘why is personalisation talked about, but not adopted?’”
You also teach CRO at CIM. As you’ve taught students over the years, what top 3 things continue to surprise them year after year?
Complexity. Contextuality. Creativity.
I don’t say those words just to get 3 C’s (although I really should trademark something like that), but it’s true.
I’ve taught companies from all over the UK on conversion rate optimization, and even though the course is promoted as “advanced”, it attracts a range of companies, from those with fewer than 10 people where the CEO is the founder-owner, to those with 1,000+ employees like Jaguar Land Rover or Hutchinson. Having that variety of people in one room, talking about the same subject, while trying to make it relevant is by far the most difficult challenge. An approach for company A of a particular size and maturity is not the same as for company B with a different size, maturity, culture, skill set, tech stack and so on.
Despite evangelizing simplicity, I always get a lot of surprised faces and comments about how complex optimization is. It’s a complex subject. Interestingly, I rarely speak about experimentation too much, because testing is just one segment of a wider toolkit and approach. The complexity of what’s involved is always a surprise to all students.
Finally, creativity. Wow. People read too much. I really do believe that the internet, as much as it can accelerate one’s learning, can also hinder it, filling it with ‘guides’ and ‘how-to’ tactics for what to do in situation X or how to get the most out of approach Y. Going back to the first point: it’s contextual. Being creative, and removing yourself from a conditioned way of thinking, is a topic I’m really interested in. We assume we can’t solve this problem over here because of the lack of real estate on the screen. Or we think in templates (PLP, PDP etc.) because we’re used to that, when we should think in user problems. I find the lack of creativity fascinating, and I personally blame the wealth of content available online; the good, the bad and the ugly.
“People read too much. I really do believe that the internet, as much as it can accelerate one’s learning, can also hinder it.”
It’s time for the Lightning Round! Frequentist or Bayesian?
What are your favourite optimization tools?
HotJar for tactical behaviour, Usabilla for wider attitudes and behaviour, AB Tasty for easy experimentation, Dynamic Yield for integrated personalization.
What tests do you wish people would run less often?
Tests based on usability (making things easier). Think about how to change or mould user behaviour, not necessarily how to make the behaviour easier.
Definitely, easier is never the end goal. Thank you very much for your time today — I hope the rest of 2020 is as outstanding for you as the first day!