Don’t test everything with Oliver Palmer

AI-Generated Summary

Effective experimentation requires overcoming the urge to prove personal cleverness and embracing data-driven decisions. Shifting from traditional practices to hypothesis-driven methods is challenging but essential, requiring genuine commitment and engagement from all stakeholders. In Conversion Rate Optimization (CRO), avoiding unrealistic expectations is crucial: rather than aiming for constant big wins, the focus should be on strategic experimentation guided by user research. The future of CRO involves mainstream adoption, akin to SEO, with optimization teams integrated into organizations. Despite talk of AI-driven optimization, the core of CRO progress remains rooted in consistent and impactful experimentation. For further insights, visit oliverpalmer.com.

AI-Generated Transcript

Oliver Palmer 0:00
The mistake that I see people making over and over again, is to think that experimentation is a way that they can prove how clever they are. And I know this mistake, because I’ve been there, I’ve done it. I used to think I’ve got this great idea, and it’s gonna have a huge impact on conversion, and I can prove it, and it’s gonna be wonderful. And of course, it’s got no basis in reality. It’s just a thing that I thought of. And when you do that enough, you realize that your own head is a very bad filter, and it’s a bad place to come up with experimentation ideas.

Sheldon Adams 0:38
All right, let's get this thing going. Oliver, welcome. Excited to have you on here, have a little conversation and chat CRO. Before we get started, some quick introductions: I'm Sheldon Adams, I'm the head of growth here at NaVi. We're a Shopify CRO agency, and we're joined today by Oliver Palmer. Oliver, to kick things off, just tell us a little about yourself.

Oliver Palmer 1:05
Cool. Yeah, thanks, Sheldon. So I am a long-time CRO practitioner. I ran my first A/B test in 2008, back when Google Website Optimizer was the first tool that really allowed any idiot with a Google login to go in and run button tests, essentially. I was working in various ecommerce roles and just loved the idea of it. I think there's just something about my personality, probably like for a lot of us, where the idea of using data and experimentation to inform decision making really, really resonated with me. And from that moment onwards, as I say, I made all the mistakes, ran all the bad tests, thought I had discovered absolutely brilliant insights I had not, but was always really enamored of A/B testing and keen to get more into it. And then about 10 years ago, almost to the day, I was living in London, and I got a call from a recruiter saying, hey, are you interested in being the first in-house optimization manager, I think that was the title then, at Britain's largest mobile telco. And I was like, yeah, sounds great. That was long before there was any real kind of education around this stuff. I think CXL was just sort of a blog, there weren't a lot of resources, everyone was making a lot of the same mistakes, and we just sort of had to feel our way through. And I was really fortunate to work in that business. It was a telco that had just come out of a merger of the French national telco and the German national telco. They were really focused on growing conversion, had a lot of issues with a very expensive, brand-based website, and had a lot of buy-in from senior people within the business. I worked there for a couple of years running experiments, and was intrigued to discover that the tough stuff was not the technical side of things, it's the cultural side of things. It was helping people to learn that it doesn't matter if they don't know, and helping to instill that sense of humility in an organization. So many people, particularly senior execs in big businesses, in my experience, set a lot of store by their experience and their battle-hardened wisdom. But one of the things you learn when you start running experiments is no one knows anything. And I'm still enough of an idiot to think that I know things, but I don't; I'm slowly learning that lesson. So I went down this path of working with different organizations. I went to a big telco, sorry, a big broadsheet newspaper in the UK called The Telegraph, and subsequently worked with their product and UX team doing a lot of the same thing. I spent many years working with Kmart, a big Australian department store, fortunately completely separate now from the American one, which is still teetering on the brink of bankruptcy. Do they still exist, hey, Sheldon?

Sheldon Adams 4:19
They may exist somewhere. I think they’re basically out of business. Someone might have bought them out of bankruptcy. But

Oliver Palmer 4:26
I can't get over these American, sort of, zombie corporate structures; businesses are allowed to go on in zombie mode for a long time in the US, they definitely are. But yeah, they started, I think, as a sort of Australian offshoot, but had been separate for many, many years now, a very successful, interesting business, and I spent many years working with them building up a culture of experimentation. And as I said, I've found that that's really where lots of my work is these days. It's not the technical stuff, because that's not hard. The hard stuff is, how do you change your ways of working to be more hypothesis-focused? How do you learn to embrace that sort of uncertainty, and to act on and correctly interpret the results? That's what I spend a lot of my time doing these days.

Sheldon Adams 5:18
Thank you. That was a very tough transition for me personally, going from that real small tinkering, non-research-based testing, there was nothing behind it the first couple of years I was in the industry, to now, where it's a much more nuanced, sophisticated, and mature approach. So you sent over an anecdote ahead of time about a client where you used testing almost to free up time, figuring out what not to do. I was hoping you could talk through that for the listeners, because I think it's a really cool way of using CRO experimentation that most people probably don't even think of doing. So yeah, you want to dive in on that real quick?

Oliver Palmer 6:11
Sure. So there are certain websites where it's really hard to use traditional A/B testing to increase conversion rates; it can be really, really tough. In my experience there are a couple of phases of experimentation. The first phase, and the one that gets all of the attention, is fixing broken stuff: things that are obviously broken, you can find them, you can fix them, and that's where you get those big wins that get people really excited. But maintaining momentum over the long term, as you get more sophisticated, gets really tough, and you find that your win rate, quote unquote, goes down considerably. I should keep these stats to hand because I always forget them, but I know that Ronny Kohavi has compiled a lot of stats from really mature organizations that are really great at experimentation, the classics like booking.com, and he worked at Bing and Airbnb and whatnot. Google have reported different numbers, and everyone says, look, most experiments don't move the needle positively. Some move the needle negatively, which in my view is just as good to know, and many are flat. People really struggle with flat experiments, they find them a bit deflating. And everyone talks about the huge wins, and when they don't materialize in the long term, or they diminish, people go, what are we doing, are we wasting our time running all these experiments? And I had this realization a couple of years ago working with a client where, yeah, we'd fixed all of the broken stuff. They were also really uniquely positioned in the market: they had great products at a great price, and there weren't a lot of other places you could go to get those products. In those cases, those sites are really hard to optimize. When people will walk over hot coals to get what you're selling, you can make all of the usability tweaks you want, but it kind of doesn't really make a difference. And so we entered a patch with this client where we were just having flat test after flat test after flat test. And everyone was getting really deflated, because we were testing interesting things, things that were grounded in research, we were doing everything correctly, and we probably actually were making the site better, but it just wasn't translating into commercial impact. And so this idea actually came from my client at the time, who said, well, hold on a minute, we spend all this time doing things that we don't test, that we think are going to move the needle, that are going to have a big positive impact. What if we just experiment with not doing them? Then we can save time and reallocate it to places where we know it's going to have an impact. And this set us off down this path of running all of these tests. We itemized all of the things that the various teams did. This was with the, quote unquote, online team at a big retailer: they had category managers, merchandisers, all that sort of stuff, people that are really working on presenting the best sorts of products on the website and trying to squeeze the most out of them.

So merchandisers will often do things like making sure that the products people are looking for appear right at the top of the category page, or the best offers, or they'll tweak search results to make them more relevant, all sorts of things where best practice and intuition say this is going to be useful, this is going to move the needle. The first one we did we called tall banners. On their category pages they had, completely without thinking, for years been working with their marketing team to commission and pay for these photos of people using their products, wearing the clothes, interacting with the products, like catalogs. The idea had always been that people resonate with seeing these lifestyle images, and that it would be good for conversion; it was never really questioned. But it was really cumbersome to do. It was expensive, they had to pay for the images, but more than that, they had to maintain them: stock ran out or turned over very quickly, so they were always having compliance on their back saying, hey, this product that's on the page, we're not selling that anymore, you have to remove it, and they'd have to get a new photo. We worked out that somebody was spending 40 hours a month just changing these images on category pages, because they thought it would have an impact. So we just tested not doing it. We replaced the banners with some text or a plain image, did that across the whole site, and lo and behold, it had no impact. That saved 40 hours a week, sorry, 40 hours a month, and we were able to parlay that into other activities which they knew did have an impact, particularly around emails and so on. We just systematically worked through all of these sorts of opportunities to make sure that everything the team was doing actually did move the needle. It was really eye-opening for me that we could use experimentation in that way, and it's really informed a lot of the experiments that I've run since.
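What Oliver describes is essentially a non-inferiority test: instead of hoping a variant wins, you check that removing the work doesn't lose by more than a margin you can live with. Here is a minimal sketch of that check in Python; the traffic numbers and the margin are entirely hypothetical, not from the episode.

```python
import math

def diff_ci(conv_a, n_a, conv_b, n_b, z=1.96):
    # 95% confidence interval for the difference in conversion
    # rate (variant minus control) between two arms.
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return (p_b - p_a) - z * se, (p_b - p_a) + z * se

# Hypothetical numbers: control keeps the banners, variant drops them.
low, high = diff_ci(8_000, 200_000, 7_940, 200_000)

# Non-inferiority margin: tolerate at most a 0.2pp drop in conversion
# before the 40 hours a month is judged worth keeping.
margin = -0.002
if low > margin:
    print(f"CI ({low:.4f}, {high:.4f}) clears the margin: stop doing the work")
else:
    print(f"CI ({low:.4f}, {high:.4f}) can't rule out a meaningful drop")
```

The point of framing it this way is that a merely "flat" result isn't enough on its own: the whole interval has to sit above the margin, otherwise an underpowered test would let you stop doing work that was quietly paying for itself.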

Sheldon Adams 11:57
That's awesome. I love hearing those atypical wins, because I feel like 90% of the conversation in the space is, did it lift conversion rate, and if it didn't, it's a failure. But saving, I mean, you saved 100 hours a month over a couple of different things like that; depending on how much they pay their people, that is a huge efficiency gain, to say nothing of the cost savings. That's just a really cool and unique way of putting it, so I'm really glad you brought that one up and could expand on it. And that kind of leads into my next question, because you have a great breadth of experience here, both in-house with some very big teams that were very motivated and bought in, as well as on the freelance and consulting side. So I'd be really curious to get your sense of the commonalities: what do good clients, or good internal teams, look like in terms of the buy-in and the traits that they have? How are they different from the bad ones? And then a pre-emptive follow-up: how quickly can you tell?

Rommil Santiago 13:18
This is Rommil Santiago from Experiment Nation. Every week we share interviews with, and conference sessions by, our favorite conversion rate optimizers from around the world. So if you liked this video, smash that like button and consider subscribing; it helps us a bunch. Now back to the episode.

Oliver Palmer 13:33
So I mean, I think you can get a really good idea very quickly. When I assess an experimentation program, the first thing I ask is, let's have a look at some of your documentation. Even just looking at the way that hypotheses are formed and metrics are selected is a really dead giveaway as to a high-performing experimentation team versus one that's just going through the motions. In terms of your question about what separates good teams and bad teams, I think the essential ingredient of a high-performing experimentation program is executive buy-in, and it can't just be token. We often see situations where the CEO and the CMO went to the Adobe Summit, watched George Clooney talk, did some experimentation things, bought all the stuff, said, we're doing this now, got really excited about it, and started demanding big results. It's more about ensuring that all the right people care about experimentation and even have it as part of their role, one of their KPIs. One of the things that I've found really important with many of my clients is that the people who are tangentially related to the experimentation program, let's say UX designers or certain people in the product team, have it written into their KPIs that they have to initiate X amount of experiments per month, or quarter, or whatever. So even if they're not on board, they have to get up to speed pretty quickly, and want to seek it out and want to learn. Having that organizational impetus is invaluable, compared to so many organizations that buy the tools, launch a few bad tests, get discouraged, and it sort of trickles out. They do nothing and say, that didn't work for us.

Sheldon Adams 15:48
Yeah, that makes total sense. I've definitely been there, having a senior-level person go to a summit and then, okay, we're going to check that box, we're doing this. It peters out pretty quickly once there actually has to be subsequent investment or a real focus on it. So that makes a lot of sense and is really helpful.

Oliver Palmer 16:14
Or just not expecting those big results immediately. It's one of those things that vendors and agencies alike have been guilty of a lot over the years: talking about those huge conversion rate increases that they use to sell the tool, to build the business case for the program, or to sell in the agency. That just creates this huge burden of expectation, and often programs crumble beneath it. One of the things I'm often working with my clients to impress upon them is that the promise of experimentation, the fact that you can try multiple things simultaneously and see what works, that's enough. Imagine if you could do that in life. It's a magical proposition, as far as I'm concerned. You just don't need that hype.

Sheldon Adams 17:05
Yeah, no, absolutely. Funny, I actually saw a targeted ad for an A/B testing platform today, and their whole thing was, in less than 90 days, more than doubling your conversion rate, based on some pretty speculative data and case studies. It feels like the space is still littered with that. And it just makes it harder, I think, to get the buy-in, or to understand what reality should be like, what a big win actually looks like.

Oliver Palmer 17:45
Yeah, 100%. For a long time, I felt really sheepish about calling myself a CRO, and I still kind of do, because there are so many shady operators in the space. I think there are fewer now than there used to be, but there have been a lot of less-than-scrupulous practices over the years, as I'm sure you've seen too.

Sheldon Adams 18:07
Oh, yeah. So, to that end, the next question I want to steer into. Like you said, there are a lot of unscrupulous people that have ventured into this space, and some of it is just complete nonsense, some of it's good; it's really hard to tell without seeing the results. But across the breadth of your time in the industry, are there any consistently bad bits of advice that you've seen given, or things where, when you hear them, you're like, man, I wish I could just slap the person who says that?

Oliver Palmer 18:46
Oh, look, just to reiterate, I do think things are getting better, much more mature than they used to be. The thing that springs to mind, and I don't see this as much as I used to, is this test-everything mentality, which is very problematic. You don't become booking.com overnight; you're not running hundreds of experiments a week or a month. For most organizations, experimentation capability is finite. There are only so many experiments you can run, and I think people need to be realistic about what they can do, and be very informed about how they choose those things. The mistake that I see people making over and over again is to think that experimentation is a way that they can prove how clever they are. And I know this mistake because I've been there, I've done it. I used to think, I've got this great idea, and it's going to have a huge impact on conversion, and I can prove it, and it's going to be wonderful. And of course, it's got no basis in reality; it's just a thing that I thought of. When you do that enough, you realize that your own head is a very bad filter, and it's a bad place to come up with experimentation ideas. So the one thing I say to people again and again is, be very judicious about what you experiment with. Don't expect that you can test everything; think very clearly about what you experiment on. And the best place to find those concepts, as painful as it probably is to hear, is user research: talking to users, running those qualitative research sessions. Once you get three, four, five people telling you the same thing, that's a really great filter, and I think that's where you should be doing your discovery. Don't be discovering using experimentation. Discover in qual, validate in quant, is really my gospel.
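"Validate in quant" has a concrete price tag: a properly powered A/B test. A rough sketch of the sample-size arithmetic, assuming a hypothetical 3% baseline conversion rate and a 10% relative lift surfaced by qualitative research (neither figure is from the episode):

```python
from statistics import NormalDist

def sessions_per_arm(p_base, rel_lift, alpha=0.05, power=0.80):
    # Sessions needed per arm for a two-sided two-proportion test
    # to detect a relative lift of rel_lift over baseline p_base.
    p_var = p_base * (1 + rel_lift)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96
    z_b = NormalDist().inv_cdf(power)          # ~0.84
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return (z_a + z_b) ** 2 * variance / (p_var - p_base) ** 2

# Hypothetical: 3% baseline, validating a 10% relative lift.
print(round(sessions_per_arm(0.03, 0.10)))  # ~53,000 sessions per arm
```

Numbers like these are why capability is finite: at, say, 100,000 sessions a month, a site supports roughly one properly powered test of that size at a time, which is exactly why a qualitative filter in front of the test queue matters.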

Sheldon Adams 20:57
Totally agree. And I was totally that guy early on in my career. The first year or so, I was like, no, I'm not doing surveys, I'm not doing user interviews; if it's not a heat map or a funnel drop-off, it's not worth testing. And now I'm almost the exact opposite. So I definitely align there. Just asking people makes everyone look way smarter, and you get better stuff. So I'm glad you said that, for sure.

Oliver Palmer 21:32
And I think the point is, everyone sort of has to go through that journey. What I oftentimes find when working with clients is that I need to let them make those mistakes for themselves, and see why it's a bad idea, because that's the only way, I think, to get it out of your system.

Sheldon Adams 21:50
That little bit of maturation, once they get to the other side of it, it's a totally different ballgame. So I want to pivot a little bit here and give you a chance to brag about yourself. I'm always curious: there are a lot of good results out there, everyone has them, but if you had to pick one, like you're giving your TED talk or your keynote or whatever, and you have to point to something, what's the story you're telling, or the result you really want to highlight? And I guess, how did you do it?

Oliver Palmer 22:26
I mean, I feel like it's almost not worth talking about. As we've sort of touched on before, experimentation is like playing the lottery, oftentimes. There are things that you can do to increase your odds, like grounding your experiments in research, but those experiments which get double-digit conversion lift, they happen like every couple of years. I've had a few, but they're really rare, and everyone talks about them, which makes people think they're a lot more common than they are. So I almost like to shy away from those. But there's one example that springs to mind, an experiment that I've spoken about a lot, because I think there's a lesson in it. Many years ago, when I was working for EE, the telco in the UK, one of the things I noticed when doing my first initial examination of their website was that the funnel wasn't tracked correctly. There was a step that was missing as people went from adding a product, a phone handset and a plan, into their cart, to going into the billing and delivery section, and that was where they added insurance. And I noticed that there was an enormous drop-off at this stage, huge, I can't remember what the actual figure was. Actually, you know what, I've completely hallucinated that story, it was too long ago. There wasn't a drop-off at that stage; there was nobody adding insurance at that stage, and that's why it had been omitted, because it wasn't part of the funnel. So I just asked around. I spoke to some of the people in stores, they had lots of retail stores everywhere, and found out, you know, how much insurance are you guys selling? And they were selling an awful lot more than we were selling online. I watched how they sold insurance: they said, would you like to buy insurance? Fairly simple. Something like 30% of people opted in, and online it was like 0.1% or something. And I went and looked at the funnel, and I saw that the step to add insurance wasn't called insurance; it was called Clone Phone. This is the thing: in selling a commodity product like telephony, everyone's selling the same thing, essentially, so brand marketing's really dominant, and they want to add their own spin on it, so they call it lots of fancy things. If I look at Telstra, the big mobile telco here in Australia, their insurance thing is called, like, Stay Connected Plus or something silly like that, because they bundle up all these other services to make it less comparable to other telcos. Clone Phone was insurance plus cloud backup, so if you lost your phone or it got stolen, whatever, they'd give you a new one, plus it's got all of your stuff on it. The problem was, no one had any idea what this was. So I ran some tests using WhatUsersDo, which I think doesn't exist anymore, but it was a British remote usability testing site. I put five people through it and said, find your handset and add insurance. And maybe one person, but basically no one, was able to find insurance. There was one elderly northern man who spent, I think, an hour trying to add insurance.

He was Googling, he was searching on the forums, he was doing everything; he was really determined. I think he thought he wasn't going to get paid if he didn't succeed, and he just struggled. And so we ran one of those famous experiments where we changed two words, Clone Phone, into the word insurance, and it generated millions and millions and millions of pounds of revenue. The kicker is, brand marketing still didn't go for it. But that's the perils of working in a large organization.

Sheldon Adams 26:47
Oh yeah, I believe it, I've lived that one. Sometimes you can't win that battle. But that's a really cool story. I love it, especially when you get into combining the brick-and-mortar and ecommerce experiences, literally seeing how people are buying the product and then doing your best to replicate that online. It's such a simple thing that almost no one does. I mean, a lot of places don't even have brick and mortar anymore, but still, just knowing how people think about it is shockingly rare in the industry.

Oliver Palmer 27:25
Yeah, I was really inspired by the guys at Conversion Rate Experts. They've got that great book, Making Websites Win, which I think is just a distillation of so much of their wisdom, and I was reading their blog really avidly back in those days. You've got it on your shelf behind you; yeah, I've got my copy here too. Back then, 10 years ago, their blog was one of the great places to learn about CRO, and they've always been real advocates for doing research. I can't think of the term, but there's a Japanese term, I think it comes out of the Toyota system, which translates to "go to where the thing is happening." If a machine breaks down on the assembly line, don't talk about it in a meeting room upstairs; go down to the production line, talk to the workers, find out what's happening, and then you'll fix the thing. So I was very inspired by the sort of research that they've done. Those guys are almost like method actors; they go way beyond normal usability research. They talk about how, when optimizing for a weight loss program, they join the weight loss program; when they're working on a dating app, they start dating, you know, they find the single guy in the office and they make him date on the app. They are very diligent in the sort of research they do, and I think a really good model for all the rest of us.

Sheldon Adams 28:49
That's actually one thing, I mean, we don't do this with all clients, but pretty much anytime it's feasible, I try to make a point to buy the product and go through the shopping experience, the customer experience. What's the onboarding? What questions do I have? And without fail, it exposes some pretty big gaps between what the brand and their leadership think is happening, what they think people understand, and what is actually happening. And the real estate between those two is a lot of gold.

Oliver Palmer 29:25
100%. I mean, as the saying goes, you can't read the label if you're inside the jar. As consultants, as practitioners, as people that are external to a client's business, we bring such great benefit just by having a fresh perspective, by being outside of the business, not having, what do they call it, the curse of knowledge.

Sheldon Adams 29:48
All right. So I want to ask you one more question here, speaking to what you mentioned earlier about the way things are progressing and, seemingly, in many ways improving. To that end, three to five years from now, how do you see the next iteration of CRO and experimentation? What's different if we're having this conversation in 2028?

Oliver Palmer 30:17
Hmm. I mean, if I think back to what's happened in the last 10 years, the big thing, particularly witnessing a lot of it from Australia, which has really lagged behind the rest of the world in terms of maturity with experimentation, is that it's become a very established practice in many major organizations. Ten years ago, almost no one, certainly in Australia, had an embedded optimization team; now most large enterprise organizations do, and I think we're going to see more of that mainstreaming of the process. We can look to SEO as a parallel: it used to be that SEO was absolutely the preserve of agencies and consultants, but now it's a very normal and important in-house function that will, you know, align under the CMO, and so on. So I think we're going to see more of that mainstreaming. Beyond that, I don't expect to see anything truly radical. I've done a good job of ignoring anyone that talks about AI and optimization; it just seems like bullshit to me. I don't see any value in it. As I say, the promise of experimentation, being able to try different things simultaneously, to be able to validate your approach and your strategy, is brilliant enough without, you know, plugging in DeepMind.

Sheldon Adams 31:57
I think we're on the same wavelength there. It feels like a consistent, steady march versus something that's truly going to be completely revolutionary. And maybe we'll both look really silly in a few years, I have no doubt. But you've been at this longer than I have, so I definitely subscribe to your theory there. So, Oliver, I wanted to wrap by just giving you a chance: anything you want to call out, or point to where people can go find out more about you, for those still listening. And yeah, just wrap up with any closing thoughts you had.

Oliver Palmer 32:37
Yeah, hopefully there's one or two still listening. If anyone wants to connect with me, obviously LinkedIn is a good place. I blog over at oliverpalmer.com and have a mailing list; I send out interesting stuff periodically as well. But yeah, oliverpalmer.com is a great place to go.

Sheldon Adams 32:54
Well, Oliver, I really appreciate the time here; glad we could make the time zones work. It's been great chatting with you, and I hope everyone enjoyed it.

Oliver Palmer 33:03
Wonderful. Thanks for having me, Sheldon.

If you liked this post, sign up for Experiment Nation's newsletter to receive more great interviews like this, memes, editorials, and conference sessions in your inbox: https://bit.ly/3HOKCTK

