Podcasts with Experimenters
CR-No: Episode 1
The following is an auto-generated transcript of the podcast by https://otter.ai with very light manual editing. It’s mostly correct but listening to the actual podcast would be wildly more understandable.
Rommil Santiago 0:00
From Experiment Nation, I’m Rommil Santiago and this is CR-No – a series that pulls back the curtain on the conversion rate optimization industry. Listen in as a panel of experienced CRO veterans talk about some of the joy and a lot of the pains of our industry.
Welcome. On today’s episode of CR-No, on our panel, we have Shiva Manjunath from Gartner, Kenya Davis from Evolytics, Eddie Aguilar from Circle Media Labs, and Siobhan Solberg from Raze. Today, the panel shares its thoughts about, quote unquote, best practices, and their general thoughts about how this industry is perceived, as well as explores how much they love testing button colors. I may or may not be kidding about that last one; you’ll just have to listen to find out. So with that said, I hope you enjoy the episode.
So hi, my name is Rommil. I’m from Experiment, I can’t even get the name right, Experiment Nation. And with me, I have a great group of CROs who are going to talk about a variety of topics. Welcome, and yeah, let’s go around and introduce ourselves. Let’s start with Siobhan. I just wanted to pronounce your name right.
Siobhan Solberg 1:30
So I’m Siobhan. I’m obsessed with optimizing pretty much everything in my life, from Ironman training to when is the best time to walk my dogs. So I started an optimization agency called Raze CRO. We focus mainly on optimizing customer journeys for ecommerce and SaaS, using our expertise in measurement and data, concepts in psychology, and forecasting.
Rommil Santiago 1:54
Very cool, Ironman, we might have to come back to that again later. That’s very interesting. Let’s go on to Eddie.
Eddie Aguilar 2:04
Hi, I’m Eddie. I’ve been optimizing for basically my whole life. And currently I’m trying to optimize my sleep habit with my newborns.
Shiva Manjunath 2:17
Might be some low traffic over there.
Rommil Santiago 2:22
Sample size might be questionable. Welcome Eddie. And let’s move on to Kenya.
Kenya Davis 2:30
Hi I’m Kenya. I’m a senior manager of decision science at Evolytics and we take data to another level with conversion optimization. So very heavy on the stat side, really exciting when it comes to pairing that up with business process. That is my niche.
Rommil Santiago 2:51
Nice. And I think we’ve spoken before Shiva, but let’s let’s hear from you.
Shiva Manjunath 2:55
Sure, yeah, I’m Shiva. I also share in kind of everyone’s mentality of optimizing not only websites but optimizing their lives. I definitely have an Airtable, and not only for my tests: I have an Airtable for finding apartments and an Airtable for finding the best ways to walk my dog. I’m not only optimizing websites, but optimizing my life. And I’m now a program manager at Gartner.
Rommil Santiago 3:19
I’m seeing a very interesting pattern with everyone. Everyone has a spreadsheet or Airtable, or they’re optimizing things. Before we jump into the questions, I’m wondering if your friends, family, or significant others kind of look at you funny as you try to optimize every single aspect of your life. Anyone?
Kenya Davis 3:42
Oh, dear God. We just bought a house. And I was like, alright, here’s 30 years. Here’s our plan. Here’s how we get there. Here’s how we can move faster. If this happens in life, then go to this sheet. And he’s just looking at me like, sorry, but what are we eating for dinner again?
Rommil Santiago 4:03
So how do you decide supper?
Kenya Davis 4:07
He has this way of flipping it. So I’ll say, you know, what do you want to eat? He’s like, what do you think that I want to eat? And I would say something like, cool, that’s what I was thinking. And it’s a way of flipping it. So that I decide. Because I am very indecisive with food.
Siobhan Solberg 4:24
Oh wow, I have a meal plan that I figure out on Sunday for the whole week. I shop for it. It’s all set. I have my training, my meal plan, my meal times all organized. And my partner is like, can we just do something different today?
Rommil Santiago 4:41
Spontaneous group, clearly.
Siobhan Solberg 4:43
I’m always spontaneous when it comes to travel. I would be gone tomorrow if it weren’t for COVID. I’d be in Venice today. That’s when I’m spontaneous.
Rommil Santiago 5:00
All right. Okay, so let’s start off with a little bit of a softball. It’s a topic that I’ve spoken to Shiva about recently: best practices. I wanted to hear your thoughts on best practices. Are they valuable to CROs, or are they not? Let’s start with Eddie.
Eddie Aguilar 5:23
Yeah, so my thinking around best practices is that they’re more like: these are the kinds of tests a lot of companies and a lot of optimizers have already run, many times before. And if you’re looking to get into optimization, these are certain locations you can start with. It’s not, oh, if you optimize for these specific areas, you’re going to have an immediate win. That’s not always the case, because obviously every website is different, and every business is different. Like I said, it’s more about giving you a guide of where you can possibly start. So I don’t take best practices to the extreme. They’re just there to help you get started.
Rommil Santiago 6:15
Shiva, I’d love to hear your opinion on what Eddie said.
Shiva Manjunath 6:19
Yeah, I mean, I agree 100% with what Eddie just said. I hate the term, because it’s right there in the verbiage: it’s “best” practice. Say, I don’t know why we keep coming back to walking your dog, but say it’s a best practice to walk your dog at seven in the morning because of its digestive tract. There’s a bit of science in that, but it’s not always going to be applicable to your dog. And in the same way with websites, it’s not always going to apply to you. It may work, you might test it and find stat sig, and that’s great. But it also may not actually work out for you. So, coming back to what Eddie said: it’s great for prioritization, it’s great to look at it and say, alright, other people are having success with this, let’s try it out. But don’t just do it. Definitely test it. Kenya, I don’t know if you have any thoughts related to that, too.
Kenya Davis 7:08
Oh, gosh, you guys are gonna hate me, I am on the opposite side.
I feel like there are best practices in different areas of optimization. I’m more of an order-of-operations person. So if you have these big plans, and you’re like, I want to do a redesign, or I’m trying to boost this new product, or I’m learning about my customer, within that there are these fundamental things you should start with. Throughout the process, it’s okay to get a little scrappy, but what I find is that if you start with best practices but don’t actually apply them throughout the entire journey, you end up with this Frankenstein of an experience at the end, where you’re not even sure what helped what or what hurt what. I don’t know if someone disagrees, but I’m definitely on the side of: each business should have its own best practices. There’s a general type of thing that we all kind of follow, but I think before you even begin, do the research on the business or unit you’re working with and figure out what is a best practice for you.
Shiva Manjunath 8:27
And I think maybe there’s a distinction between best practices regarding program management versus best practices where someone’s saying you should have a green CTA on your website. Are you referring more to the project management piece, or more towards the UI of the website?
Kenya Davis 8:43
I would say less of the UI side, more of the statistics side and the order of operations.
Shiva Manjunath 8:51
Yeah, I completely agree with that. For my spiel, I’m talking more about the UI side, where people say things like, oh, you should have the CTA on the right-hand side of the page, and you should always have a CTA that’s less than 20 characters, stuff like that. On the design front, just taking these best practices and applying them on your website isn’t always going to yield success.
Siobhan Solberg 9:18
Exactly. And I also feel, to what Kenya’s saying, there need to be best practices for implementation, for how you’re running a test. All of those obviously are required; I don’t think anyone’s going to argue that. But I also feel that best practices, yes, they’re guidelines, but the truth of the matter is, a lot of best practices are based on research, whether it’s psychology, statistics, whatever it might be. And if it’s based on research, then you can look at the research behind the best practice, or what triggered it initially, and see if it’s applicable to your business. So, you know, when we’ve done eye-tracking studies, we know that usually people will look at a site a certain way. That’s part of the reason people say a best practice is to put your value proposition on the top left side of the page. That’s something that’s backed by some research that was done, so I feel like that’s more acceptable to start from. But something like, oh, the button should be orange? I’m not so sure that’s research-backed, so I wouldn’t go with that one. I tell my clients very clearly from the beginning, and it’s something I put in every proposal: if you want to implement best practices, I’m not your person. Because that’s exactly how I am. I will look at the research, I will determine what the business needs, and then see if a best practice is applicable to them. Otherwise, forget it. But for beginners starting out, people who are bootstrapping and don’t have a CRO team, this is sometimes all they have. And it’s a great guideline, I think.
Rommil Santiago 11:00
Then maybe it’s the name, best practice. It almost feels like it shouldn’t be called best practice; maybe it should be called common practice, where it’s stuff that people have done and do commonly. But it shouldn’t be framed as, this is what you should do.
Shiva Manjunath 11:16
I agree. I just hate the verbiage of “best”; it implies something. And then when you consider there are a lot of best practices with project management and program management, there are things where those are truly best practices, things that have been proven time and time again to work. You can call those best practices. But a lot of times, people will call things like UI tweaks best practices. It’s a verbiage thing that really annoys me.
Eddie Aguilar 11:44
Eddie here. It also depends on, what’s your hypothesis in trying to run one of these best practices? Why are you running one of these, air quoting, best practices? Is it really going to affect you? There are reasons behind it. And I personally think it sometimes doesn’t end up going the way you thought it would, especially if you’re starting out, because the hypothesis just wasn’t sound. Like Shiva said earlier, you don’t want to just run a test for a button color when you don’t know if that’s going to affect you or not.
Siobhan Solberg 12:29
I love what Eddie’s saying there. He’s essentially asking, what if your hypothesis doesn’t call for that kind of a test? And isn’t that what it’s all about anyway? Isn’t that the best practice: to have a sound hypothesis, and then build your test off of that?
Shiva Manjunath 12:48
Yeah, I think that’s… sorry, I was just gonna say, hypotheses are just so important to CRO. It’s not, “I have an idea, I think it’ll look good if we do a green button,” or, I keep going back to button colors, “I think it’ll look good if we have an orange button.” That’s a test, but you have to take five steps back and start with the hypothesis: I think X because Y. And I think that’s where a lot of people kind of miss the boat on CRO. They just assume it’s, you know, “I think this will look good, I think this will work well.” Why? Challenge it. That’ll help you get to a solid test, a solid execution, and a solid test to learn from.
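(Editor’s note: when the panel talks about “finding stat sig,” they mean checking whether a difference in conversion rates between a control and a variant is statistically significant. As a rough, illustrative sketch only, with made-up visitor and conversion counts, here’s what the standard two-proportion z-test behind that check looks like:)

```python
# Illustrative two-proportion z-test for an A/B test, stdlib only.
# The traffic and conversion numbers below are invented for illustration.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) comparing conversion rates A vs B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 200 conversions / 10,000 visitors; variant: 250 / 10,000
z, p = two_proportion_z_test(conv_a=200, n_a=10_000, conv_b=250, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 is conventionally "stat sig"
```

A p-value below the usual 0.05 threshold is what practitioners mean by “stat sig,” though, as the panel notes, significance alone says nothing about whether the change is worth shipping.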
Rommil Santiago 13:30
It sounds like there are best practices in terms of the process, but not the idea, or the hypothesis.
Kenya Davis 13:36
Right, I was thinking of that, too. There’s best practice with our methods of getting to an answer; that probably shouldn’t shake as much. But then there’s your UI baseline, your business baseline: what are you doing, and in that field, that industry, what is an acceptable experience? That’s what you’re testing against. You are a different business, you have a very different objective and purpose for your customer, so your baseline is always going to change. But the practice in getting there won’t. You’re going to have a hypothesis, you’re going to have your KPIs, but those will change over time.
Shiva Manjunath 14:21
And they change on a test basis too, right? It’s not just thinking, I’m going to run this thing, and this is what I hope happens. It’s not always conversion rate. There are tests you run where you’re not looking to get conversions; you’re just looking to have people spend time on the site and come back. And there are other cases where you don’t care as much about conversion rate; maybe you’re focused more on AOV. So it not only changes from a business perspective, but even primary KPIs change on a test basis too.
Kenya Davis 14:46
I think that’s the thing about business, or just testing in general: units get really fixated on the KPIs that they believe define success, and it can define them for years. I think the re-evaluation of that definition of success, in what they do and in the behaviors, is the part we probably should start testing against. It’s very topical to consider how much COVID has shaken up every single business and how they’ve had to look at their data and the patterns of people. So, again, it goes back to that baseline changing. And you’re almost starting at square one.
Siobhan Solberg 15:30
Yeah, I mean, I think COVID especially brought that to the forefront, where everything that everyone has known, everything that has been considered a best practice, doesn’t necessarily work anymore. The journey is not the same, the situation isn’t the same, the level of awareness isn’t the same. There’s so much that is changing. And like Kenya was saying, the business’s requirements are also different, and what the business can provide is also changing. So I think something like COVID has really brought to the forefront that people can’t just copy competitors or work from best practices. They really need to start evaluating what they can provide and what the customer needs at that given time.
Shiva Manjunath 16:21
Yeah, let’s just do what Amazon’s doing, right? Amazon’s successful, let’s just copy them.
Rommil Santiago 16:29
From what I’m hearing, everyone agrees that we should all just be color testing, right? Is that what everyone’s saying?
Shiva Manjunath 16:35
That’s a best practice right there.
Rommil Santiago 16:41
I’m obviously just joking. And sorry, you don’t know me; I’m very sarcastic. So yeah, a lot of folks, when they hear CRO, conversion rate optimization, think the way it’s described is testing button colors or CTAs, rearranging locations of CTAs. I’m not gonna mince my words: that’s a misunderstanding, to put it lightly. How would you describe it to, let’s say, someone who’s unfamiliar with this industry? And what can we do as an industry to change this misconception? I’d like to hear from Eddie.
Eddie Aguilar 17:28
So I’ve described my position and my role at companies to my friends who are not very tech literate, or who are just not into tech like I am. They’re typically just amazed at what I’m doing, and then they start realizing that remarketing, and those types of marketing tactics, are what’s “listening” to them. And I start explaining to them that I’m really there to provide them the best experience possible, to be able to find what they’re looking for. That’s the simplest way I can put it for them without getting really technical. And then I obviously start explaining how it’s a process of science: you’re starting with your hypothesis, well, you’re actually starting with the problem, and then you come up with a hypothesis on how to solve for that problem. And afterwards, they understand what I’m doing, what I’m trying to accomplish, and how I accomplish it.
Shiva Manjunath 18:46
I agree with everything you’re saying, Eddie, but on the first point, you said that you start with a problem. I’d almost say you start with research. You start by looking at your landscape of data, what’s happening with your users. That’s the research; you start with that to identify problems on your site. And then you use experimentation to help try and solve for those potential problems, but you’re also using the experimentation to learn more about your users. A lot of times, people just assume it’s, let’s move this here and improve conversion rate. I mean, it’s in the name, right? Conversion rate optimization. I think a better title is probably something like experimentation manager. You’re using CRO as a tool to help you not only find ways to create a better experience, but to learn more and then create a better experience. I say this all the time: experimentation should be used to test to learn, not test to win. You use experimentation to learn about your users, and once you get enough learnings, you’ll be able to make decisions to win.
Siobhan Solberg 19:50
Yeah, so I’m one of the people who doesn’t actually think everything needs to be tested, but I do think everything needs to be researched. So we’re on the same page there. As an agency, you get clients, or potential clients, reaching out all the time saying, oh, I just want someone to help me find the best button color, for example. And it’s really a matter of education at that point. I really take the time to do this: I take the time to educate people who reach out to me for work, to let them understand what my goal is and ultimately what their goal should be for this program. And I think that’s really helped me move forward. I also don’t use the words conversion rate optimization. I agree with Shiva that it’s a bit of a misnomer. I tend to just say I optimize journeys, or customer journeys. Just using those words, I think, makes it a lot easier for people to understand what I’m doing. And if I have to give an elevator pitch to somebody, I would just say: you know when you click on an ad, and you go to the site, and you didn’t see at all what that ad said? I make that better. And then they’re like, oh, okay, so you’re going to actually show me what I clicked on? Yup. Essentially, that’s what it is. I mean, it’s simplifying it, but CRO is a big hype right now. It’s also a somewhat newer field, not new new, but new enough that people don’t really know what to expect. And I think a lot of my job as an optimizer is also educating people on what to expect, because there are a lot of CROs out there, or people who call themselves CROs, who really just do that: they do the best practices, the buttons, they make these promises that can’t necessarily be met.
And I think it’s our job to define what CRO really is, or what optimization is to our potential clients, to our friends to whoever wants to listen.
Shiva Manjunath 22:05
They’re ruining it for the rest of us.
Rommil Santiago 22:11
Kenya, I’d like to hear your perspective.
Kenya Davis 22:15
Yeah, I think the misconceptions come from the practices companies have followed up until now. The data world and the marketing world, the joining of those is, to me, fairly new. They’ve always existed, but they haven’t coexisted; they haven’t relied on one another to come up with these ideas to solve these problems. So the easiest way of digesting what we do is to say that we change button colors. When I talk to my dad, who is a vet and was an aircraft mechanic, well, to him, in his world, everything is very tangible and mechanical, and the order of operations is very straightforward. So when I describe what I do, I find it sounds very frightening, because I tell him that I manipulate your experience so that I can predict what you’re going to do. And he’s like, are you a secret agent, or some CIA type of thing? No, I just use the information that’s provided to understand: why do you do the things that you do, and how can I give you a better experience? If that’s through a different color or schema, sure, that’s one thing. But there’s so much more that comes in. I think it’s just a matter of how ready a unit is to really go that deep, because there’s a surface level, and then there’s a deeper layer as to what is a good experience.
Rommil Santiago 23:56
Let’s wrap this up with something from Eddie. I think you raised your hand.
Eddie Aguilar 24:00
Yeah, so I wanted to touch on something Siobhan said around the term conversion rate optimization. I’ve seen a lot of people optimizing just for conversion rate. And as Shiva said earlier, that’s not always the case; you’re not always optimizing for conversion rate. There are companies that prefer revenue over conversion rate, or prefer transactions over conversion rate, while conversion rate is probably a secondary KPI in those types of experiments. It’s one of those terms that just doesn’t really capture the whole essence of what we actually do as optimizers.
Rommil Santiago 24:51
There was actually something Shiva said that my head was kind of noodling on, where he said that the problem comes after the research. Now, I’ve been kind of debating this for a while, and I think that sometimes it’s the opposite. And that’s weird, right? But at least in the roles that I’ve had, it’s sometimes a situation where sales are down, or something like that, and the problem is handed to you. Then you start doing a heck of a lot of research to come up with a more refined problem. So I think either way it kind of works, but at the end of the day, you are doing research to identify strong problems.
Shiva Manjunath 25:34
That’s a good callout. I guess I didn’t mean that it’s exactly linear. It could be using research to find a problem, or starting from a problem and using research to refine it. It’s not that one is first and one is second; oftentimes, it’s both at the same time. You’re trying to refine it.
Eddie Aguilar 25:56
I always think you’re just researching all the time. It’s part of the whole process; you’re researching from beginning to end, even when the experiment is done. Like, I always find myself still questioning, okay, why are they doing this? And if an experiment I ran wasn’t a winner, per se, if it didn’t come out the way I wanted it to, I’m still researching in the end: why is that happening, or how did it happen? So I personally think research is a part of the whole process. It’s always there, no matter what.
Shiva Manjunath 26:39
Yeah. And I think, to that point, that’s why I say test to learn, not test to win. You have to take steps back and frame all your hypotheses in such a way that you’re learning within every experiment; you’re not just testing. What’s the value of learning with a CTA that’s green? Pretty minimal. Versus, what if you put a branded video above the fold and you see that 0.2% of people are clicking on the video? That’s something you can take away. So it’s about positioning your hypotheses in such a way that even if you hit a test loser, so to speak, if it doesn’t win in terms of conversion rate, engagement, whatever, you still learn something that you can iterate off of for the next version of the test.
Siobhan Solberg 27:21
Yeah, but I think it also comes to the point where, I mean, Shiva, you’re right: we all learn from a test, or hopefully we all learn from a test. But at the same time, it’s this chicken-and-egg thing, right? The problem is already there; we’re doing the research to discover what the problems are. And in an ideal world, we do the research and realize there is no problem, that the customer is completely happy on the site. That’s obviously never the case. But I’ve seen too often that people make problems. You know, I’m very data driven; I need to verify things within the data. I’m not someone who just goes on heuristics or whatever. And I have seen too many people falsely show data to make a problem. This is where I think the focus on the problem is too much. I always tell my clients: I’m only doing the research to learn how your users are acting, and then, how can I improve their experience? I try to take this word “problem” out of it. Because the moment I say you’ve got problems, number one, they get defensive. Number two, if they don’t have a problem, why do I have to make one? So yeah, we want to learn from everything we do: from the research, from the ideation, from the development, and then, after the test, what did we learn from the test? Did the test show us any additional insights? But at the same time, I try not to focus on this idea of a problem. Unless there’s a bug; then I call it a problem.
Rommil Santiago 29:04
So that’s a wrap of our first episode of CR-No. I hope you enjoyed the conversation and learned something new today. Some of the things I’m taking away from this are: best practices in CRO should only apply to approaches and workflows, and not necessarily to what you should test. To figure out what you should test, you should really have a strong hypothesis that supports it. The danger in copying others’ quote unquote best practices is that things change: the environment, goals, everything. And the goal of CRO is not to focus on best practices, but rather on learning, learning about your users so that you can make better decisions. So with that said, from Experiment Nation, I’m Rommil. Until next time.
(Transcribed by https://otter.ai)