Navigating Stakeholder Management and Gaining Buy-In with Shiva Manjunath

AI-Generated Summary

Experiment Nation's Ana Kerkovic chats with Shiva Manjunath about managing stakeholders, gaining buy-in, and graduating from spaghetti testing to strategic, research-backed experimentation.

https://youtu.be/xTAJfvrJIMk


AI-Generated Transcript

Shiva Manjunath 0:00
I want to give you the data to help grow your business, because this is your baby. And, like, CRO has a tendency to prove people wrong, and people don't like being proven wrong. But you have to approach it as, no, no, I'm doing this to make you look good. I'm trying to help you.

Rommil Santiago 0:24
My name is Rommil Santiago, and I'm the founder of Experiment Nation. On today's episode we have Shiva Manjunath. Shiva is the host of From A to B, a great CRO podcast that you definitely should check out. Shiva shares insights into how to manage stakeholders, getting more strategic with your optimization program, and what he likes to eat for breakfast. We hope you enjoy the episode.

Ana Kerkovic 0:45
So hello, everyone, and thank you for tuning in. Very happy to have Shiva Manjunath here. You are the host of the podcast From A to B, and by the way, I love the title, because I think there's a kind of trick it's playing on us. For people who are not familiar with the concept of A/B testing, it might sound like the path from A to B is very short, because it's not from A to Z. But people who are familiar with the concept know there's this giant pool of knowledge that goes into it. So I find that very interesting.

Shiva Manjunath 1:30
When I was looking up names for it, there were a couple of other, like, therapy podcasts that had this name. Right, like, improve yourself and whatever. And I'm just like, I'm just gonna make this my podcast name. And if I get copyright claimed, I'll know I've made it.

Ana Kerkovic 1:52
Yeah, you'll know you've made it. But yeah, optimize yourself and optimize your website at the same time. Exactly. Speaking of optimizing, I know that you are an experimenter by vocation and by choice. So you optimize everything from web pages to your morning coffee and your breakfast.

Shiva Manjunath 2:18
I'd say the choice is more in my inability to handle or deal with my ADHD. So yes, I am an optimizer by choice, choosing to optimize rather than not, but it's like a personality thing. I'm sure you feel the same way. It's inherent in us to just look at things and be like, I want it to be better. How do I make it better? Let's make it better. That's inherent to all optimizers.

Ana Kerkovic 2:50
Yes, yes. And experimenters. So the first question I have, and this also ties into therapy, I mean, the whole concept of optimizing, getting better, and improving yourself as well as web pages: I'm really interested in knowing what the optimized breakfast looks like for you.

Shiva Manjunath 3:12
It's not good, but it's a routine that I can keep on a daily basis. Every morning I have a protein bar, and then I have my pre-workout and go to the gym; the pre-workout is basically the same one or two drinks with a bunch of caffeine and creatine. I come home, usually to a bagel with something on it, and then protein powder from Huel, which is like a meal replacement. And then I've been doing a lot of meal prepping, so I can basically have the same meal every day and watch my calories. So it's not fun, but it works for me. But it does allow me to spoil myself when I go out with friends and break the routine, which makes going out feel a little extra special. Zero out of ten, wouldn't recommend it, but it makes my life easier.

Ana Kerkovic 4:09
That makes sense for you. I mean, it's fine. But before we get into the groove, I just want to introduce myself a little bit. For those who don't know, I'm Ana Kerkovic and I come from Serbia. I've been working in CRO for three years, but I have experience in the digital space and digital marketing for almost a decade. That makes me feel kind of old, but on the other hand, that's a decade of experience under my belt, which is great. One of my biggest interests in this field, and in general, is psychology, like behavioral psychology and consumer psychology, and I just love how those principles can be translated into the web space. I think you also mentioned that this interest is what drove you to this field. Is that right?

Shiva Manjunath 5:12
Yeah, psychology is one of the more fun things for me. People are weird; they're unique, and trying to learn exactly how they tick, what they like, what they don't like, is just interesting to me. If I wasn't a CRO person, I'd probably want to go into some form of psychology, because people are interesting and weird creatures. The glass-half-full version is that I like to manipulate people through websites.

Ana Kerkovic 5:42
Would you say manipulating? Or would you just say it's motivating? Or persuading?

Shiva Manjunath 5:51
Yeah, yeah, motivation, 100%. I help people do the things that they want to do. And I definitely don't use black hat tactics to convince people to do things. So it's...

Ana Kerkovic 6:04
Helping them do the things that they want to do, but helping them do those things faster, and more.

Shiva Manjunath 6:12
It's a joke, by the way. For people who don't know me, I'm not actually a sociopath.

Ana Kerkovic 6:17
Two optimizers walk into a bar... But let's get into it, and I'm really interested in hearing your expertise on this. I want to know more about how you strategize and how you move away from spaghetti testing, in terms of not just testing idea after idea, but rather having a more structured approach and methodology. Do you have a prioritization method or not? So tell me more about that.

Shiva Manjunath 6:57
So that's probably its own topic worth, like, many classes. But to summarize how I approach experimentation, moving away from spaghetti testing into more strategic testing: spaghetti testing is fine when you start. I had been doing spaghetti testing getting my program spun up, because it's just a way to move through the process and try out some things. I joined my job, what, seven months ago, and we just got our testing tool set up like three or four months ago. So if that's your mechanism to just try it and see what happens, if you're new to experimentation and you're just trying to see what sticks and move through the process, spaghetti testing has a place in basic testing, and that's fine. It's like water wings have their place when you're learning how to swim. But Michael Phelps wouldn't be using water wings competitively; they would slow him down. He's a lot better than that. You will graduate beyond that. So I've probably moved away from these hard stances of, these are the only ways to do testing. If spaghetti testing is your way to move into more strategic testing, because you are literally, in a meta way, experimenting with experimentation, that's fine. You're trying to figure it out. We have all failed at experimentation. We've run tests that we thought we QA'd well, and then we're like, oh, here's where we can fix our QA process; we'll iterate on that process and figure it out. That's just inherent to what we do. That's okay. But if, at some point, all you're doing is spaghetti testing for, like, years, you've probably hit a local maximum; you're probably spinning your tires in mud, and there are better ways to do it.

So a lot of what I think about with getting away from spaghetti testing, or I guess graduating, if we use the water wings example (my examples go totally crazy; most of them suck, but I'll give them; that one was a good one, I appreciate that one, good job), is that at some point you graduate toward better testing. And how to graduate into more strategic testing, for me, at its core, is just keeping research at the focal point of what you do. Spaghetti testing is: cool idea, let's try it; cool idea, let's try it; cool idea, let's try it. But when you run through research and you start identifying actual user problems... I created a pretty fun, pretty impactful framework, I think. Man, I'm really boasting myself up here with all the cool things that I've done, I promise. Rommil, definitely shout this out. But when we talk about problem-focused hypotheses, that, to me, is one way of leveling up away from spaghetti testing into strategic testing, because at its core you're solving problems that users face. And in order to identify those user problems, you have to conduct research. Spaghetti testing, a lot of times, is solution-focused: cool idea, try it, try it, try it, but most of the time not backed by research. And if it is backed by research, you're not ranking it against other ideas that might be backed by more research, or stronger forms of research. So it's a long-winded way of saying: to graduate from spaghetti testing into more strategic testing, I think using problems, identified through research, is the way to graduate, right?
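
Shiva's point about ranking ideas by the strength of the research behind them can be made concrete. Below is a minimal sketch of what an evidence-weighted prioritization pass might look like, in the spirit of ICE/PXL-style frameworks; the weights, field names, and example ideas are hypothetical illustrations, not anything from the episode.

```python
# A hypothetical evidence-weighted prioritization pass. Weights, fields,
# and example ideas are illustrative assumptions, not a real framework.

EVIDENCE_WEIGHTS = {
    "none": 0,           # "cool idea, let's try it"
    "heuristic": 1,      # expert review / best-practice hunch
    "analytics": 2,      # quantitative signal (drop-off, engagement)
    "user_research": 3,  # surveys, session replays, copy tests
}

def score(idea: dict) -> float:
    """Strongest evidence behind the idea x impact guess / effort guess."""
    strength = max((EVIDENCE_WEIGHTS[e] for e in idea["evidence"]), default=0)
    return strength * idea["impact"] / idea["effort"]

backlog = [
    {"name": "Clarify hero CTA copy", "evidence": ["analytics", "user_research"],
     "impact": 3, "effort": 1},
    {"name": "Change button color", "evidence": [], "impact": 1, "effort": 1},
]

# Ideas with no research backing sink to the bottom instead of being run first.
for idea in sorted(backlog, key=score, reverse=True):
    print(f"{score(idea):4.1f}  {idea['name']}")
```

The design choice mirrors the conversation: an idea with zero research behind it scores zero no matter how exciting it sounds, which is exactly the graduation away from spaghetti testing he describes.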

Rommil Santiago 10:50
This is Rommil Santiago from Experiment Nation. Every week we share interviews with, and conference sessions by, our favorite conversion rate optimizers from around the world. So if you liked this video, smash that like button and consider subscribing. It helps us a bunch. Now back to the episode.

Ana Kerkovic 11:04
Now, let's say that you have a pipeline, and you did your research: you went through your heuristic analysis, you looked at data, maybe you did some user surveys and whatnot. So you have an idea of what might be wrong with the website, you have some sort of prioritization, and you want to test a couple of hypotheses. And of course you have some tests in the pipeline. But let's say a couple of your tests finished running. How do you incorporate those insights into the test ideas that are waiting to be tested? How do you do that? What's the next step once you finish your first round of testing and you have some sort of knowledge?

Shiva Manjunath 11:53
Yeah, so I had Bob on my podcast to discuss prioritization frameworks, and that was a pretty fun conversation. That's a very not-subtle plug for my podcast, From A to B; find it wherever you find podcasts. But basically, the crux of that conversation is that prioritization frameworks are very good guiding principles, but they might not be robust enough to follow to a tee, especially if you're in a nascent program, or you're involved with people who don't care about experimentation. If you don't have a true culture of being data-driven, of experimentation, it can be hard. So if you're the guy screaming, prioritize everything, prioritize everything, you might be shouting into the wind, where no one listens to you. Versus if you're saying, hey, let's do our best to try and test as much as we possibly can, and validate as much as we can, that's a more reasonable take that more people will be inclined to go along with. And that's where you start to get people involved in experimentation.

Ana Kerkovic 12:58
So it's like you said a couple of times: approach the stakeholders, management, and your coworkers with the "I want to help" attitude.

Shiva Manjunath 13:10
Totally, yeah. I agree with those smart things that I said. I think that's just playing the politics game in CRO in order to get buy-in. If you're just a dude yelling at everyone, saying, why aren't we testing this, blah, blah, blah, why are we doing those things, people aren't gonna like that; they're gonna be annoyed that this one person just keeps on bugging them. If you're in a culture where, top down, it's instilled that everyone follows that same "of course we want to test it" mindset, it's different. But when you are, like, the one CRO, maybe you don't have executive sponsorship, or maybe you do but not everyone is bought in. Yeah, it can be frustrating. So if you're the guy that everyone looks to as, this person is helpful, this person's always willing to help validate, get me data, do this, they'll be more inclined to repay the favor with a test. And then what happens? You run that test, and it wins, and it makes them look good. That helps get people interested in experimentation. That's a Trojan horse. And I think that's where it matters, especially when we think about growing the experimentation program, if you're building it from scratch, or you're building it from, like, one to five tests, and you're trying to scale it like that.

Ana Kerkovic 14:27
So you need buy-in and you need trust. But yeah, you mentioned frustration, and it's definitely a lot to have on your plate if you're a one-man team carrying all of the processes on your back. So speaking of frustration, because it may come from all sorts of places you didn't expect, you can run into bugs or pushbacks or delays or whatever hindrances. How do you deal with frustration in your everyday work? When things like that happen, how do you stay on track and keep this educating perspective with your stakeholders or leadership? How do you reconcile these things?

Shiva Manjunath 15:26
How do I deal with frustration in CRO work? I go to the gym a lot. That is the easy answer. A more practical answer...

Ana Kerkovic 15:37
In the midst of it.

Shiva Manjunath 15:42
And it's a tough one, because I think it's very personality-driven. And I think that's where politics is so important: there's a lot of picking up on social cues and understanding people and getting to know them better. That helps. Because many people who are anti-CRO just don't know what it is, right? They see these best-practice listicles; they see, oh, we ran a button color test for this one company and it lifted conversion rate by 7,000%, and they're like, that's stupid, of course that didn't happen. And yeah, maybe one person did it, but people see this and they're like, yeah, of course this is stupid; I don't believe any of this. So there is that pervasive attitude. People read this, and you might be in an uphill battle fighting it, like, is this really something that's worthwhile doing? That's maybe personality type one: they don't understand it. It's not any type of maliciousness; they just don't understand. So if you're frustrated because you're having trouble communicating with them, that's a challenge, but understand that that's person one. Another personality type might be someone who's like, I know exactly what it is, and they might actually know what it is, but they're like, I have an MBA, I am smarter than you, I read a book one time on UX design, therefore I know all the things. And that's a different personality type. And I think it's about getting to understand where their adversity comes from, because most of the time it's not malicious; it's not anything against you; they just don't understand it, or they have their own incentives.

One of the things that was really eye-opening for me was working with product owners in what's called a fast-paced environment, which is, like, low-key toxic. Anytime you hear "fast-paced environment," red flags. Anytime you hear it, red flag, red flag. I went on Amazon and bought an actual red flag just for that reason. When you're working in that type of environment, sometimes you literally are getting this top-down push to say, we have to ship quickly, these are the deadlines. And sometimes the product owner is like, I really want to test this so bad, but I have to just ship it. I'm sorry. I care very much about what you want to do; I want to validate it; we're on the same team; I just can't do it, sorry. So sometimes that's the unifying moment for you and that product owner to be like, okay, well, maybe we can't test this right now, but let's figure out the testing strategy from here. And that's where it's, like, just be helpful; be that resource to help them. Because if instead all you're doing is yelling at them, like, oh, this sucks, this is doing terrible, and then they roll it out to 100% because they got pressure from above, and it totally tanks the site, and you're the fool in the corner going, I told you we should have tested it, they're not gonna be your friend; they're gonna hate you. You're also a tool. So, like, just be helpful. Just try and understand the personality traits, because that is helpful in getting buy-in. You might be objectively right in how you're talking about CRO, but being objectively right doesn't mean you're actually gonna get buy-in.

Ana Kerkovic 19:01
That's right. I think what helps is knowing that all of those things you're trying to implement or test on a website, like the biggest conversion factors, addressing trust or removing friction, also apply to the people above you, to leadership or your product owners. You also have to instill trust in them. You have to show them why they should trust you. You have to know how to talk to them. Like you said, there are different personas you need to figure out; you need to do some sort of research to figure out who you're talking to. And then just use it in the midst of it.

Shiva Manjunath 19:47
So we've circled back to the beginning of the conversation, where I optimize everything. This is a meta conversation around optimizing to get buy-in for optimization, right? It's doing your research on the user persona of the person you need buy-in from, 100%. Versus if you just say, I know what I'm doing, I'm just gonna roll this out to the site, I'm just going to tell this to the stakeholder. Yeah, you might be right, but that doesn't mean it's gonna work. Humans are weird.

Ana Kerkovic 20:22
Yep. They're not rational beings. They're rationalizing beings who tend to...

Shiva Manjunath 20:28
Oh, I love that.

Ana Kerkovic 20:32
There's this great book that I read. It's called I'm Afraid Debbie From Marketing Has Left for the Day, and it talks about changing human behavior. And it says that, basically, you do your thing and then you rationalize it, and after that you think that you were a rational human being, even though you made an emotional decision and then rationalized it.

Shiva Manjunath 20:59
But I think that's the crux of all this: for the most part, everyone has some rational way of coming to the conversation. As much as you might be like, this guy is kind of being a tool for not approving my test idea, or not running the test, or not doing this, there is, for the most part, probably a reason that they're doing it. And it might be a bad reason; it might be the "I'm smarter than you, I have a degree" reason, and I'd say that's a bad reason. But it might be the other one: I want to test it, we're on the same team here, I just can't, because it's above me. And that sucks.

Ana Kerkovic 21:38
Yes. And also, when they do have an idea, maybe they just know their website; maybe they did their research; they really want their website to succeed. So of course we also need to trust them as optimizers. There has to be some sort of partnership.

Shiva Manjunath 22:00
Well, also think about, like, CEOs, when you work with folks who maybe you'd consider traditional HiPPOs, the highest paid person's opinion. This is their blood, sweat, and tears going into the site, right? This is their baby. So if you're just, like, a lowly junior analyst coming in and saying (I'm just gonna keep on using this Mickey Mouse voice), the research tells you to do this, you've got to do blah, blah, blah, they're like, okay, whatever. This is my site; this is my company; this is my baby; you didn't spend a mortgage and your life savings investing in this company. Maybe it's not logical, but if you know they're coming from that kind of emotional state, you partner with them and you're like, bro, I get it. I want to make sure this works for you, too. I want to give you the data to help grow your business, because this is your baby. And, like, CRO has a tendency to prove people wrong, and people don't like being proven wrong. But you have to approach it as, no, no, I'm doing this to make you look good; I'm trying to help you. And I think that's where I keep on going back to: ultimately, the CRO's job and mentality, I think, should be, let me help you. If it's a test to validate, I'd love to help you. If it's a quick data pull, even if it's not my job, I'll quickly help you. That's fine; I'll do that for you.

Ana Kerkovic 23:25
Right. But also, everybody should strive for objectivity, and that is so hard; that is the hard part of it. But speaking of, again, we're still on the topic of client management, I'd like to know, because you once mentioned that you don't like the term CRO for talking about experimentation as a whole, since CRO focuses only on optimizing conversion rate, but at the end of the day we are testing to learn something, so it's more about experimentation. We're still circling around here, but how do you approach this? How do you get clients, or people working in this culture, to actually move from testing to experimentation, where you don't just work in a feature factory where, you know, quantity is the most important thing? You also want to go back to a test and retest it, to make sure it wasn't, you know, a false negative. You want to dig deeper. So what are your tools to do this, whether mental models or just basic, practical tools? How do you do this?

Shiva Manjunath 24:52
That's a tough question, because the short answer is: it depends. The longer answer, with some amount of nuance so I'm not totally dodging the question, is that it's probably about focusing on process, and the education around that process. If all you're doing is validating existing things that are built out, to be honest, you're in a better spot than many, and I'd say good for you for actually taking the opportunity to validate something. You'd be surprised how many companies don't validate; they're just feature-shipping factories that literally don't validate at all. So if you are testing something and you have that capability, expanding that into a test-to-learn mechanism is going to be easier than if you have no amount of validation in any way. And the same on the other side: if you're perpetually testing to learn, but you're not also incorporating experimentation into validation of, you know, site updates, dev releases, product changes, things like that... it's a Swiss Army knife. Actually, I don't want to say Swiss Army knife, because most Swiss Army knife tools suck; the stupid screwdriver doesn't fit anything, and the knife... It's like a more upgraded version of the Swiss Army knife, where it can actually do quite a lot. But I keep on going back to process, and education on the process side. And I guess teaching about process means using research to keep a cycle going: research, ideas, hypothesize, test, iterate, and keep it going over and over and over again. I think that's the most important part of... what was your original question again? Not scaling...

Ana Kerkovic 27:02
Moving from optimizing only for conversion into actually experimenting for knowledge.

Shiva Manjunath 27:11
Yeah. So growing from test-to-win into test-to-learn. I think it does also require a pretty intimate collaboration with UX, so that the tests you run generate insights, and research generates insights, and that then gives you test ideas. So I guess part one is to collaborate pretty closely with UX; those things help bridge the gap from purely test-to-win into test-to-learn. Test-to-win, to me, is pretty synonymous with just spaghetti testing: try a bunch of shit, see what works, and hopefully it wins; and if it doesn't win, we'll try the next thing and hope that wins. Versus test-to-learn fosters that "let's try it and see what happens" and then explores the why. Because digging deeper into understanding the why will ultimately get you toward much more winning than simply just trying things and seeing what's next.

Ana Kerkovic 28:11
Of course. It seems that CRO is looked upon as if it were a magic wand that just keeps, you know, improving conversions infinitely. But at some point you really have to start digging deeper, to actually know who you're talking to, at what point in time, and what your users like, what they don't like, and so on.

Shiva Manjunath 28:37
Yeah, I would say CRO should be a process. CRO shouldn't be a... like, what's the word? Focus on it being a process; focus on the process. If CRO is a list of things to do, then you're akin to this best-practice nonsense, and yeah, you might win sometimes, but it's not sustainable, whereas the process is sustainable. The process will work. The scientific method does work, which is why scientists use it: because it works. It's, like, explained biology well. Actually, that's a bad example.

Ana Kerkovic 29:22
Yeah. And of course, you have to keep translating that to everybody around you. So it's a process; let's keep testing to learn, and so on.

Shiva Manjunath 29:36
I think that's important, too. When someone comes to you with an idea, what I generally do is take two steps back and ask, what's the problem you're trying to solve here? Because many times it's, cool idea, I want to try it; cool idea, I want to try it. But take a step back and ask, what's the problem here? Because most of the time they don't have a problem statement; they're just like, I don't know, it looks fucking cool, let's try it. And in my prioritization framework, I very rarely prioritize or run tests that are backed by no data, that are just, hey, cool thing, let's try that. But if you take a step back and they say, oh, I like it because when I looked in the data I saw, I don't know, super high engagement with this part of the page, and this other company does a really cool version of that thing, so because this gets so much engagement I want to try it, then you're like, oh, great, you bridged the gap from problem statement into hypothesis into solution, and I'll bump that up in the prioritization. And I guess that's me thinking about CRO as that process. Otherwise it's just, again, back to the same point: yeah, try it, see what happens.
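
The "two steps back" move Shiva describes, problem first, evidence second, solution last, can be sketched as a record that refuses to exist without a problem statement and supporting data. A minimal sketch follows; the field names and example values are hypothetical, not from his actual framework.

```python
# A hypothetical problem-first hypothesis record: construction fails without
# a problem statement and evidence, which rules out pure "it looks cool,
# let's try it" tests. Field names and values are illustrative.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    problem: str             # what the research says is wrong
    evidence: list[str]      # data sources backing the problem
    change: str              # the proposed solution
    expected_outcome: str    # the measurable effect we predict

    def __post_init__(self) -> None:
        if not self.problem.strip() or not self.evidence:
            raise ValueError("No problem statement or evidence: that's spaghetti testing.")

h = Hypothesis(
    problem="Visitors can't say what 'Get Started' leads to",
    evidence=["copy test: answers not aligned", "analytics: high hero drop-off"],
    change="Replace 'Get Started' with a CTA naming the next step",
    expected_outcome="Higher hero CTA click-through",
)
```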

Ana Kerkovic 30:52
No, I like this approach. It's optimistic. Of course, you first have to, you know, go through trial and error, and then you get more mature, and then that translates into deeper stuff. I appreciate that. But tell me, what was the last aha moment that you had, in terms of whatever part of the process? What was something that made you realize you were not exactly on the right track?

Shiva Manjunath 31:29
Okay, well, this is some humble pie that I've eaten. I have dogged a lot on copy testing in my career. I don't want to say recently; I feel like in the past three or four years I've been very good about not doing this, but early in my career I used to just be like, who the fuck wants to test copy? Oh yeah, it says "free trial" versus "start a demo," so much better, it's a different voice, apparently. Like, I used to be so, copy testing does not matter. And two things totally opened my eyes. One, when I was working at Speero, using Wynter to copy test totally opened my eyes: oh, words matter. It absolutely does matter. And on my latest episode (this is a great plug, another plug, super organic) of From A to B, I talked to Eden about copywriting, and she's the absolute homie; she has so many great words about copy. I ran some homepage copy tests, and they totally stunned me with how much of an impact they can have. And, to give myself credit, I think some of the things I was seeing as copy testing were bad copy tests. To that point, it was, let's try random words, because why not. But when it's actually backed by research: users come to your site, and your CTA says, like, "Get Started," and you run a copy test on users and ask, hey, what do you expect to happen when you click Get Started? And no one has a, what's the word, an aligned answer. They're not totally aligned; everyone has a different answer. Then you're like, all right, this is unclear; let's try being more clear. And then you try different words, and you copy test that, and you see, yeah, that's more alignment, and then you run the A/B test, and then you see it wins. So, TL;DR, bad copy tests are still bad copy tests, but when they're backed by research, and when you're actually solving for problems, especially problems users have like lack of clarity, or a bunch of mumbo-jumbo jargon... One of the things that was pretty funny, actually: I'm going to turn this around and ask you a question, because it's a fun exercise. If I told you this headline on a website for a B2B company, okay: the name of the product category, then "The future is here."

Ana Kerkovic 34:10
Was that in the podcast? I would say it's some sort of cybersecurity or cloud storage, or something to do with technology.

Shiva Manjunath 34:26
Well, technology is probably correct. But I could even sense it reading this with you; you're like, I can't go any more specific than that, because I have no idea.

Ana Kerkovic 34:39
Yeah, it's not specific; it's not telling you anything. It's just fluff. You don't know what to do with it. So tell me where to go, tell me what to do next. I don't want to, you know, read stuff on your website that doesn't tell me anything. I know what you mean.

Shiva Manjunath 34:54
Exactly. So I think part of it is, like, yeah, perhaps you have super optimized copy on your site, and if you've tested it and things are clear, then perhaps copy testing isn't the way. But I will say there's a lot of bad copy out there, and, spicy take, with more generative AI coming out, I've seen copy noticeably get shittier and more buzzwordy. It's all the more reason that we need more actual human testing, and copy is going to be a differentiator in the future. Words matter and will continue to matter. So my aha moment is: copy matters, folks.
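
For the step where you "run the A/B test and see it wins," the usual arithmetic behind that claim is a two-proportion significance test on the control and variant conversion rates. Here is a minimal sketch using only the standard library; the visitor and conversion counts are made up for illustration.

```python
# A minimal two-proportion z-test for an A/B copy test.
# All visitor/conversion numbers below are made up for illustration.
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """z statistic and two-sided p-value for rate(B) vs rate(A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: "Get Started"  |  Variant: research-backed, clearer CTA copy
z, p = two_proportion_z(conv_a=310, n_a=10_000, conv_b=368, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # here p < 0.05, so unlikely to be noise
```

With these illustrative numbers the variant's 3.68% beats the control's 3.10% at p ≈ 0.02, which is the kind of result that separates "it looked better" from "it won."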

Ana Kerkovic 35:44
This has been so great, and it's giving me a lot of hope. It was a very... what's the word? I keep wanting to say optimizing, but I actually mean positive. Or optimistic. That's the word. It was an optimistic talk. So thank you so much for joining, and thanks.

Shiva Manjunath 36:14
Thank you for having me. And you did a great job for your first interview. Looking forward to listening to many more of those.

Ana Kerkovic 36:21
So yeah, Shiva, this has been great. Enjoy your day, and I'll talk to you soon.

Rommil Santiago 36:29
This is Rommil Santiago from Experiment Nation. Every week we share interviews with, and conference sessions by, our favorite conversion rate optimizers from around the world. So if you liked this video, smash that like button and consider subscribing. It helps us a bunch.

Shiva Manjunath 36:45
Errol Neal, hope you're having fun editing out this portion. As you're walking through this, keep going. Appreciate you. You're doing great.

If you liked this post, sign up for Experiment Nation's newsletter to receive more great interviews like this, memes, editorials, and conference sessions in your inbox: https://bit.ly/3HOKCTK
