Product Experimentation: How to get Product Managers to Experiment ft. Shagun Aulakh

AI-Generated Summary

How do you get Product Managers to adopt Experimentation? Shagun Aulakh, the Director of Product Management – Experimentation at American Express, shares her tips with Rommil.

AI-Generated Transcript

(00:00) for me experimentation is where you really don’t have any input um or there are really opposing ideas of what the best solution to a problem is and there’s not really any guideline there hi my name is Rommil Santiago and I’m the founder of Experiment Nation on today’s episode I have a great guest Shagun Aulakh the director of product

(00:22) management experimentation at a well-known credit card company today I pick Shagun’s brain about how she spreads an experimentation culture at a highly matrixed organization as well as how she tackles those who push back on testing and when one should and should not experiment we hope you enjoy the episode

(00:41) hi everyone uh my name is Rommil Santiago and I’m the founder of Experiment Nation and today we have a great guest she’s the director of product experimentation at a well-known credit card company Shagun Aulakh welcome to the show hi thank you thank you I’m a fan I’ve been following for a while so I’m excited to

(00:58) be here so before we go into that I’d love for you to introduce yourself to our audience yeah for sure so um well as you mentioned I’m Shagun um I am currently a uh director of product but um you know from a career perspective I started out in digital marketing um working mostly in retail and fashion and

(01:18) I often say like back when I was in marketing I had no idea what experimentation was um and I later found myself in an experimentation role kind of by chance um but since that time I’ve found that it has always been one of the most interesting challenging motivating subjects that I keep going back to so since then I’ve been just

(01:35) basically traversing experimentation across different companies being in house being on platform vendor sides um and then kind of navigating to where I am today at American Express um and from a personal perspective um I live in San Francisco California I’ve been here for 20 years um I’m part of a community

(01:53) dance company that you know focuses on Brazilian styles of music so I try to balance my life between you know work and experimentation and then have some other space to not think about it at all I’m fairly jealous right now because it’s it’s kind of cold here in Toronto Canada so I imagine it’s slightly warmer

(02:11) in California slightly but there’s a misperception because everyone thinks California is all like Los Angeles and San Francisco is actually unusually cold compared to um other places so isn’t there the fog that rolls in Karl yeah we call him Karl the Fog yeah right it rolls in usually we call it June

(02:34) Gloom so in June the fog kind of Sparks up and yeah you always need a sweater and a scarf um for most of the summer but it warms up later so well as long as you’re not shoveling I don’t think uh yeah nothing like Toronto or the Northeast we never have snow I never have to worry about deicing my my windows or anything very fortunate uh so

(02:55) yeah thank you for coming on the show today we’re going to talk about um experimentation maturity uh spreading that kind of culture across product and marketing um so I guess we could start with in your opinion uh what are some of the challenges that you’ve faced with product teams uh to help them embrace

(03:16) experimentation uh you know because product teams are always trying to build uh better products but they’re time pressed right so how do you get folks to even listen to you yeah um I mean well first I’ll say this a lot and I don’t think it’s the end-all-be-all but I think a big help is leadership

(03:37) advocacy um I think it’s critical that your leadership is advocating for experimentation setting expectations I think that just kind of helps to um gel teams and kind of keep them focused on okay I need to start paying attention to what you know Shagun is saying around experimentation

(03:57) that always helps it doesn’t always happen um so let’s just say that does not happen um a lot of it is kind of like being a salesman number one is you’re trying to influence and persuade people to do things or follow processes or directions that they don’t maybe know they even want or need or maybe they

(04:17) don’t really want it so a lot of it to me is um education and what I call like internal PR public relations so um I focus first on just kind of connecting with teams and understanding I always first come from tell me more about what your needs are what are your challenges what are your goals and what I’m trying

(04:36) to do is look for ways that I can talk about experimentation as a solution or a help to the challenges that they’re facing so um you know for example if they’re talking about I have to deliver and deploy certain features at particular you know fast intervals and I’m getting pressure to do that um I might try to

(04:55) talk about well you know rather than kind of wading through the mud and figuring it out you can use experimentation to have more clarity and get to the best solution faster um so I might talk about those things but I sort of focus on three main things you know outside of just the PR and the roadshows which is kind of the basic

(05:14) knowledge and understanding of what is experimentation um and like so for example today where I am each team has uh different maturity levels like some are highly independent they get it they want to run with it maybe they need a little bit of help and there’s other teams who are literally like I don’t

(05:31) even know what an AB test is let alone an AB test versus like a multivariate test and so you’re starting at like really 101 kind of um education um another thing that I’ve also observed in terms of that education is standardizing nomenclature um so for example I’ve been talking to some product teams who are like oh yeah

(05:51) we experiment and so then I asked them well what are you know can you tell me a little bit about some experiments you’ve run and what they end up doing is describing user testing um and saying yeah we focused on some you know small focus group and we showed some different experiences and got their feedback and I

(06:05) have to sort of re-educate them on that’s a type of testing but that’s not experimentation and why they are different and kind of what the purposes are you know that has validity and it has a role to play but it can’t necessarily replicate experimentation so I have to do a lot of that education um another big challenge I

(06:23) think and you kind of alluded to it when you said sort of pressed for time but we talk about it um where I am today in terms of capacity so one of the big things especially with product teams is um it often involves having a need to build code and do things on the server side and in the back

(06:44) end um because you’re dealing with features and functionality and so there’s a dependency on engineering teams and that is usually the biggest challenge which is we have limited capacity and I don’t think I’ve ever been at a company no matter how big where they’re like oh we have plenty of resources and plenty of time like it

(07:00) never ever happens so um it’s really about when it starts to affect that bandwidth and that capacity of do I put the capacity towards the experiment or do I put it against just delivering therein lies a conflict I think in terms of the challenge and then the third thing and it kind of maybe can

(07:18) relate to the capacity and the bandwidth is um a challenge of how do I incorporate experimentation into my development process so I think some product teams can struggle with knowing when do they run an experiment and how to manage and organize themselves to do it quickly and how do I build this into

(07:35) my process so it doesn’t feel like it’s this additive like task that I’m doing um and also kind of helping them to balance where experiments should be run to um inform as they go versus validating so a lot of teams May focus on the validation they’ve already built everything and now they’re just kind of

(08:01) looking at some metrics to make sure that they didn’t do something entirely wrong and so that kind of goes back to the education principle what I talked about in the beginning is trying to like Orient them on the ways that they can properly run experimentation and so that’s kind of how I combat some of the

(08:15) those challenges so you touch on a lot of things and um I guess based on the places I’ve worked you triggered me in the sense that all these past uh memories came rushing into my head like for instance when you talked about nomenclature I remember um without naming names obviously uh a very senior

(08:38) person went in front of the entire company talking about all the experiments they ran and they they defined a survey as an experiment right you know we we went we talked to people so we we ran hundreds of experiments I’m like you didn’t run an experiment you did research yeah right he did research great for you um I there I’d love to ask

(08:58) um you actually touched on a lot of things that I’m trying to decide where to start I guess the first place I’ll start is when you’re tackling a new org and they’re new to experimentation and you have outlined all these things that you do who do you pick where do you start first you know let’s say there are

(09:21) multiple lines of business multiple PMs under that yeah uh and you’re one person or if you’re really like a team and you’re trying to spread this uh culture who do you decide to start with where do you decide to start first yeah I mean that’s a really good question and um I don’t always have one answer because

(09:40) there’s always pros and cons to how you might go about where to start first but I think generally for me if I’m trying to really build a culture and sort of a an embedded practice I actually often start with what I call like the power experimenting group the the teams that are either already doing it and know how

(09:59) to do it if if there is any team there um that I can use as sort of role model examples um or let’s just say you know nobody’s really experimenting it’s a really brand new organization I will look for teams that I think are highly motivated to experiment because I think um it’s the fastest path to actually

(10:20) even show the teams that are either more resistant or a little bit harder to kind of get on board when you show them what great looks like and you kind of show some of the great wins and learnings and how it’s shifting other teams it starts to create that motivation for um teams to want to to to participate it also gets you know a big

(10:41) piece of it is politics and getting in front of leadership so if you can get some of those quick wins faster um it starts to create a bit more momentum so that’s typically what I do so you know if I’m starting let’s say day one in an organization what I’ll do is I’ll try to meet all of the different teams and I’ll

(10:59) sort of question them on you know again what I’m trying to get at is what is their mindset um around experimentation and how much of a boulder am I going to be pushing uphill um and you know there’s lots of reasons why that uphill situation is happening but are they going to be pushing the

(11:18) boulder with me and like on the same team or are they going to be the one kind of like rolling it back down the hill for me so try to look for Advocates and people that would be Champions um so that’s kind of how I often start on the other hand on the other side of it so you’re looking for Champions um on the

(11:33) flip side there’s people who are very resistant um those that think it’s a waste of time I have to get things out yeah is the only way to get them on board to use champions or do you have other ways to uh get them on your side um I think so I’m trying to think if I’ve ever encountered a team or a person that’s

(11:58) literally been like I think experimentation is a total waste of time and I would say and maybe I’m fortunate I haven’t had anyone like literally say that in those words what I often find is that you know product managers and even developers like they want to do experimentation and especially as it’s

(12:16) like grown in the industry it’s more you know well known than it used to be there is like a desire to um but the challenge is more about

(12:35) the prioritization of the work they have um and I notice it’s more in like less mature teams but it’s kind of like I’d love to but and that’s kind of where I see the resistance not so much that it’s a waste of time I guess it’s

(12:54) more like do I prioritize this or do I prioritize that and again when I notice that when I see that more I do see it coming from pressure from an outside expectation you know my senior leaders my counterparts are all telling me I have to deliver these features in these products

(13:13) by like certain dates and I only have a certain amount of sprints I have to get things done and so I think it’s like a lot of that pressure of prioritizing time um and sometimes perceiving experimentation and product delivery as kind of mutually exclusive concepts it’s either I do experiments or I deliver a feature rather than thinking

(13:32) about experimentation as a method and a part of the process of delivering a feature so um I think how I sort of approach those challenges and I mentioned it kind of even in the beginning which is the education and just sort of teaching them the best practices around you know even what the flywheel is and the steps of

(13:52) experimentation but I kind of think about like working with them on their process because if it’s an issue about like capacity and stacking it against all the other product work they have um what I like to do is kind of first look at what is your current process of delivery today you know when and you

(14:10) know and what kind of time intervals or program increments are you um building your backlog are you doing Sprint planning when you’re actually then focused on the build and then when things get delivered and I’m trying to see where I can advise how and where experimentation can fit in a little bit more organically into the process versus

(14:29) saying you have to change all of that around because that’s a lot harder for somebody to take and want to move forward on versus you saying okay let me see where I can add little tweaks and you know what actually um you can just sort of maybe if you can add a Sprint to just focus on experimentation ideation

(14:45) and like reserve capacity for your engineers to do experimentation so it’s not like I oh I already allocated all my capacity to these other things so now there’s no room for experimentation and that’s no easy task it it takes a lot of time and like I said I’ve I’ve been dealing with product teams for almost a

(15:04) year almost two years and it’s still a challenge of figuring out where and how and I think it ends up not just even being a singular team you have to think about it from you know I think American Express is highly matrixed it is obviously a huge company so it’s not even just dealing with product teams

(15:23) it’s dealing with like product operations teams separately who set certain processes and expectations it’s dealing with budgeting teams who are allocating um that budget to different product teams depending on what they have planned so it’s a little bit of like negotiating with all of these folks and

(15:40) kind of talking about why it’s super critical um but I do think a big part I mean just in terms of getting motivation is that idea of democratizing insights and showing how other teams are using experimentation to make better decisions um because I do believe that there is a truth to like healthy competition and

(16:00) kind of especially if it’s a new team who doesn’t know how to be able to show them a little bit of a blueprint of like this is how it can be done and think of all the cool things you can be able to talk to your leadership about and rather than just saying I delivered being able to say I’m

(16:16) about to deliver this product that I can already with some certainty predict a value out of so I know I’m going in the right direction um we were just having a conversation at another meeting and we were talking about like there’s an SVP who needs more validation like we’re making big changes and they

(16:35) don’t feel comfortable about maybe some of the changes that we could be making are we thinking about all of the you know what is the drop off here and what’s the uplifts here and you know experimentation is a really great way to be able to articulate why you’re making the changes you’re making and actually kind of

(16:53) reducing fear from other stakeholders and leaders so I think when you can share some of those insights and kind of show what another team has been doing I think that helps them put it into more of a reality versus this conceptual idea of experimentation from a tactical perspective um I usually see that teams

(17:14) are more ready um to adopt a validation approach where okay we’ve built the widget now we’re going to AB test the widget yeah and that concept I find teams can adopt pretty readily now obviously that’s at the tail end of several Sprints of work and they could be building the wrong thing entirely yeah do you have any advice or any

(17:40) thoughts around how do you pull that more forward into the development cycle I know you mentioned that you saw the cycle and you highlight areas that you can work on but how do you do that systematically like you know you have to bring it forward I was wondering if you had any advice there I mean I haven’t

(18:00) quite cracked the code myself to be honest it’s something I’m still working through but um what we’ve been talking about is how do we actually quantify it there was some industry statistic I don’t know if I read it or somebody said it so I can’t quote what the source is but I have heard I

(18:17) guess through the grapevine that there was some statistic of like out of 10 um features that get delivered um without experimentation backing them eight of them fail so like 80% of your features will fail if you’re not backing them through experimentation and so I wanted to take that kind of concept from an industry

(18:36) stats perspective and I’ve been talking with the team of like how can we actually go back and do some retrospective of the things that we’ve delivered without experimentation and have they actually resulted in what teams expected and be able to kind of show is this sort of 80% metric actually true here because then I can go back and

(18:54) say look this is why you don’t want to only do it for validation because at that point think about how much cost you’ve already put in you know be it headcount and bandwidth and all of the things and you’ve delivered it and so if it doesn’t work now what do you do do you like pull the thing and scrap it or now think

(19:13) about the long road to try to figure out how to fix it and you don’t even know where to start because you may have like built this whole big thing um and so I was in a conversation once um where it was similar and and we were getting push back actually from the developer about running an experiment and um and he had

(19:29) a super valid point he was like why are we running the experiment when we already built all of the code we went through months of work and now we experiment for what and that actually goes to a certain point of where I also encourage teams to you you don’t experiment all the time everywhere for everything because there

(19:49) are points where it may not be worth the cost of what it will take to run the experiment I want to jump into that one that’s something that’s definitely been on my mind because you hear in the industry test everything and that’s usually the CRO conversion rate optimization type folks and then the

(20:06) product managers are like oh hell no we can’t test everything especially developers say the same thing I was wondering how do you differentiate between things that you should test and when you shouldn’t test um yeah yeah I think um so the simple thing that I first ask is if you experiment on this and let’s say whatever you’re

(20:26) thinking of building and running as your variant doesn’t perform the way you expect are you implementing it anyway and you’d be surprised that oftentimes teams will say well yeah because again I already had it in the roadmap it already got signed off it’s actually uh a push from um I don’t know some strategy team that we have to get

(20:46) this out so in that case I would say well yeah don’t test this you’re already building it maybe you can use some other ways like analytics um again user testing or uh heat mapping or whatever different analytic um inputs and you do that but then what you could do is experiment and again this is a little bit of the

(21:06) validation what we were just talking about so um you wouldn’t even experiment to validate you would just hopefully use whatever insights you can to feel some confidence that what you’re doing is the right thing and then what you would be doing is building a plan for okay as soon as this thing launches what is my

(21:19) experimentation roadmap look like then what learnings do I need to figure out what um interaction effects of the changes I made do I need to understand and then you have an experimentation plan after that so that’s kind of like one simple question that would help direct me in whether or not you can run

(21:35) the experiment another thing is you know it kind of goes back to when I said you can look at other analytics like can you get at it another way because to me experimentation is about confidence right it’s like are the changes I’m going to make um you know it’s about certainty and predictability but sometimes usually

(21:51) when you have a really strong analytics team you can use some analytical rigor other ways to gain confidence around a decision um and so the way I kind of focus it for me experimentation is where you really don’t have any input um or there are really opposing ideas of what the best solution to a problem is and there’s

(22:13) not really any guideline there so those to me are the prime examples of where you would run an experiment so you’re prioritizing things that you’re fairly uncertain about yeah exactly because that’s where I think you know you’re really going to gain insights to lead you in the

(22:28) right direction so I think you know we talk a lot about it in terms of um you know that kind of concept and I think where you hear a lot about experiment on everything everywhere is like you mentioned cro and where it’s more like optimization so you’ve been running things and you just want to kind of check and tweak things along the way

(22:48) and I think to me I think of it as a stock portfolio it’s like you don’t want to just put all your eggs in like just small iterative changes and you don’t want to just do huge giant overhaul changes like there’s a mixture of what’s appropriate for every organization and it’s kind of just thinking through where

(23:04) um where is experimentation going to help me be more informed to make a better decision um I know it’s kind of like a vague statement but that’s kind of where I I try to approach it from that perspective I also think with product teams like like I said it’s a little bit of politicking I don’t want to be like you got to experiment

(23:22) everywhere all the time for everything because it’s immediately going to be like a door shut in my face because that seems too overwhelming and it’s too much of a shift from what they’re doing today so to me it’s like if I do believe that there is a place for that I gotta evolve them there I can’t get them there like

(23:38) straight away so like the other thing that I always run into is I have to be mindful if the test goes poorly or I just made this person run the test and the test runs poorly this might make the person look really bad and then they’re not going to want to do it again yeah um so yeah just I can I can empathize there

(23:58) well I think even well my question actually to you is like when you say if a test goes poorly it can make them look bad how are you defining a poor test yeah I set myself up for that so you’re supposed to look at learnings and stuff but let’s say you’re looking at the validation side of it

(24:12) like you haven’t gotten into testing the riskiest things um a lot of research has been put into a particular feature they spent time on it and then you’re like you should know how it performs you should see how it impacts the business before you roll it out yeah release it and they find

(24:27) out oh well it hurts business by 5% okay so what do we do now yeah it doesn’t feel good what I say is like wouldn’t it be worse had you rolled it out and it hurt business at least now you know a little bit of what to prepare yourself for versus I think logically that makes 100% a lot

(24:46) of sense but um it goes back to the first thing you said when you don’t have senior leadership uh bought in that gets really tricky because if they’re not bought in they’re like just ship the thing yeah um we’re good ignorance is bliss so you know but like it’s it’s hard as data Centric people like really

(25:05) you’re gonna do something like that it is um so we talked about convincing people talked about um people who are resistant to change and education all that now let’s say you’ve been um spreading this culture for a while when do you know you’ve done the job uh because you could do this for

(25:28) a year or six months or five years what have you when is it done or how do you even measure that you’re there yeah I guess I’ve never yet been at a place where I feel like I’m there so I feel like and for me I think again everybody’s going to be different every organization is different so it’s going

(25:47) to the answer isn’t as straightforward um for me like I said it depends on the organization its needs its expectations um and if you are the owner of experimentation or the one who’s driving that um I think it you have to first Define what success looks like because again that it’s it’s different and I’ll

(26:09) go into a little bit of like the way I Define it but I think one sort of preface I’d like to make is that um I think people in general like to look for examples of what they need to be right you know like for example we’re and I’m guilty of it like I’ll look at companies like Netflix like Google Facebook Amazon and I think that’s a

(26:32) highly mature company that’s a successful experimentation driven company I need to be like that but the truth of the matter is that most companies I’m going to argue are not like an Amazon or even like a Booking.com um often they don’t have the the same kind of resources they don’t have the same culture they don’t have

(26:53) the same budget um and so it’s really hard to look at that and say that’s what a mature team looks like it definitely is I think they’re like the best of the best and there’s ways that you can be inspired by them but I think that’s why I go back to like you as an owner need to really think about your own company

(27:07) so you know for example in financial services and this was a big like aha for me um because you know having worked in retail and worked in health and wellness like there’s a lot more flexibility there in financial services it is highly highly regulated and a lot of what teams are doing is purely based on compliance and

(27:24) regulatory laws not even because it’s the best thing to drive revenue and the metrics it’s just that it’s a regulated space right so I had to kind of reset my expectations of what success looks like in a financial services company versus what I’ve experienced at other companies that maybe

(27:44) don’t have those same challenges um another thing that I’ll also kind of plug is that there’s a lot of companies um from like consulting companies to vendor platforms like Optimizely um Speero is really great where they actually have maturity assessments you can take that help you uh define for yourself a little

(28:03) bit around like what are some of the parameters of what mature could look like and where am I stacking up so you know if you’re kind of thinking about where to start um I think they’re really great resources to just kind of Benchmark um yourself against but for me I look at a couple of different factors

(28:21) um as I mentioned it’s like clearly defining success metrics for your program um so for example what I might do is I try to map out kind of like a like a Five-Year Plan of in year One what are the metrics I need to look at to think about like starter level success and then in year two how do I want those metrics to evolve and all the

(28:44) way up to like year five and I say and five years is sort of arbitrary for me it’s just kind of like a way that I can look at like a longer window without going so far out that it’s complete you know speculation at that point and so so you know for example with a brand new team I think the most commonly used

(28:59) metric is velocity um and that makes sense because you know you’re just trying to get experiments out the door and I think velocity to me is a bit of a measurement of habit are we in the habit of at least running experiments whether or not the experiments are driving insights whether they’re having any

(29:14) impact on the business whatever I just want more of that happening so that it’s becoming a little bit more of a practice within the teams um but as you start to like use that velocity to learn insights of like oh here’s what works and what doesn’t and maybe often that velocity is driven out of like small UI changes and

(29:34) you know easy stuff that you can quickly move out um fast but as you get more mature I might start to plan in the two to three to four year marks things like win rates how often are the experiments that I’m running um actually winning and again it’s not to say that all your experiments should win because we all

(29:52) talk about it like failure is a is a you know a good thing sometimes in experiments but it’s more like are we using the insights of an experiment and are we using learnings to drive the next insights for experiments so you’re actually leaning into the things that work and finding more wins that way so

(30:06) win rates might be another metric I’m looking at um how many variations per experiment are we running so are we uh trying more things are we opening ourselves up to more possibilities um there’s other things even like we talk a lot today about time to market so it’s like speed of execution are we able to execute tests

(30:26) faster um one example it was sort of crazy and insane but back when I was at a retailer it was taking like 22 weeks from an idea to when it launched and that sounds insane um but it was because of the process it wasn’t actually the building and stuff it was because it was based on a seasonal calendar uh process

(30:47) that was just the way things were always done at that retail company um and we were like 22 weeks like what we would find is we’d have all these ideas and by the time it gets to all right now the season’s coming we’d be like these are all obsolete and old ideas and we would end up not running anything

(31:04) because we had to rethink it so there was a lot of work in terms of how we evolved but as we became more mature we reduced the 22 weeks to for complex tests it was around four weeks for like simple UI changes obviously it’s a matter of hours or days um so we went and had that Evolution but we were that

(31:20) was like one metric we were looking at um in terms of as we got more mature and then I would say the other kind of metrics that I think about are again if you’re in a team that is supporting lots of different teams like in a product organization it’s I look at what is sort of the distribution or spread of

(31:37) experimentation across teams so I might have a really high velocity but if it’s coming from one team am I really mature as an organization so I’m trying to look at are more teams experimenting and kind of onboarding with it and then slowly over time for each of those teams is their individual velocity starting

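As a rough illustration of the program metrics described above (velocity, win rate, variations per experiment, time to market, and the spread of experimentation across teams), here is a minimal sketch of how they could be computed from a simple experiment log. It is not from the episode or from any specific tool; the Experiment record, its fields, and the sample data are all hypothetical.

```python
from dataclasses import dataclass
from datetime import date
from collections import defaultdict

@dataclass
class Experiment:
    team: str          # owning product team (hypothetical field)
    idea_date: date    # when the idea was logged
    launch_date: date  # when the test went live
    variations: int    # number of variants tested against control
    won: bool          # did a variant beat control on the primary metric?

def program_metrics(experiments, months):
    """Summarize program-level maturity metrics over a window of `months`."""
    by_team = defaultdict(list)
    for e in experiments:
        by_team[e.team].append(e)

    total = len(experiments)
    return {
        # velocity: experiments launched per month across the whole program
        "velocity_per_month": total / months,
        # win rate: share of experiments where a variant beat control
        "win_rate": sum(e.won for e in experiments) / total,
        # average number of variations per experiment
        "avg_variations": sum(e.variations for e in experiments) / total,
        # time to market: average days from idea to launch
        "avg_days_idea_to_launch": sum(
            (e.launch_date - e.idea_date).days for e in experiments
        ) / total,
        # spread: how many teams are experimenting, and each team's own velocity
        "teams_experimenting": len(by_team),
        "velocity_by_team": {t: len(es) / months for t, es in by_team.items()},
    }

# Hypothetical usage over a four-month window
log = [
    Experiment("checkout", date(2024, 1, 2), date(2024, 1, 30), 2, True),
    Experiment("checkout", date(2024, 2, 5), date(2024, 3, 4), 1, False),
    Experiment("onboarding", date(2024, 1, 15), date(2024, 4, 20), 3, True),
]
print(program_metrics(log, months=4))
```

Tracking velocity per team rather than only in aggregate is what surfaces the situation she describes, where one team accounts for most of the volume while the rest of the organization has not really adopted the practice.
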
(31:55) do you break that down by lines of business or teams because let’s say you know people who are doing the front end and stuff like the the button changes those are very those are very quick but like backend changes Ledger changes stuff like that that gets a little bit you know more more complicated regulated

(32:11) yeah um do you set the same bar for both or do you just kind of have a feel of what the entire portfolio is doing um yeah I don’t think that they should be approached the same because to your point um the more complex the more back-end systems and the more you know engineering work that’s

(32:28) required you just simply can’t move as fast so I’m not going to compare it to somebody who’s doing like you know just uh layout changes and simple like color imagery changes um so I generally what I’ll do is and it’s kind of not the greatest science but I first just take an average of like where are we today

(32:45) the average team is running let’s say one test a month and I do come up with sort of arbitrary goals like if you’re at one like it’s singular I’m like let’s just double it let’s just get you to two so I’m just kind of making some arbitrary goal to kind of get them to but I won’t

(32:59) necessarily compare them against each other and say like team one you are only running 10 tests and team two you’re running a 100 so team one you’re not as good at it would be more about like I kind of take it as like each individual team and treating them a little bit like maybe it’s like grouping we actually

(33:17) been talking about this too which is um you know my team we have a Coe and we don’t actually execute the test we’re just supporting the product team to do that and so um we talk about and we have like a lot of product teams we’re supporting and we’re a pretty tiny team so um we’re talking about like sort of

(33:34) segmenting what we call our users which is like the internal product teams and saying like who are like product teams um that have similar maybe similar goals similar um organizational kind of setup maybe they have a full scrum team ready and then there’s other teams that um don’t have technical teams and there’s

(33:53) um maybe again they’re less so we’re trying to segment these users and kind of create goals Associated to that segment um just like you know I think if you’re in marketing you don’t have maybe the same expectations of a you know tenured customer who’s been with you for 10 years versus a prospect right like

(34:11) there’s different goalposts you have for them so I kind of approach it the same way so actually the thing that you mentioned where you set a goal for a team to go from one to two versus uh another team I’m very curious and you don’t have to tell me if you can’t but what kind of authority does your

(34:34) team have to make a team test because so I’m very invested in this question because in my last role our setup was very similar to yours the Coe didn’t run the tests we helped everyone encouraging and all that jazz uh what we didn’t have was authority yeah where we’re like you really ought to test a

(34:54) thing and the PM was able to go no we don’t want to and then we had KPIs and it’s like well no one wants to help us with this so it made things a little bit challenging I was very curious on um the authority that you have or don’t have uh and how do you manage that yeah I feel like there should be a therapy support

(35:16) group for folks like us who fit in this situation because there are days where it’s a struggle so I will say you know we don’t have explicit authority of like Shagun’s team is the one directing the goals and you have to follow the goals she’s setting or the recommendations that she has like we don’t have that and

(35:33) it’s not I think it’s rare that you can get that um I’ve been lucky in some circumstances that I have had that Authority and obviously so you’re saying at at the job interview to demand that you have some Authority is that is that what you’re saying if I could go back in time I would that would be one of the

(35:52) first questions I ask is what kind of explicit Authority does my team have yeah um and are we able and if there is none what is the likelihood of getting that Authority and so again for me I I’m constantly working back to leadership and and it’s not just one leader I’m dealing with like 20 different leaders

(36:12) who all have different perspectives and I’m trying to just find consensus um I actually sometimes waffle even should we be the ones really dictating goals like what kind of velocity because I don’t live and breathe your day-to-day and um you’re obviously accountable for your own business so it is kind of like

(36:32) some outside person just coming up with these arbitrary goals sometimes that’s what it may feel like to them um but I do think that there’s importance to it and this is actually something I’m going to be talking about at an upcoming conference which is Coes in general um and they’re great in some aspects but if

(36:49) you’re not given explicit authority and I’ve seen this at so many different companies where the Coe sometimes ends up being like um I don’t know what you want to call it a duck yeah like you’re just sitting around like hey you should probably do this no okay next team hey you should probably do this and there’s not

(37:09) really a like there’s not a secure place with where they sit and what they should be accountable for driving right so yeah um I’ll go off on a little bit of a tangent but I think what I try to do is just highlight that the Coe team is specialized experts like we’ve been doing this for a really long time like

(37:27) your job is product your job is to deliver the features and you should not necessarily have to think about all the different experimentation nuances let us do that work for you so that it actually makes it easier for you to run an experiment um so I think right now like the goals that we try to they’re not

(37:44) goals we are setting it is goals we’re suggesting and trying to give them some boundaries to work within um and again it works for some teams it doesn’t work for other teams um and sometimes it may not be like a velocity goal and maybe I’m not going to give them the goal and velocity what I might try to do is just encourage that

(38:02) whatever number of tests they’re doing um is it actually helping them make smarter decisions about the the features that they’re shipping um so and like are am I helping them evolve in terms of complexity that’s like another Benchmark I think about in terms of maturity you know we talked about like simple button

(38:22) color changes those are great but are we doing more kind of Journey testing or like what I call like multi-threaded experiences or we’re testing um you know cross device experiences things like that are we trying to move more in that direction so we’re driving bigger insights and so it’s more of like today

(38:39) it’s a little bit more like how can I be inspirational and eventually as we gain credibility that authority comes as you build the credibility this is a bit of a change of topic here but you mentioned uh previously that um you’re part of a Coe and part of your job is to educate at least that’s

(39:00) one of the things that you look at is to educate get them to a certain level of familiarity with the nomenclature and the concepts of experimentation how much education do you how much do you want PMs to know versus what the Coe knows because from my experience um I feel that when I tried

(39:20) to educate uh and they’re very smart people but it doesn’t stick mhm uh and there’s turnover and and then you have to train them again so I was wondering what level of education do you seek and how do you maintain that level of Education yeah I mean I come from because and I think again it’s because

(39:41) we have a very small Coe supporting like over 50 different product teams so I can’t gatekeep the knowledge you know just for even for just solving some of the challenges of like turnover like what’s the point if people are moving in and out and all that but if we do do that we are buried right we’re going to

(39:59) be running around from team to team it’s just really not scalable so I do have a perspective um of like train the trainer well not train the trainer but like uh teaching them to fish right like I need them to so I try to teach them everything I know because while I know it won’t stick hopefully some percentage

(40:18) of it will stick and it’ll just make it a little bit easier um but one of the things that we do is um usually kind of it’s like with more newer beginner teams who’re kind of like we don’t know where to start we’ll do a lot of like handholding one-on-one sessions of education and a lot of it is more centered around how do I just get the

(40:36) test out the door um so it’s a little bit of like technical hands-on-keyboard how to use the platform things like that and all of that kind of stuff and then the team will try to kind of work in principles and strategies um but what we actually are trying to do today is we’re building um

(40:53) kind of like a Wikipedia um which is basically a knowledge hub and so what we’re trying to do is create some self-service ability for um the most common questions scenarios situations documentation we’re trying to put it into a centralized place so that when they come to us with certain needs we can just say hey you

(41:12) can self-learn through this forum and where it comes to really customized discussions that really require some sit-down brain power then we’ll engage in those discussions and so that’s another way we’re thinking of to keep training and knowledge going um and then for

(41:29) example product team A has a new person who joins they can think about using this knowledge hub as a way to kind of get that team member onboarded we want to evolve it more even to like a certification program for experimentation you kind of go through actually a formal training that can be self-directed um and

(41:48) then if they go through it then it’s kind of like you’re blessed to go and run experiments with less oversight who touches your um your tooling do you trust them or is it the center of excellence that touches it um no it’s the teams doing it the product teams and uh developers are

(42:08) in the tool and running and setting up tests um because again one of the things like I found and this is kind of going back in my past lives of um retail and we were kind of a centralized team and then we became a Coe but still very centralized so basically all the experiments went through us we were the

(42:25) only ones in the tool and we would kind of run everything and even report out on results and it worked really well um but then it worked so well that we actually had a great experience where all these teams wanted to experiment and we became a bottleneck and we couldn’t we just couldn’t run it the same way so we had

(42:39) to start Distributing and decentralizing certain teams and that’s where again you kind of went through a process of we’re teaching you enough where we know that we’re confident that you can touch the tool and you can do that and you would have checkpoints with us and that was kind of like the way that it should be

(42:55) I think um when I joined where I am today some of that ship had already sailed like teams are already in there so it’s hard for me to be like you’ve been doing it on your own for a year before I joined and now we’re going to strip that right from you so what we’re actually trying to do is go back

(43:11) and look at all the teams that have access to tooling and um sort of audit and say which teams do we know we have confidence they know it and other teams that we’ve had less interaction and we’re trying to encourage them to come and just do some onboarding training to make sure that they know how to do it

(43:27) properly but we’re often hitting walls like we’ll look at it they’ll be like this whole test like we’re running into these bugs and errors and this thing doesn’t work we’ll often hear like the tool doesn’t work and then we go back oh my God yes you didn’t implement it properly and you set things up wrong and

(43:41) you missed this whole thing and so that’s where we went back and we’re like they need to have some basic um education I feel that so so hard where if anything went wrong anything anything it was always the tool and then we had requests of like there’s a particular way to implement the tool that we had and they’re like can we call

(44:00) the functions in a different order and it’s like no it’s literally my day-to-day we have conversations like that constantly it’s like oh we changed the order of the calls but um the tool doesn’t work or the tool is making the page uh have more latency than before it’s having page performance impact and we’re like

(44:23) but you changed the order so it’s not the tool you changed the implementation and now you’re having adverse effects from it so but again my team sometimes feels a little bit demoralized and frustrated like you know it’s always being blamed on us but I was like look if we want them

(44:41) experimenting yeah it’s sort of the gift and the curse like we have to sort of take accountability and just say like okay let us help you we’ll figure it out we’ll help you with that um because if we’re just like sorry you did it wrong and you’re on your own it’s just going to end up becoming a

(44:57) thing where people stop experimenting so um again I don’t have a great answer for how to combat that but it’s a common problem other teams deal with as well I think again it goes back to if we can create like a guild or like champions people that we know have like gone through it they understand it

(45:17) they get it and then they go out and live within those teams then I think we’ll have less of those scenarios I definitely in my past have had the most success when we’ve had a handful of those folks and they question everything like why isn’t this a test and they share the results and they

(45:35) inspire people like look it doesn’t have to be perfect we’ll just iterate into the the solution we’ll figure it out and and uh I think one of the hardest things is when we lose people like that yeah um that’s why I was talking about like how do you keep re educating because when you lose a champion then you’re like oh

(45:50) well the program is set back and you have to start over and it’s kind of like one of those things where your team gets a little bit demoralized as you mentioned um but at the same time like if everything was perfect we literally wouldn’t have a job so you’re gonna have to that’s true everyone’s running

(46:07) experiments perfectly like great you’re fired right so this is a very off tangent but I remember at one of my jobs where I did this sort of thing I thought to myself well I should be working myself out of a job in two three years if I do this right do I want to do this very well or not should I sabotage it on purpose to

(46:29) ensure I have a job should we go slower I have too much pride to do that but you know the thought did cross my mind um we could talk for a long time and I definitely uh peppered you with all sorts of questions and it was very cathartic because I feel like I’m living through you at the same time like

(46:46) I’m reliving all this stuff um but I want to give you an opportunity to share with our audience things that’s going on in your world and uh things you want them to know yeah um well again like before I I kind of answer and wrap that up again thank you for having me here and I had a great time um chatting with you it was super

(47:04) cathartic for me as well um so like I said I think there should be a support group for experimentation we’ll form it right after this yeah um but as far as what’s going on with me and what I think it’d be great for your audience to know is um I’m actually going to be out in June I’ll be in Europe um I’m

(47:22) excited to be participating in the Experimentation Elite conference that’s in Birmingham in England um in June and I’ll be uh doing a keynote there actually around Centers of Excellence and kind of what does that mean and is it the right thing for um an organization and then I’m actually going to be um after that I’ll be hopping over

(47:41) to Germany in Frankfurt and I’m going to be at the Growth Marketing Summit also doing a keynote there on um my favorite topic which is leadership buy-in um again I can have some demoralizing stories we can all relate to but I hope that it also inspires folks after each one you’re drinking just a little bit more

(48:00) yeah I mean that’s kind of how I think about like whenever you know there’s topics at conferences and if I have the you know privilege of speaking at them I tried to pull from um what has been actually difficult things for me where I want to throw my hands up and be like forget this and then if I can come out of it hopefully

(48:20) inspire other folks to kind of work through it because there’s usually light at the end of the tunnel but anyway those two conferences are coming up in June um so if your audience is going to be out there by any chance um in England or in Germany um feel free to ping me and let me know I’d love to meet

(48:39) them amazing um and I’m hoping that our audience checks you out and signs up and uh you know feels the pain that you went through and hopefully gets light at the end of it I just wanted to thank you for coming on the show um and uh yeah thank you so much yeah my pleasure thanks this is Rommil Santiago from

(49:00) experiment Nation every week we share interviews with and Conference sessions by our favorite conversion rate optimizers from around the world so if you like this video smash that like button and consider subscribing it helps us a bunch

If you liked this post, sign up for Experiment Nation’s newsletter to receive more great interviews like this, memes, editorials, and conference sessions in your inbox: https://bit.ly/3HOKCTK


Connect with Experimenters from around the world

We’ll highlight our latest members throughout our site, shout them out on LinkedIn, and for those who are interested, include them in an upcoming profile feature on our site.
