People are the biggest challenge featuring Michael St Laurent
AI-Generated Summary
Michael St Laurent, a seasoned CRO expert, shared his insights on large-scale testing. Here are 5 key takeaways for marketers:
- Prioritization is key: Large-scale testing requires prioritizing tests based on potential impact and aligning with overall business goals.
- AI is the future, but humans guide it: Mike emphasizes the power of AI in automating processes and analyzing data, but highlights the importance of human expertise and guidance.
- Data quality is crucial: AI models rely heavily on the quality of data. Invest in building a robust data infrastructure to ensure accurate results.
- People are the biggest challenge: Navigating internal politics and managing stakeholder expectations is often the biggest hurdle for large-scale testing.
- Building a testing culture is a win: The ultimate goal of large-scale testing is to create a data-driven culture that fosters continuous improvement and better user experiences.
AI-Generated Transcript
(00:00) in every one of those phases to try to make an improvement either to our ability to do that work faster or can we have a better quality output and using a combination of either AI or automation to try to improve those things is sort of what we're doing so I'd say like on the automation side the most common use
(00:17) case that I can give as an example would just be presentation generation [Music] welcome back to a new episode of the Experiment Nation podcast I'm your host Charlotte Bumford and along with me today is a very special guest his name is Mike St Laurent and he is a managing director of conversion.com so Mike do
(00:44) you want to introduce yourself further to the audience sure uh thanks for having me first of all um yes my name is Mike and uh I'm the managing director of Conversion North America the agency and a little bit about myself is I grew up in Vancouver Canada where I still live today and where we have one of our
(01:05) offices um and I think the I guess the most interesting way about how I got into the space was just initially through education like when I went to University was studying uh psychology and some economics is like the two fields that I was interested in uh and sort of wasn't able to decide which one to to go into and then ultimately I
(01:30) ended up taking a marketing course and that kind of was like the real Catalyst that changed everything for me that yeah it sort of was like the aha moment that marketing really is the culmination of psychology and economics together true yeah so from that point forward I I completely switched uh went deeper into
(01:49) marketing and you know sought out the the right marketing school for me and then uh went and you know completed my education doing that and that was I think the first turning point for me that got me closer to the space and then uh at the end of that I eventually uh did a few internships and tried out a few
(02:06) different marketing agencies for you know brief stints of time and then eventually found Conversion where I I still work and have been there for the last 10 years oh wow that's been a long time so you didn't really go to experimentation right away it has like you know marketing has a lot of umbrellas so you did like SEO or
(02:26) something like that more on the side of advertising and then went to experimentation yeah a little bit in like events I tried for a bit which I was terrible at and then uh yeah just like classic advertising agencies and then yeah eventually when I saw the the website initially I thought you know this is I I was
(02:46) almost in shock of like how has more people not thought of this as a you know something that's important for businesses to do and uh it was so clear to me back in like 2014 I was like this is the field I have to go into and I really feel like this is going to become very very prominent part of business and
(03:05) I want to try to be part of you know creating that so that was initially what what got me in there wow that's actually pretty big wow that's interesting because uh yeah most people um in the experimentation field probably have the same you know Pathway to the experimentation um anyway conversion.com
(03:25) big agency and I'm pretty sure you guys have huge projects like large scale projects so I want to go into that and dig deeper on how it is with large testing large scale testing projects um and uh yeah I'm just gonna check like I'm not sure if it's confidential or not like do you guys have like a um like per
(03:52) group of people or experimenters how many projects are allocated um to them uh well that varies dramatically I would say um I guess just to give a little bit of background on on conversion like we are I think one of if not the biggest agency in in the world doing testing so we're well over a 100 people dedicated just to experimentation
(04:18) and um to give a sense of scale like I'd say any day of the week we have you know over 500 experiments in production simultaneously that are being worked on either in like strategy design development you know live analysis um all sorts of things so um for a company like ours we are more of a
(04:40) lower volume of clients but work with generally larger projects in the market typically skewing towards Enterprise companies and so uh I think we you know might only have 20 or 30 customers at a time but then some of those customers will be companies that are just starting and they may be small
(05:01) and someone might be on to your question like three or four different you know Accounts at once um but then we also have accounts that are multi-million dollar annual retainers and it's it's they're like massive testing Investments that have tons of Staff they have infrastructure they have processes um and you know those are huge
(05:22) which which will sometimes have 10 people on one account not even like one person on 10 accounts it's like the the opposite oh wow wow so uh just because I've been to and I've spoken to a few people who have like you know the small- or medium-sized scale testing but I'm actually quite curious I'm pretty sure
(05:43) that all the audience right there out there is um curious as well on what's involved in like the multi-million dollar retainer space and the large scale testing this is Rommil Santiago from experiment Nation every week we share interviews with and Conference sessions by our favorite conversion rate optimizers from around the world so if
(06:03) you like this video smash that like button and consider subscribing it helps us a bunch now back to the episode yeah it's a bit of a a black box because not everyone gets exposure to it so it's uh always interesting to talk about I think there's there's really just like so much I think that um you can imagine for starters that a huge
(06:21) piece of it is just running more tests um and so that that's a massive piece is that companies that have that level of traffic like you think the fortune 10 Fortune 100 sized companies that you know they have tens of millions of visits a month the velocity ability to run tests goes way up and so right a
(06:41) huge portion of what goes into that is just high volume and so right that'll be scaling up from you know three four tests a month to five tests a month to 10 tests a month to 20 and just you know increasing the velocity of of that work um and that's one piece of it and I think that's the piece that's easier for
(07:01) people to understand because if you're working at a smaller scale that's what you're doing right you're running experiments um but then the Unique Piece that kicks in when you're starting to think about a large testing agency is that or a large testing company is that there's a whole piece of like organizational behavior and
(07:19) organizational design that goes into it and so I like to think of it as kind of there's two levels that there's the upper level which is really about how you design a testing program which is like an operational system that's installed in that business and then the other half is the actual production of
(07:36) how are you creating and and making experiments um and so some of these questions are like you know where should testing even fit in an organization if you're investing multiple million dollars into running tests effectively and you have an organization with tens of millions of visitors and you have
(07:56) thousands of employees right it's not just like that's just a cro person there's designers product analysts there's all these different you know departments that have to touch that and you know even the question of such a simple one like where should who should even own testing like should it be in product or should it be in marketing or
(08:16) should it be in analytics huge one yes should it be in engineering and so we really encounter and have to work with the customer to sort of decide where is the best place to put this then even working on like what kind of roles and staff should you hire what kind of processes should exist um what kind of
(08:33) documentation your technology stack and there's sort of all of these you know the like the operating system that goes into testing that um is the piece that's different um to what people might be used to oh interesting so um how would you how do you guys initiate this process now this client let's say like I have this you know x
(08:56) amount of visits per month this is my budget how do you how do you guys start yeah so the first step is just understanding the goals of the business that's that I think is always a good place to start for any sort of question like this is just what are they looking to do and to what scale are they looking
(09:14) to get to um because testing is not a one-size-fits-all like there's you know this is the way you should do it there's a ton of different use cases for doing experimentation and for some companies that's just that they want to drive revenue growth and increase their conversion rate like a classic objective
(09:33) sure and for others it might be that they're doing an entire redesign and they want to use testing as a way to de-risk different choices that they're making in their redesign um sometimes we have companies that are wanting to launch an entire new product line or an entire new business and they want to actually use testing to
(09:53) test the market and see what people respond to and start to build build new funnels and use it as like an innovation platform and so that's really the first step is like what is the goal and that's going to determine sort of how to design a good system inside of that company that can answer questions for them and
(10:12) we're also trying to get a sense of like how much risk does this business like are they willing to take you know are they in a hyper growth startup mode that they need to take a lot of risk and they're trying to find a position in the marketplace or they like a super established Blue Chip Enterprise and
(10:30) they have lots of market share and now they're just trying to squeeze more out of the position they're in yeah and those are going to be very very different so we try to gather that information and then be very intentional about designing a program that can be set up to meet those goals uh and sort of like what kind of tactical kpis we
(10:47) should be looking for and then we go down to the second layer of like let's build some tests to help you know reach the metrics they're looking to move so I'm guessing that means you develop the strategy as well um based on like what they need what the goals are what the objectives are where they are in
(11:04) their timeline and whether they're established or not and whether they're willing to take the risk that's actually really um interesting because um yeah uh I think like most um smaller scale um businesses would be on the other end where like I don't I I just want my test to run and um I want
(11:26) everything to win while more like larger scale I guess would be like okay I just want to learn and understand the risk some of the factors that's happening on the website or any type of um platform yeah um so in terms of like a a tool or a framework do you guys have a certain framework that you go with um and what
(11:49) are the criteria that you use to select that specific framework yeah for tooling I would say that because we're usually working with very established companies the technology stack is mostly determined already um and when you get up to a certain size of company they're almost all using Adobe um that's just sort of naturally where
(12:14) the market tends to go um and so I think that you know to some degree we'll do some tool influencing but it depends on the the case of like if someone doesn't have a tool then of course we'll start with doing some sort of parameter assessment of like what are they looking for and what are all their criteria and
(12:32) then go you know short list some technologies that would work uh and then on the framework side I think it's like it could be anything like I think we try to create Frameworks for all sorts of use cases and I know there's lots of other great companies uh you know in the space that are doing lots of great work
(12:48) creating Frameworks and so I'm more of a Advocate that it's it's a it's like how you're applying the Frameworks not the Frameworks themselves and and you know it's the same you could say for the tools that there's there's no Silver Bullet framework or technology that's going to drive better results it's kind
(13:04) of like you know with the testing tool we just want a place to put our JavaScript really and true that can be so many of the different tools and and we can really get to a lot of the same outcomes as long as we're smart about what we're doing I think for Frameworks it's the same that we you know they're
(13:19) they're in our toolbox and we can pull on them when it's necessary but um you know I wouldn't say that there's there's some that you have to use or don't have to use uh you know you can make good use of of all of them anything yeah exactly I think it goes down to you know this the strategy as well right okay with the
(13:37) amount of tests that you guys like are working on per month or even per week because it's crazy like with the amount of like clients and testing like especially with a large scale when where you've mentioned you'll have like four or five six or even 10 tests per month how do you prioritize because I'm pretty
(13:59) sure like during the times where you guys are meeting together and ideating all these wonderful tests like I'm pretty sure you have a huge list of ideas that you would gather um during that ideation right and yeah how would you um prioritize which ones to test first in that specific
(14:20) month yeah so I'd say the first thing we do which is sort of before the monthly period but I think at the beginning of a program when we're deciding where we can have the most impact is really to tackle this from like a mathematical business side which is just where is the money being made through this website today and starting
(14:45) with like the prioritization of okay you know the homepage is a certain revenue opportunity the product detail page is a certain revenue opportunity the cart or the checkout or the subscription page and actually calculating those things based on all the past testing we've done so we kind of have baselines of like what you know
(15:04) the impact could be what the minimum detectable effects should be for those different areas um and just prioritize firstly just business-impact wise where are we going to be able to you know generate the most revenue for this company and that's that's sort of the first step and then that determines where to focus on the site.
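To make that business-impact math concrete, here is a minimal sketch of the kind of calculation described: ranking site areas by revenue at stake and by the minimum detectable effect their traffic supports. The page areas, traffic numbers, conversion rates, and average order value are hypothetical, and it uses a standard two-variant normal approximation rather than Conversion's actual baselines.

```python
from statistics import NormalDist

def min_detectable_effect(visitors_per_variant, baseline_cr, alpha=0.05, power=0.8):
    """Smallest relative lift detectable for a two-variant test (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    se = (2 * baseline_cr * (1 - baseline_cr) / visitors_per_variant) ** 0.5
    absolute_mde = (z_alpha + z_beta) * se
    return absolute_mde / baseline_cr  # relative lift

# Hypothetical monthly numbers per site area
areas = {
    "homepage": {"visits": 4_000_000, "cr": 0.021, "aov": 90.0},
    "pdp":      {"visits": 2_500_000, "cr": 0.034, "aov": 90.0},
    "checkout": {"visits":   600_000, "cr": 0.280, "aov": 90.0},
}

for name, a in areas.items():
    mde = min_detectable_effect(a["visits"] / 2, a["cr"])  # 50/50 split
    monthly_revenue = a["visits"] * a["cr"] * a["aov"]
    # Revenue at stake if a winner landed exactly at the detectable threshold
    opportunity = monthly_revenue * mde
    print(f"{name:9s} revenue≈${monthly_revenue:,.0f}  MDE≈{mde:.1%}  opportunity≈${opportunity:,.0f}/mo")
```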
(15:22) Let's say the PDP for example and then when we're analyzing that experience there's a whole bunch of ways that we're going to prioritize those and and at conversion we use kind of like a half-human half-AI-assisted prioritization method so this is sort of something we've
(15:47) been developing more lately as the technologies improve but uh we essentially use partially just a classic voting scoring system which is like you know people score ideas based on criteria like PIE or ICE or PXL these types of scoring criteria and then we combine that with first using
(16:05) the impact data that we have and then second we combine it with all the tags that we use and so we have an internal technology that tracks all of the tagging for experiments and so when we put ideas into a backlog we're also going to tag them with like what kind of psychology principle might we be applying or what kind of uh
(16:28) conversion lever are we applying what component are we testing on you know is it B2B or B2C you know what industry and so on all these variables uh help rank it and so we then run all that through a machine learning model to try to have it estimate you know if a certain combination of those things exists can that lead to better you
(16:47) know hypotheses and better test results and I'd say that this is something under development right it's never going to be perfect but we have a system that can kind of guess whether it's going to be a winner or loser with like 60 to 65% accuracy today and that's that's pretty good like it's better than a human which
(17:06) is going to be closer to 50/50 as we've seen in other studies so I think we're not looking for something that's going to get to perfect because that's impossible but I think you know we're looking for ways of like how can we increase our batting average and prioritization using all of these different things is going to make a difference.
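As a rough illustration of the half-human, half-AI prioritization Mike describes, the sketch below blends a human PIE/ICE-style score with a win-probability estimate from a model trained on tagged past experiments. The tags, sample data, model choice (scikit-learn gradient boosting), and the 50/50 blending weight are illustrative assumptions, not Conversion's internal system.

```python
import pandas as pd
from sklearn.compose import make_column_transformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical backlog of past experiments: tags + human score + outcome
history = pd.DataFrame({
    "psych_principle": ["scarcity", "social_proof", "anchoring", "social_proof"],
    "component":       ["pdp_gallery", "checkout_cta", "pricing_table", "pdp_reviews"],
    "audience":        ["b2c", "b2c", "b2b", "b2c"],
    "human_score":     [7.5, 6.0, 8.0, 5.5],  # e.g. a PIE/ICE composite out of 10
    "won":             [1, 0, 1, 0],          # 1 = significant winner
})

features = ["psych_principle", "component", "audience", "human_score"]
model = make_pipeline(
    make_column_transformer(
        (OneHotEncoder(handle_unknown="ignore"),
         ["psych_principle", "component", "audience"]),
        remainder="passthrough",  # keep the numeric human score as-is
    ),
    GradientBoostingClassifier(),
)
model.fit(history[features], history["won"])

# Score a new idea: blend the model's win probability with the human score
new_idea = pd.DataFrame([{"psych_principle": "scarcity", "component": "checkout_cta",
                          "audience": "b2c", "human_score": 6.5}])
win_prob = model.predict_proba(new_idea[features])[0, 1]
priority = 0.5 * (win_prob * 10) + 0.5 * new_idea.loc[0, "human_score"]
print(f"estimated win probability={win_prob:.0%}, blended priority={priority:.1f}/10")
```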
(17:25) Yeah that that's actually the answer to my next question which is like do you have a way to automate any of this testing process because you know sometimes you have to be efficient with these things and I agree with you sometimes it can be subjective if it's a human as well because you need
(17:42) to have certain um criteria to figure out which ones have to be tested first I usually say impact and and effort you know would be the best but then again how would you know impact versus effort if you don't have those baselines don't have those levers and you don't have all these percentages that you want
(18:00) to um you know compare against and yeah um do you guys have like now I'm you know a bit curious are you guys developing any type of like um automation process on your end that uses AI that would help you like again improve this um prioritization process yeah I'd say that that automation and AI
(18:26) are like two of the most personally interesting areas for me and then also just for our whole team I think that you know everyone's very bought in on trying to make advancements in this area because I think that we at conversion we like really hate doing routine things that are um you know repetitive and and maybe not providing
(18:47) the most value and so we definitely look at our whole workflow and say you know we have to go through a strategy phase okay and then we have to go through a prioritization and a design phase and a development phase and QA and live and some analysis and that's basically the workflow right and so we then look at
(19:05) okay what could we do in every one of those phases to try to make an improvement either to our ability to do that work faster or can we have a better quality output and using a combination of either AI or automation to try to improve those things is sort of what we're doing so I'd say like on the automation side the most common use case
(19:28) that I can give as an example would just be presentation generation right it's not AI but it's just like hey we have all the data in our internal system we have a template they pretty much always look the same so why can't we just use the Google Slides API and pass the data through and just auto-create our strategy presentations
(19:50) and our results. I love it that's amazing. Like and that's not even a super complicated one so it's like getting the data in the system is the hard part but then hooking up an API to do that is certainly probably the most widely used form of automation we have and at least that gets the deck like 90% done.
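For the presentation automation he mentions, a bare-bones version might look like the sketch below: copy a template deck, then swap {{placeholder}} tokens for experiment data. The template ID, placeholder names, and test fields are hypothetical; the google-api-python-client calls (Drive files.copy, Slides presentations.batchUpdate with replaceAllText) are a standard way to do this, and Conversion's internal tooling is not shown here.

```python
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/presentations",
          "https://www.googleapis.com/auth/drive"]
creds = service_account.Credentials.from_service_account_file("creds.json", scopes=SCOPES)
slides = build("slides", "v1", credentials=creds)
drive = build("drive", "v3", credentials=creds)

TEMPLATE_ID = "YOUR_TEMPLATE_PRESENTATION_ID"  # hypothetical template deck

def build_results_deck(test):
    # 1. Copy the template so the original stays untouched
    copy = drive.files().copy(fileId=TEMPLATE_ID,
                              body={"name": f"Results - {test['name']}"}).execute()
    deck_id = copy["id"]

    # 2. Swap {{placeholders}} in the copied deck for real experiment data
    requests = [
        {"replaceAllText": {
            "containsText": {"text": f"{{{{{key}}}}}", "matchCase": True},
            "replaceText": str(value),
        }}
        for key, value in test.items()
    ]
    slides.presentations().batchUpdate(presentationId=deck_id,
                                       body={"requests": requests}).execute()
    return deck_id

deck = build_results_deck({"name": "PDP gallery redesign",
                           "hypothesis": "Larger imagery reduces hesitation",
                           "uplift": "+3.2% add-to-cart", "significance": "97%"})
print("Deck created:", deck)
```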
(20:06) obviously it will never be perfect and there's always like every test a little different you want to customize it but at least it gets you most of the way there and now you know rather than spend you know an hour putting a presentation together we can spend 10 minutes on the presentation and 50 minutes you know
(20:24) making the strategy better um so on the on the automation side that's probably the most the most common but we we certainly try to do lots of other things um amazing and then uh to your second question around the AI side I think this is an area that is obviously newer because the advancements have been you
(20:45) know humongous this year uh but I think that we're trying to develop you know prototypes for every one of those stages of the process and you know some of them are closer to being practical than others but it's kind of like okay well even if we can't make something today maybe we can at least set the
(21:06) groundwork in our data collection to make it so that later that might be helpful and so I I think it's first I would say that I'm I'm pretty Pro AI but I'm also very like pro- human guided work with AI assistance not AI replacing human work and so every time that we're thinking and I'm thinking about
(21:29) designing solutions or products that can be helpful there it's it's all about how can I integrate that into a human-driven workflow and so just a couple of examples of what I would think about is we could say okay if a normal part of an optimization process is to take a web page and
(21:47) perform an analysis of that page and like take a look at it and look for potential conversion barriers right it's very standard activity and we've been tracking that for 10 years in our database and so we have pictures of pages and we have all of the different barrier points that might be there and we have it written in standardized
(22:07) language and so in theory if all of that is there then the question I would pose to our team is why can't we have an AI-based image review right AI can read the images and we can feed it all of our history of all of the thousands of points we've done and and would it be capable of then identifying those things
(22:31) on a new experience right in theory possible and so those are the kind of that's like an example of something we're trying or maybe that we have you know 10 years of actual raw data from test results and we also pair those with all of the insights that we've uncovered and so in theory if I gave it a new data
(22:51) set and I gave it a model for statistics of like here's how we analyze it and then it looked at how we analyze thousands of past tests you know could it take a first pass at creating an analysis um and so I think like there we certainly have MVPs showing that you know you can get really close with that
(23:14) and it's not reliable enough that it can be used every day but it's certainly a direction that I believe is possible to make a lot of progress on and then you know rather than have a consultant spend you know hours trying to create the presentation and doing the basics it's like well we could maybe process like
(23:32) five or six more interesting segments in that time and get a way deeper analysis out of the same period of time if we can fast-forward through some of the things that can be skipped that's actually very interesting yeah but it relies on the data being there and the history of having it to be able to do that.
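As a sketch of the AI-assisted page-review idea, the snippet below feeds a page screenshot plus a few standardized findings from past audits to a vision-capable model and asks it to flag likely conversion barriers. The model name, prompt, file paths, and example findings are assumptions for illustration; it uses the OpenAI Python SDK's chat-completions interface with an image attachment, which is one common way to do this, not necessarily what Conversion has built.

```python
import base64
from openai import OpenAI  # assumes the openai Python SDK; any vision-capable model would do

client = OpenAI()

def review_page(screenshot_path, past_findings):
    """Ask a vision model to flag likely conversion barriers on a new page,
    grounded in (hypothetical) standardized findings from past audits."""
    with open(screenshot_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    examples = "\n".join(f"- {finding}" for finding in past_findings)
    prompt = (
        "You review landing pages for conversion barriers. Here are examples of "
        f"barriers we have documented on similar pages in the past:\n{examples}\n\n"
        "List the most likely barriers on the attached page, using the same wording style."
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

findings = review_page("new_pdp.png", [
    "Primary CTA below the fold on mobile",
    "Shipping costs not disclosed before checkout",
    "No social proof near the add-to-cart button",
])
print(findings)
```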
(23:48) Yeah yeah wow wow that's actually quite um you've explained it really well that you know it's good to have because you already have all this data and usually what AI does is it refers back to that historical data anyway so it's it's good that you're thinking you know wider term in terms of
(24:10) like integrating AI into the process but um have you found any challenges during the process of either integrating or just creating these MVPs on um again like putting that AI into your you know analysis part or your design part or the you know creating the PowerPoints part yeah it's by far what you said
(24:36) which is having quality data is the challenge and so it really is like a you know garbage-in garbage-out situation that if our tagging architecture has been poor for a period of time or we're getting lazy about the way that we're writing things in our analysis and that's what is getting used then our training model is no good and
(24:57) so when our team is building these things we're probably spending 80% of the time on data quality and only 20% of it actually building like AI-based tools using the AI APIs um like that part's actually quite easy it's getting the data and getting all the humans to put the data in consistently that is the challenge.
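One small way to enforce the tagging consistency he's describing is to validate backlog entries against a controlled vocabulary before they ever reach a training set. The tag names and allowed values below are hypothetical placeholders, not Conversion's real taxonomy.

```python
# Hypothetical tag schema; a real taxonomy would look different
REQUIRED_TAGS = {
    "psych_principle": {"scarcity", "social_proof", "anchoring", "loss_aversion"},
    "conversion_lever": {"trust", "clarity", "urgency", "relevance"},
    "component": {"hero", "pdp_gallery", "checkout_cta", "pricing_table"},
    "audience": {"b2b", "b2c"},
}

def validate_idea(idea: dict) -> list[str]:
    """Return data-quality problems; an empty list means the idea is clean
    enough to be used as training data."""
    problems = []
    for tag, allowed in REQUIRED_TAGS.items():
        value = idea.get(tag)
        if value is None:
            problems.append(f"missing tag: {tag}")
        elif value not in allowed:
            problems.append(f"unknown value for {tag}: {value}")
    return problems

# Flags the typo in conversion_lever and the missing component tag
print(validate_idea({"psych_principle": "scarcity",
                     "conversion_lever": "urgensy",
                     "audience": "b2c"}))
```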
(25:21) Yeah so it's kind of funny that the robotic part is easy but it's the part that um you know has to get everyone to put in the data that really is is a challenge yeah that's interesting it's good to touch um on AI every now and then because it is a big part now I think moving forward AI is going to be
(25:42) big in the future but I agree with you that we don't want to have AI replacing humans but it's going to be humans being guided by AI um to to become more efficient but moving on I just want to know because again like with the amount of testing that you're running um any best practices or any tips that you can um
(26:05) provide the users in terms of or the the listeners in terms of like managing the the test managing the test data how are you guys like making sure um like if in case it's a let's say a government website or something where there's like highrisk and privacy and security um is there any um things that you do or best
(26:28) practices that you do that you can provide some tips for yeah I'd say that this obviously super important and we do work with a lot of large Enterprises and many in like even the fortune 10 and so those companies obviously have very strict data privacy regulations and and we are you know under scrutiny for that
(26:51) regularly and so we certainly get vetted every time we get onboarded and we have to you know hold a certain standard for what we do to handle any data that appears you know in our purview and so we I I'd say that mostly the ways that we're handling this is like it's a lot through restricting access so there are
(27:13) you know cases right where you know it's only on a need-to-access basis or we have to do very strict archiving rules about like what we store very strict about what we back up and what we don't back up um very strict about using hashes to anonymize access to things um right when we are no longer working with
(27:36) a customer we have to go through protocols to do all that stuff so I'd say it really varies from company to company what their requirements are but yeah um it's always a good idea you know regardless of company to treat the data with a high degree of security and so we try to do that for everyone even if they don't ask us.
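On the "hashes to anonymize" point, a common pattern is a keyed hash, so raw identifiers never need to be stored but records can still be joined. A minimal sketch, with a hypothetical environment-variable secret standing in for a real secrets manager:

```python
import hashlib
import hmac
import os

# Hypothetical per-client secret; in practice this would live in a secrets manager
PEPPER = os.environ.get("ANON_PEPPER", "rotate-me").encode()

def anonymize(identifier: str) -> str:
    """Replace a raw identifier (email, user ID) with a stable keyed hash so
    records can still be joined across tables without storing the original value."""
    return hmac.new(PEPPER, identifier.lower().strip().encode(), hashlib.sha256).hexdigest()

print(anonymize("jane.doe@example.com"))   # same input always maps to the same token
print(anonymize("Jane.Doe@example.com "))  # normalization keeps the join key stable
```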
(27:53) That's interesting okay well um I'm just curious as well cuz again like you're conducting on a large scale setting are there any challenges like common challenges or a common roadblock that you guys usually hit when you're conducting
(28:14) this amount of tests so the the biggest roadblock honestly is is the people and the internal politics of the companies we work at by probably a mile I would say that technically there's always some roadblocks on the technical side but technical roadblocks have answers and once you answer them they're just solved and that's the
(28:42) beauty of the technical roadblocks but when we're dealing with large testing programs that are interfacing with probably you know 50 plus employees that is by far more challenging to work with and it's a navigating of you know who's our stakeholder how can we set them up to be successful with the program what kind of
(29:05) people are going to be on their side that are going to be allies and are going to work with them versus you know who's going to be a detractor and probably going to stop something and um you know what happens if people leave and you know companies change people they change leadership they change priorities they change timelines then
(29:22) you have to onboard them again yeah you have to onboard them there's all sorts of changes that happen and and not everyone's going to believe in what we're doing so some people will be anti-testing some people will be pro-testing um and every time we do something right it's going to be that we'll launch a
(29:38) test that'll be make some team really happy but it's going to make another team angry and so really like juggling that uh is is by far the the biggest challenge at testing at a large scale is is the people it does feel like a tug of war sometimes like yes no you know and then the resources go to from one place
(29:56) to another I yeah I'm just curious how would how would you address those type of um issues like again like you've mentioned time constraints the deadline changes oh okay now we have to launch it this week instead of next week you know budget limitations and then um also again like I just mentioned resource
(30:14) allocations like how do you address that with the stakeholders the the first thing is is putting I think the right person to to help and so I think when when we're hiring for our team and also for anyone who's you know looking for jobs in house like you really do have to be a very skilled negotiator good Persuader you know have
(30:36) very good interpersonal skills and be able to succinctly sort of present your ideas and that's a huge piece of I think what makes someone successful in those roles and then to your question about constraints and budget limitations and resources is is that those are definitely challenges and a lot of
(30:56) companies will think at least when they're testing at a smaller scale it's like I have those challenges because I'm small and we don't have enough resources but the reality is that those challenges are there at every size of company even the biggest ones in the world they are still suffering from budget limitations
(31:12) resource allocation and time constraints even on a large scale testing program they're still constrained right because like everyone in the nature of competition is trying to do as much as they can with the resources that they have and and those things never change whether you have one person or a hundred people and
(31:27) so there's always going to be more ideas than you can execute on and so I think like that's really the only solution to this is prioritization right it's it's focusing on that no company is ever going to be able to do all the things they want and so we try to push that narrative that you know prioritization
(31:47) is is the ultimate way to handle challenges with any of those things we might only get to place like five bets each month and we want to make sure that when we're doing that we're putting the best bets forward that we can and that's going to come at the expense of other
(32:04) things that we're going to choose to say no to um but then we're going to use data to make good decisions based on those outcomes and you know lead lead us towards better bets in the future and and just trying to convince companies that that's a good philosophy for business is is a huge part of of the job
(32:19) yeah now um I think like I want to know then how would you define a successful testing project in a large scale setting you said that you have KPIs and all that but how would you say that it's truly successful yeah I'd say that uh honestly the the best indicator at the highest level is just that it leads to more
(32:49) testing I think that's really the simplest explanation that I think that companies often will get tied up in sort of like a bad decision-making cycle that's around opinion and and gut feeling and really what we're kind of doing is like injecting an antidote into their bloodstream and it's like you know
(33:13) we start small and then we're trying to almost infect like the company with this new attitude of using data-driven decision making as a way to to make better choices and so our job is really about like fanning the flame that we start and you know helping our contacts and those teams build influence inside
(33:31) the organization which naturally will lead to more testing and that'll usually start in like maybe just a little marketing program where we're just running a couple tests a month and then that might lead to like the design and product teams being interested and then that might lead to you know the engineering team being interested and uh
(33:48) and then suddenly like more and more of the organization is testing and and that ultimately is is the win that you know those companies are going to perform better because they're going to make better choices on average and then you know it will also create better interfaces and experiences for users
(34:03) which is you know a huge win I agree it also stops you know office politics in a way because you have those people like oh this is my subjective opinion or this is like the the general rule or something like that but if you have a company like you've said influenced by
(34:19) data-driven decisions then I think it's already a win by doing that you know yeah I agree awesome so um thank you so much but probably one of my last questions would be um uh what would your advice be for newcomers in this field yeah I think that uh I'd first say like
(34:52) ask ask for help and and you know become part of uh you know a community and I think that like what Rommil and everyone at Experiment Nation is doing is a really great example that there's you know there's Slack channels there's LinkedIn groups there's lots of support networks for anyone who really wants to get
(35:10) involved and everyone that I've met in the industry is so nice and and welcoming and willing to share that information and I know I I try to be when I'm asked as well uh and so I think a lot of people are just afraid to ask or afraid to go make those connections and I think that's you know the first
(35:26) piece of advice is just to go do that um and and you know you you won't be let down by that people you'll find someone who will be willing to help you um and then my second piece would just be to to you know follow people on LinkedIn and there's amazing like brilliant people out there that are posting really really
(35:45) interesting stuff like I I've been working uh right 10 years at one of the highest end testing agencies out there and I still I'm on LinkedIn every day and I'm exposed to something where I'm like oh that is so cool and like I literally just bookmarked something like two minutes ago from someone I'm like oh
(36:03) that's such an interesting take I got to go dig into this and uh right I'm still learning so much in that platform and if if I can still be learning after all this time like so much every day that you know that's a huge place to start to fast forward growth for anyone who's starting so I would highly recommend
(36:18) that it's uh there's a lot of bad advice on that platform but there's also some brilliant people if you can figure out who they are yeah that's true I think you'd be able to figure them out if they're watching the Experiment Nation podcast right exactly exactly um any final thoughts you want to uh share about the large the the
(36:37) things that you do the large scale testing the automation process and AI I think we pretty much covered it I think just I keep testing and keep trying to to uh you know make it as effective and you know efficient as possible I think that's that's my goal and hope other people in the space are also trying to you know get better
(36:58) quality outcomes and and less effort and you know we keep sharing those together in in the field then we can push the whole industry forward which I think is ultimately important awesome and how can our listeners get in touch with you uh I would be one of the nicest people to follow on LinkedIn I think so there you
(37:14) go that's the place I spend the most time and if you want to shoot me a message then then that's the place to do it awesome so thank you so much Mike for being in our podcast we we are so honored to have you here um yeah so we're going to say goodbye to our listeners and um stay tuned for the next one right thanks for having
(37:34) me this is Rommil Santiago from experiment Nation every week we share interviews with and Conference sessions by our favorite conversion rate optimizers from around the world so if you like this video smash that like button and consider subscribing it helps us a bunch now back to the episode
If you liked this post, sign up for Experiment Nation's newsletter to receive more great interviews like this, memes, editorials, and conference sessions in your inbox: https://bit.ly/3HOKCTK