Navigating App Optimization with Ekaterina Gamsriegler

AI-Generated Summary

Ekaterina (Shpadareva) Gamsriegler, a seasoned growth and marketing expert, shared her insights on mobile app experimentation. Here are 5 key takeaways:

1. Mobile experimentation is more complex: release cycles are longer, and dependencies on app versions can slow down learning.
2. Think beyond the app: experimentation should encompass ads, store listing pages, and the app itself.
3. Borrow insights from other areas: use data from paid advertising to inform store listing optimizations.
4. Interconnectivity is key: changes in one area can impact others, so strong stakeholder management is crucial.
5. Document everything: detailed documentation helps track results and avoid surprises.
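The "document everything" takeaway can be made concrete with a simple experiment record. The sketch below is an editorial illustration, not something from the episode; all field names, metric names, and values are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ExperimentRecord:
    """Hypothetical documentation template for a mobile app experiment."""
    name: str
    hypothesis: str
    primary_metric: str                              # the one metric the experiment should move
    secondary_metrics: list = field(default_factory=list)
    guardrails: list = field(default_factory=list)   # metrics that must not regress
    tradeoffs: str = ""
    result: str = "pending"                          # filled in after analysis

# Example record, loosely modeled on the screenshot experiment discussed below
record = ExperimentRecord(
    name="screenshot-language-names",
    hypothesis="Naming the taught languages on screenshot 2 lifts download conversion",
    primary_metric="store_conversion_to_download",
    secondary_metrics=["trial_opt_in_rate"],
    guardrails=["subscriber_churn", "uninstall_rate"],
    tradeoffs="More text may crowd the visual on small screens",
)
print(record.primary_metric)
```

Writing down primary and secondary metrics, trade-offs, and guardrails before launch is exactly the practice described later in the transcript.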

AI-Generated Transcript

(00:00) So this does work. And regarding the mobile-specific challenges, I think the first one is really connected with what I mentioned before: the longer release cycles, and the dependency on a particular app version going live before you can start getting the insights and the numbers. So

(00:22) there are some solutions to that... Hey Ekaterina, welcome to the show. Hi Tracy, pleasure to be here. Yes, so happy to have you here. I know we've been talking back and forth trying to get on this call for a while, but I'm so glad you're here. I read your bio, and a lot of what you do is very relevant to a

(00:49) lot of the people in this community, so I know they're going to learn a lot from you, and it's great for you to be here. Yeah, I hope so; I hope we can share something valuable. I know you will. But can you let us know a bit about yourself? Who is Ekaterina? What brought you to

(01:06) where you are today? Give us the whole rundown; we'd love to know more about you. I'm Ekaterina, and I've been working in growth and marketing for about 15 years by now; it will be a little anniversary soon. I started in search engine optimization and then moved to digital marketing, a little bit more of an

(01:30) all-encompassing field. From there, mobile marketing was actually picking up at that time, and I wanted to see how this works as well, so about seven or eight years ago I moved to mobile marketing. Then I was switching between digital and mobile here and there, but mostly sticking to mobile apps.

(01:49) And during this time I've been working across all parts of the funnel: in user acquisition, App Store optimization, CRM, also being responsible for the creative production of advertising assets, and I also did a bit of product management. In recent years I'm mostly leading marketing and

(02:14) growth teams, but I have worked on pretty much the whole funnel during my career. That's great; it sounds like you went from specialist to generalist, back to specialist again. I still think currently I'm trying to be more of a generalist, but definitely a lot of the work

(02:35) that is happening is hands-on; I would say I'm still fairly tactical. But yeah: a specialist, then horizontal growth, and in recent years a bit of vertical growth as well. Well, it sounds like now your focus is in mobile app optimization and experimentation. Why did you decide to go in that

(03:03) direction? Well, mobile at first was really attractive to me because the attribution was very straightforward and clear. It was just fascinating how you could track everything very reliably, and how you could look at user behavior from acquisition to conversion to

(03:23) renewal in so much detail. This was a level of granularity that I did not see on the web back then. And overall, experimentation has been a super integral part of each of the roles I've had in the past, because I would really split it into multiple

(03:46) buckets: you can experiment a lot on the acquisition side; you can experiment with every email and every email journey you're creating, because it's a funnel of its own; and on top of this you can additionally experiment in the product, with a bunch of features

(04:03) and screens in the app, to try to improve user behavior and deliver more value. So experiments have just been there in every role. That's so interesting. I've never tested on an app before, so the whole thought of going from web experimentation to mobile experimentation, it's just such a blank...

(04:27) It's like a blank page for me; I have no idea what to even expect with a mobile experiment. Is the process for experimentation any different? It sounds like attribution is way different, but what about the actual test design, the actual testing process? How does that differ from optimizing or experimenting on a website, for

(04:50) example? I think the process, which for me feels like the most critical part to nail down and get right, from hypothesis generation to actually measuring the results, analyzing them, and sharing those, is fairly similar to web

(05:10) experiments. You still do a lot of user research. The setup itself will differ: different platforms, different tools, different SDKs. That is quite a change. And on the act of releasing the experiment and rolling it out, there is quite a change as well that should be mentioned,

(05:34) because if you have a native app, you tend to depend on your app developers to write the code and implement a certain experiment. And not only this: with web you can deploy a lot of changes fairly fast. If you have a bug, you catch it, you make a change, it goes

(05:55) to production, and the new web version is live. With apps it's a little bit more complex than that, because not only do the developers have to implement a certain experiment, the new app version then has to get released, it has to go through approval in the stores, and then users should start

(06:16) downloading this new app version where the experiment is live. So this naturally takes way longer. And if you work in weekly or bi-weekly release cycles, and especially if you have a bug at some point, then you should always keep in mind that it might actually take another sprint to

(06:35) get rid of that and to release the actual experiment. I think this part is the one where it differs significantly, so it requires way better roadmap planning. There are tools that can help you streamline this process these days, but not for all experiments, only

(06:55) certain types of those. I think this is the main difference, probably. Hi, this is Rommil Santiago from Experiment Nation. If you'd like to connect with hundreds of experimenters from around the world, consider joining our Slack channel. You can find the link in the description. Now back to the

(07:09) episode. Do you also experiment on the App Store pages as well? Yeah, for sure. I would really split the experimentation into three buckets, more or less. The first one would be the advertising type of experiments; these are roughly the same on web and mobile. You test different channels, you

(07:33) test different creative concepts, different targeting and optimization setups. Of course the channels and the types of campaigns are different, but the idea is the same: you want to drive users either to your store page or to the web as cheaply and as effectively as possible. Then the second bucket would

(07:52) be pretty much what you mentioned, which is store listing pages. This is everything pre-download. On the web it would probably be similar to the pre-signup experience, where you also create different landing pages, redesign the homepage or lead magnets, and you drive users there to see

(08:13) how the actual page performs. It's similar to store listing optimization, where we optimize the visuals and experiment with different copy, icons, feature graphics, etc. So this would be the pre-download, pre-signup stage. And the last one would be the experiments in the actual

(08:37) product. Awesome. I didn't even consider ads, which is so silly, because you would do that for a website as well, but I didn't see it as these three distinct areas to optimize: the ad, the listing page, and the app itself. How much would you say you focus on each of those parts? The good thing is

(09:01) that you typically have different teams and people focusing on each of these parts simultaneously, because it's just critical to make sure that not only is your product retaining and converting, with users getting value out of it; on top of this you also need to make sure that you are, at any given

(09:24) moment, driving users to your product as effectively and ideally as cheaply as possible. I mean, you need quality users, of course, so not cheap at all costs, but you get what I mean. So I would say this is usually happening simultaneously, and not every person or every

(09:46) team will be focusing on this experimentation 100% of the time; there will always be the core work that needs to be done as well. But at any given moment you will have experiments running on all sides: you might have new advertising concepts tested in parallel, you'll have a new advertising channel

(10:07) being tested, and at the same time you might have a new, I don't know, purchase screen in the app being tested as well. I'm so curious about the types of experiments that get run on apps and also on the store listing page. I know you mentioned images and creative. What are some of

(10:28) those experiments that you're most proud of when it comes to the app experimentation itself and listing pages? So at the top of the funnel, I would say it's mostly my team members that are currently experimenting with this, on the advertising as well as the App Store optimization side, because

(10:52) I have team members that focus specifically on this. And it's still super interesting to see results that you don't always expect; sometimes even super minor changes can bring something interesting. One of the examples I like sharing is when we changed the screenshots. We

(11:18) experiment with the screenshots quite a lot, and at some point we placed the names of the programming languages on one of them. There is this concept of the first impression frame: the first roughly two and a half screenshots that you see, because these are the ones that most of

(11:38) the users get exposed to, and these are the most important ones. So we placed the names of the programming languages that our app teaches on the second one, and it gave us a boost of about 50% in conversion to download. The assumption back then was that, in

(11:59) addition to writing this in the title and description, we should also stress it visually and make it easier for users to process what the app is actually doing. And it worked. At the same time, there were some experiments which also brought good results but were completely unexpected.

(12:21) Like when we had, for a very long time, super colorful backgrounds in the screenshots, because they're eye-catching and look nice and fun, and at some point we decided to change them to a super dark background. I think it's called Space Blue; my designer keeps reminding me of that. So we

(12:48) changed it to that, and in addition we also added some lines of code in the background, very blurry, and it also worked really well. The assumption we had, again, was that programming is just more associated with a dark background, and that's why we could again add a bit to this visual

(13:09) recognition with a little bit of coding lines running there. It performed much better than the colorful ones, even though if you look at the education vertical overall, you would probably notice that they really go after bright colors, and this works for them. Yeah, maybe in the

(13:27) process of trying to be so eye-catching, everybody ends up looking the same, so when you actually have a bit of a more muted or minimal concept, it stands out. Do you think there's maybe something to be said about that? Yeah, for sure, there might

(13:51) be something to that, because this is the tricky part: you might have your initial hypothesis, and we might think, yeah, it probably worked out because of that, but there might be hidden variables, or it worked out for a completely different reason; but just because the results match what we envisioned for this experiment, we stick to

(14:07) that explanation. But I think there is a lot to what you're saying, because we do look at the competition and we see what kinds of things they're experimenting with. At the same time, you can never be certain that each of us knows what we're doing; it might be just a random

(14:27) experiment that is not going to work out. So copying the competition blindly, just because the whole vertical looks this way: there might be something to it, but it doesn't have to be a win in your case. I'm actually curious then, because you did bring up research. For listing pages specifically, how do you get qualitative

(14:49) data? I assume you can't put heatmaps on the page or run a survey on the listing page. Is that correct? Yeah, that's correct, you cannot do that. I think one of the best ways is to just talk to users, to show them your store listing page, or some variations of it, and to dig a

(15:14) bit deeper into what resonates and what doesn't. It's a little bit of a leading tactic and strategy, because it's not exactly how users really make a decision about downloading the app, right? They can of course talk about it, and they can provide various arguments for

(15:33) their decision; at the same time, it doesn't mean that in the wild this is exactly how the user is going to think and exactly what they're going to do. So in my experience, it's very tricky to get qualitative insights for this level of testing, at the very top of the funnel.
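Since qualitative signals are hard to get at this stage, store-listing decisions usually come down to quantitative A/B results. As an illustration (with entirely made-up numbers, not figures from the episode), two listing variants can be compared on download conversion with a standard two-proportion z-test:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test on download-conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 4.5% vs 5.1% conversion on 20k store-page views each
z, p = two_proportion_z(conv_a=900, n_a=20000, conv_b=1020, n_b=20000)
print(f"z={z:.2f}, p={p:.4f}")
```

In practice the store consoles run this kind of comparison for you; the sketch just shows what "which version performs better" means quantitatively.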

(15:54) That's why regular A/B testing makes it much easier: we have a hypothesis, we just launch an A/B (or A/B/C) test, and we see which version performs better. So there are quantitative insights, but getting the real insights from users in

(16:18) terms of what they like or don't like, or why they would or wouldn't download the app, this kind of research is a little bit tricky. Yeah, and there is something to be said, I think, about doing research in the places where you can get high-quality and comprehensive research, and then trying to adapt it to

(16:42) the places where you maybe can't do research as rigorously. I do think there is value in that, and it sounds like that's kind of your approach when it comes to listing pages, for example. And you called out a couple of challenges there too. I'm curious: what are some challenges that are specific to mobile

(17:03) experimentation, or mobile app experimentation, that you don't necessarily see in web experimentation? I'll answer the first part first: what I think works pretty well is, for example, testing different value propositions and different copy on the

(17:28) screenshots. Where you can get a lot of inspiration for this is your paid advertising, because in paid advertising you have a lot of different concepts, hooks, and copy being tested all the time. So very often, if we see an ad that is performing

(17:49) really well, and we see that this particular hook resonates with the audience, that's also something you can move onto your screenshots, to see if this will be the case for the wider audience that gets exposed to your store listing as well. So there are ways

(18:06) to borrow the insight from one bucket of experiments and apply it to another. So this does work. And regarding the mobile-specific challenges, I think the first one is really connected with what I mentioned before: the longer release cycles, and

(18:28) the dependency on a particular app version going live before you can start getting the insights and the numbers. There are some solutions to that, because in recent years there are tools that can help you run experiments without necessarily waiting for the app

(18:49) version to be updated. They mostly cover the pricing and purchase screen, like paywall experiments, but this can still be a huge relief for you, because this way, first of all, you save the time of your developers and product designers, who do not necessarily

(19:10) need to work on these features. In addition, you can empower your marketing or growth team members to be independent when it comes to launching and analyzing these kinds of experiments, again without involving anyone else. So this basically helps you iterate much faster. Within a

(19:31) given time frame you will probably not necessarily have that many more wins, but you will have way more learnings, and this is the key to success. Yeah, well, you have to learn from failure, of course. Exactly. It really helps to speed up a lot of things, and this is

(19:53) critical for startups and for small developers. Another area which is challenging, I would say, and which for me is slightly different from how it works on the web, is that a lot of things in apps are way more interconnected. On the web you might have a much clearer

(20:15) split between what marketing does, what the sales team does, and what the product team does, or a mix of sales and product, depending on how you grow and whether you're a sales-led or product-led company. But still, typically there is this point, maybe a lead and a cost per lead, where the

(20:36) marketing realm kind of ends. In apps, I would say marketing teams tend to be responsible for revenue, and there are so many variables that determine that. Which means that if one of the teams, like the core product team, makes a change to the activation flow, and the end of onboarding is where

(21:02) we show our paywall, we will probably see the impact on the trial opt-in rates. And if another team decides to maybe remove a feature, or experiments with retention, and we see a drop in that, then we can also expect that we will probably see higher subscriber churn at some point. So these things, I think, are

(21:26) much more tightly connected on mobile, which means you have to be really good at stakeholder management, in order to be aware, on any given day, of what's happening in multiple parts of the product. What experiments are running here and there? This experiment

(21:50) in onboarding, can it impact us negatively? If so, how much? Are we aware of this? Have we accounted for this in our paid advertising spend? It's a lot of juggling to make sure you're aware of everything that's running, so that there are

(22:12) no surprises in the end. And sometimes it's also really hard, if you're looking at some cohorts several months later, to attribute changes in the bottom-line metrics to a certain experiment, because it kind of creates a waterfall of subsequent changes and impacts. Any given experiment

(22:34) can impact two to three core metrics, and that's why it's super critical for us. It's good that we work asynchronously at Mimo and document everything very thoroughly. When pitching an experiment, and before launching it, we always come up with the primary and secondary success metrics that we expect the experiment to

(22:55) move, and also trade-offs and guardrails. And when we're analyzing the results, we also document them very thoroughly, to see how much they match the expectations. It's a lot about syncing and sharing as much context on the metrics moving as possible. We have bi-weekly bulletins,

(23:17) or bi-weekly digests, so that every team knows what every other team is working on. We have shared roadmaps, and we regularly sync with the teams and the PMs, just to be aware and have as much context as possible. Yeah, that's super important, because I'm pretty sure most,

(23:40) if not all, of the people listening to this have been burned by a miscommunication that ends up breaking an experiment, or ends up completely shifting the results in a way that leaves everybody asking: what happened here? Oh, it's because we didn't communicate that a release went out. Or, I see it in e-commerce with promotions; they

(24:03) always mess up test results. So how do we get around that? Communication, stakeholder management. And it's a good thing we have people like you around to take care of all that and see all those moving pieces. Yeah, I mean, it's still not always working out perfectly, right? But I think it's important that

(24:22) every team member is just sharing as much context as possible in advance. Because the typical situations, not in my current company, but in examples I've seen before, were like: the trial opt-in has, I don't know, dropped in half, and it's because we are testing

(24:42) a new paywall. Or the trial opt-in dropped by 30%, and it's because we removed one of the tiny bullet points in the list of pro features that we are selling, because we didn't think it mattered, but apparently it does. Right, yeah, it's always interesting finding that out, or finding

(25:04) out the reverse: this thing that we thought was really important is actually not important at all. I see that a lot, especially in e-commerce. I see this very often in lifecycle marketing; it's also because I'm experimenting a lot with that, and I love it. I really love lifecycle marketing. There are so

(25:27) many things you can optimize: first of all, there are different flows, different journeys; within every journey there's the timing; then there are the actual emails or push notifications; then there's the copy, the headline; do I put an emoji there, or should I address the user by name, right?
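With that many overlapping lifecycle experiments, a common building block is a stable, deterministic way to bucket users into variants or into a global holdout that receives no messages at all. A minimal sketch (an editorial illustration; the hashing scheme and names are hypothetical, not how any particular tool does it):

```python
import hashlib

def assign_bucket(user_id: str, experiment: str,
                  variants=("control", "treatment"), holdout_pct=0.1):
    """Deterministically assign a user to a global holdout or a variant.

    Hashing (experiment, user_id) yields a stable value in [0, 1]:
    the first `holdout_pct` slice is the holdout, and the remainder
    is split evenly across the variants.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    u = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    if u < holdout_pct:
        return "holdout"
    idx = int((u - holdout_pct) / (1 - holdout_pct) * len(variants))
    return variants[min(idx, len(variants) - 1)]

print(assign_bucket("user-123", "onboarding-email-timing"))
```

Because assignment depends only on the user ID and experiment name, a user lands in the same bucket on every send, which is what makes the holdout comparison she describes next meaningful.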

(25:45) And it's so easy to get super deep into it and to be running, I don't know, 10 or 15 experiments at a time. What I recently started doing is actually running holdout tests, to see if having a whole journey disabled, having several of the messages disabled, makes any difference.

(26:07) And surprisingly, and sadly, sometimes it just doesn't. This doesn't mean lifecycle marketing doesn't work; it just means that clearly I'm not delivering as much value through it as I was hoping, and that means there's so much more work to be done here from the

(26:28) ground up, because obviously it could be working much better. Yeah, I think there's so much to be said about that, because personally we're bombarded by so many notifications all day, every day, but as the marketer or the experimenter creating those messages, we think it's so

(26:49) valuable, and then we find out it's actually not. The visitor may actually just want less, but have that lesser amount be more valuable. So quality over quantity; it's such a trade-off. I have a past in lifecycle as well, and there was that question of: okay, there's an email flow, do we send more messages, or do we send fewer

(27:13) but make sure they're value-packed, and hopefully get answers to that. So I love this. Maybe I'm going to go specialize in mobile app optimization now, who knows. But if someone did want to get started in specializing in mobile app optimization or experimentation, where would you recommend they go first to get

(27:37) started? That's a good question. I don't think anything's going to beat practice. What I see people doing, even if they're not experts on mobile, is they just take on a side project, or a new client that is more mobile-first, and they dig into it, and they involve some external help, mentors, to

(28:01) advise them a bit and guide them a little bit better. Before I actually switched to mobile marketing to begin with, I had zero experience with it; the whole vertical was just appearing back then, and it was very early days. But honestly, I'm

(28:22) not sure if there are mobile-first, mobile-specific experimentation resources that exist. You can of course read the technical documentation for running experiments in certain tools, or for analyzing them, but this is super easy to catch up on when

(28:47) you're already doing the work. So honestly, I don't think I even have a good answer to this question. No, that's all good, because a lot of us tend to just fall into experimentation by learning one thing and then falling into the rest. So if mobile app experimentation is kind of

(29:09) the same thing, I think that's a good thing for getting people to explore it, without too many barriers to get into it. Sorry, go ahead. No, I would say, product-wise, for sure, I think you can always extrapolate your experience with web

(29:29) experimentation into mobile. With product experimentation it's not that different: screens, flows, onboarding, activation, retention, features. This is where the experience can be extrapolated from web to mobile. So if you have experimented with things on the product side, I think there is not that much

(29:51) difference, apart from the tech stack that you will find in mobile. It's similar with the advertising bucket, right? Probably the only part that is different is the web experiments that you run on landing pages, on the website, the homepage, etc., versus store listing experimentation. So this part is slightly

(30:15) different, but I also believe that the tools we have these days for mobile, like the stores themselves, are super straightforward and fairly easy to pick up as well. Nice. Well, thank you so much for giving a different look at a part of experimentation that at least I have never seen before or

(30:37) experienced, and I'm sure many of the people listening feel the same way. Is there anything going on in your world right now that you want our listeners to know about? Not much going on, but I would say that I really like figuring things out, and if anyone has an issue,

(30:58) challenge, or problem that they're trying to solve, especially with mobile, don't hesitate to reach out on LinkedIn. I'm trying to be active there. Just share it and I'll try to help; I'll be happy to connect and help you figure things out. Great, we'll put your LinkedIn handle in the show notes for this episode.

(31:21) But thank you so much, Ekaterina; this has been a very insightful chat, and I'm really glad you were on. Thank you so much for joining. Well, thank you for having me, it was a pleasure.

If you liked this post, sign up for Experiment Nation's newsletter to receive more great interviews like this, memes, editorials, and conference sessions in your inbox: https://bit.ly/3HOKCTK
