Harriet Swan 0:00
We all know that humans are very complex and that we can’t predict how they will behave.
Harriet Swan 0:06
So really, what we want to do is use multiple methodologies in tandem. And these methodologies are very complementary in nature.
Harriet Swan 0:24
Hi, I hope you’re enjoying the conference so far. And thank you so much for taking the time to listen in. I’m Harriet Swan. I’m the UX research manager at Conversion, where I lead a growing team of UX researchers working closely with both our experimentation strategy and design teams.
Harriet Swan 0:42
Particularly over the past few years, we at Conversion have really been focused on evolving our product offering to include other methodologies to be used alongside A/B testing, particularly UX research, and we’ve found that we’re able to offer much more value to our clients by taking this mixed methods approach. Today, I’d like to share a little bit about the approach we’ve taken and evolved, as well as a few examples of some of the mixed methods work that we have delivered to our clients.
Harriet Swan 1:09
It’s a particularly exciting time to be speaking, as two experimentation agencies, Conversion (UK-based) and Widerfunnel (North America-based), have recently merged to make a bigger, global Conversion. And what we’re focused on at Conversion is helping businesses make decisions that are rooted in evidence. There are different types of evidence that are valuable in various contexts, and we’ve found that combining evidence types to make a decision is where we really see the biggest impact. The two evidence-generating activities which we at Conversion have found to be particularly complementary in our line of work are A/B testing and UX research. So that’s really what I’ll speak to you about today. There are three questions that I’ll cover. The first: what is mixed methods experimentation? There are different definitions out there, and this is really speaking to how we see it at Conversion. Second, I’ll speak to some of the reasons why a mixed methods approach is more impactful than focusing on just one methodology. And then finally, I’ll bring some of this theory to life by speaking about how we have developed this program, as well as some examples of the application of this work with our clients. The first question is mixed methods experimentation. When we think of the different insights that we can gather about our users, there are two spectrums that we can consider. One, looking at the y-axis here on the chart, is behavioral versus attitudinal: at one end, really understanding what people are doing, and at the other, what people are saying, feeling, and thinking.
The other spectrum here, on the x-axis, is quantitative versus qualitative, with quantitative answering questions such as “how many?” and “how much?”, looking to speak to a representative group of people and often aiming for statistical significance, and qualitative working with a much smaller group of people, not looking for representation, but really trying to dig deep into the whys and the hows. We can then start to plot the different methodologies that we can use to gather these insights on this chart. Obviously, there are many more methodologies out there, but this chart focuses on showcasing the main ones that we at Conversion are using. You can see at the top left you’ve got A/B testing and analytics: very behavioral in nature, looking to understand what people do and, when we make certain changes, how that impacts behavioral metrics; and quantitative in nature, looking at big sample sizes at scale and really having confidence in the results. On the other hand, at the bottom right are methodologies such as diary studies and contextual interviews, where we’re really looking to understand those deeper insights and the user’s context, motivations, and mental models. If we focus just on activities in the top left, we’re getting a really strong sense of what users are doing, and we have very high confidence in that behavior. However, we don’t know the why behind it. We can make educated, informed guesses, but we’re limited in really knowing, from the user’s perspective, why that behavior occurred, and therefore what we should do next. On the other hand, if we’re focusing solely on activities in the bottom right, we’re getting really deep insights about users and their motivations, but we aren’t able to validate those and know whether those attitudes translate into behaviors.
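The talk doesn’t go into the mechanics, but as a rough illustration of what “statistical significance” means on the quantitative side of this chart, a two-proportion z-test is one common way to compare conversion rates between a control and a variant. This is just a sketch with made-up numbers, not anything from the talk:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: control converts 1000/20000, variant 800/20000
z, p = two_proportion_z_test(1000, 20000, 800, 20000)
```

With a drop from 5% to 4% at this sample size the test comes out strongly significant, which is the kind of confidence “at scale” that the top-left quadrant provides.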
Rommil Santiago 4:32
This is Rommil Santiago from Experiment Nation. Every week we share interviews with, and conference sessions by, our favorite conversion rate optimizers from around the world. So if you liked this video, smash that like button and consider subscribing; it helps us a bunch. Now back to the episode.
Harriet Swan 4:45
And we all know that humans are very complex and that we can’t predict how they will behave. So really, what we want to do is use multiple methodologies in tandem, and these methodologies are very complementary in nature: generating insights across these different quadrants, analyzing them in tandem, and developing this broader and deeper understanding. There are many different ways that we can combine them, and here is just one visualization of how we could use them together. So, for example, measuring the impact of a certain change on behavior, while also understanding the experience from a user perspective to identify areas of opportunity and areas of friction and to help explain the A/B test results that we might observe. And then finally, always keeping the deeper audience context in mind whenever we’re interpreting the findings that we see. Moving on to why you should apply a mixed methods approach: there are many benefits to this, but I really want to focus in on three main ways that I feel UX research can bring additional value to an experimentation program. The first is to help us have more confidence in the decisions that we’re making. One data point in isolation can be limiting, but if we’re able to look across different sources to validate insights and see the same thing in different places, as well as having the ability to answer a question from different perspectives, we’re able to have much more confidence in the decision we’re making, and also lower the risk of that decision. The second way that UX research can really bring value is that it allows us to answer many more questions than we could through just conducting A/B tests. So we can answer generative questions, where we’re really understanding the broader motivations, the way that people think about a certain category, and the jobs that they’re looking to do in a certain experience, as well as evaluative questions.
So understanding, and really getting into the shoes of, users to get their point of view on an experience and how it could really be improved for them. And then, lastly, but probably most importantly, we’re able to create more business value. Many people think about experimentation as conversion and shifting metrics, which of course is important, of course is integral. But if we’re able to take a mixed methods approach, and understand not just the what but the why, we’re able to build a lot more depth into our thinking and our understanding. So we’re able to support much more strategic decisions, we’re able to have a much deeper understanding of our customers and the subgroups within them, and we’re also able to take bigger swings, be more innovative, and take bigger risks without the downsides.
Harriet Swan 7:38
So in this last section, I’d like to illustrate some of this theory that I’ve talked through and discuss how a mixed methods program can be started or developed, based on the experience that I’ve had in supporting this evolution at Conversion. Firstly, to incorporate and get buy-in for UX research, often a shift in mindset needs to occur. This can be a challenge, definitely a challenge that I have faced in getting this buy-in, particularly in environments that are very focused on metrics and quantitative methodologies. But it is really critical to ensure that what you’re developing can be as valuable and impactful as it can be. And within this, there are two main shifts that need to occur. The first is shifting from thinking about whether UX research is something that should be conducted to creating a culture where UX research is an ongoing activity. With most of our clients now at Conversion, we’re conducting UX research alongside A/B testing on a monthly basis, and having this cadence allows us to be constantly iterating, constantly innovating, and truly applying both methodologies in tandem to drive improvements. The second shift is from focusing on what can be optimized in an experience and where we should optimize, to taking a step back, and a step out, to start with what our key business questions are, and then aligning the methodology or methodologies that will best help answer those questions. Once there’s been this shift in mindset, the next step is application. There are different ways UX research can be applied, and different sequences between UX research and A/B testing, and choosing the best application of these is again linked to starting with the business decision we want to inform and then aligning the approach we take with that. So I now want to talk through the main contexts where we’re using mixed methods, with a real-life example for each.
So the first context where we might apply UX research is to help explain a result we’ve observed in an A/B test. We might have run a test and seen a result we didn’t expect, or just really want to dig into the deeper why behind the behavior we’ve observed. This is one of my favorite examples to speak to, as for me this is one of the most impactful mixed methods pieces of work that I’ve worked on. The images here are just for illustration, but these are all real-life examples of work we’ve conducted. This is a piece of work that we did for one of our clients, a leading global technology corporation selling hardware and software solutions. So although this visual shows shoes, this is a company really focused on selling laptops. At the beginning of COVID, one of the key stakeholders over there said, you know, people are now working from home, and that’s a new context now, so on our PLP I want to replace this product-focused imagery, which you can see in the control on the left-hand side here, with lifestyle imagery, because that will really resonate. A fair question to ask, and a fair assumption to make, maybe. But luckily, our team and the experimentation team over there got wind of it and said, rather than just make this change, how about we test it out? So we ran a series of A/B tests where we tried different variations of this lifestyle imagery. And lucky that we tested it, because every single time, the variant completely tanked in terms of conversions. But we weren’t really sure why. So we decided to run a UX research study where we split our users into two groups, one going through the control experience and one through the variant, to really minimize any bias there. This isn’t always the case, but this is a very nice example of where the research completely backs the A/B test: the control performed much better for everyone compared to the variant.
And there were a couple of reasons for this. The first was that, now that people were working from home and unable to go into stores, they really wanted that store experience through the website. They actually wanted to get closer to the product, to get a sense of the look and feel, and to mimic that store experience as much as possible. So they didn’t want to see lifestyle imagery where the product is much smaller and they can’t really get a sense of what it would be like in their hands. And the second was that the type of lifestyle imagery being used was not resonating with their work-from-home context: more stock, beautiful marketing images, really not resonating with the many people who were actually just working in a messy corner of their bedroom. So really hearing the whys that explained the result we saw in the A/B test then helped us know what we could do next. Following this, we were able to really focus in on the product and showcase it even more, and even closer, so we really pivoted in terms of our approach. The second context is where we can use these methodologies in the opposite direction: starting with UX research to inform test hypotheses, and then A/B testing. There are many examples that could be given here, and the kind of example I’m sharing here is, you know, if you’ve got a PDP, obviously there are so many different tests, so many different hypotheses that you can have on a page like that. We can of course use our expertise, and we can use the different frameworks we might have, to help us prioritize which hypotheses we think are actually likely to have the best result. But really having user feedback, and having users go through the PDP experience to really understand what is most important to them and what are the biggest questions that they need a PDP to answer, can help us really focus in on what we might want to prioritize and where we might have the biggest results.
The third context is really helping inform a big strategic decision. And increasingly, we’re actually using these two methodologies to gather evidence at the same time, so we’re able to analyze both together and look at both the quant and the qual insights in tandem to really inform the decision. The example I want to speak to here is a global kitchen and household appliance manufacturer that we work with. They wanted to change their search functionality, and they’d been working with three different potential vendors who had different ways of building this search functionality. Rather than just pick a vendor based on, you know, opinion or credentials, they wanted to test out the different experiences and see which one actually works best for users. So each of these experiences was built out by the vendors, and we ran an A/B test to understand which experience performed better in terms of behavioral metrics. But at the same time, we also ran a UX research study where we had users go through the different experiences, to help explain the A/B test result that we observed and to also identify broader directional and optimization decisions that the client could take regardless of which direction they went in. Finally, using UX research earlier on in the design process to iterate and improve is another context where we find it really valuable. So before we’re putting in the time and the resources to develop an experience, we use user research as a way of getting that design as far along as possible. An example of this is a company we work with who offer storage solutions, with one of their product offerings being garage storage. They wanted to create a wizard on their site, something that didn’t already exist, that would ask users a few different questions to capture what they were looking for and what their garage was like, so that they could then offer customized recommendations.
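The vendor comparison described above is essentially an A/B test with three arms rather than two. As a sketch of how one might check whether three variants’ conversion rates differ at all, here is a chi-square test of independence; the function and the numbers are illustrative, not from the talk:

```python
import math

def chi_square_conversions(results):
    """Chi-square test of independence for k variants vs. converted / not.

    `results` is a list of (conversions, visitors) tuples, one per variant.
    """
    total_conv = sum(c for c, n in results)
    total_n = sum(n for c, n in results)
    overall = total_conv / total_n                    # pooled rate under H0
    chi2 = 0.0
    for conv, n in results:
        for observed, expected in ((conv, n * overall),
                                   (n - conv, n * (1 - overall))):
            chi2 += (observed - expected) ** 2 / expected
    df = len(results) - 1                             # (k-1) * (2-1) d.o.f.
    # for df = 2 (i.e. three variants) the chi-square survival
    # function has the closed form exp(-x/2)
    p_value = math.exp(-chi2 / 2) if df == 2 else None
    return chi2, p_value

# Hypothetical example: three search vendors with different conversion counts
chi2, p = chi_square_conversions([(500, 10000), (550, 10000), (600, 10000)])
```

A small p-value here only says the three experiences differ; as the talk notes, the paired UX research is what explains why, and which direction to take.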
So they asked us to take this on, and we designed a couple of different versions of what this wizard could look like, at a very low fidelity, as you can see from the images. We were then able to put these different versions in front of users to understand the best way of asking certain questions. For example, when people are thinking about the layout of their garage, are they thinking about it in terms of shape? Are they thinking about it in terms of how many cars can fit? This really helped us ensure that the questions we were asking made sense, were clear, and were also relevant to users. So we were able to iterate on those designs before they were then moved forward, designed out, and put on the live site. To summarize, I’ve tried to answer three questions today, the first being: what is mixed methods experimentation? This is really the application and combination of different types of evidence-generating activities to inform decision making. In terms of why we feel you should take this mixed methods approach: well, it allows us to make much more confident decisions and to de-risk those decisions; it allows us to answer a much broader range of questions than we can through just taking an A/B testing approach; and it also creates much more value for the business, really building a much deeper level of thinking and understanding. And then finally, in terms of the how-to, this really starts with a shift in mindset: a shift from thinking “should I be researching?” to “what should I be researching?”
Harriet Swan 17:26
And then also a shift from focusing on optimization and where to optimize, to really starting with the business decision and then aligning the best method or methodologies that will best help you answer that decision. And there are, as I’ve spoken about, different applications of that: aligning how we might sequence those different methodologies, and which ones to really focus in on, to best give the evidence that you need to make that decision. So thank you so much for listening. I would really love to hear any thoughts or questions that you might have, whether you’re thinking about perhaps incorporating UX research into your experimentation program, or you’re already taking a mixed methods approach and are open to discussing different ways that we can apply it. So please do reach out; we’d love to hear from anyone. And enjoy the rest of the conference.
If you liked this post, sign up for Experiment Nation’s newsletter to receive more great interviews like this, memes, editorials, and conference sessions in your inbox: https://bit.ly/3HOKCTK