AI-Generated Summary
- The hero section at the top of a website is crucial, as most users don’t scroll beyond it. It must convey a clear and compelling message with the value proposition to capture the user effectively.
- Urgency and scarcity tactics, such as countdown timers, can be powerful motivators for conversion, but users have become more skeptical over time due to their overuse and manipulation in online marketing.
- Integrating heat maps, user recordings, and user polls at specific key segments of the user journey can provide a comprehensive and data-driven understanding of user behavior, bridging the gap between qualitative and quantitative insights for effective optimization.
- When optimizing websites, it’s crucial to combine both qualitative and quantitative data. Heat mapping, analytics, and other insights help paint a complete picture of user behavior and preferences, allowing for more effective website improvements.
- When analyzing heat mapping data and conducting A/B tests, it’s important to consider the sample size of sessions recorded, ensuring it’s large enough to draw meaningful insights. Have a clear hypothesis in mind before diving into the data, and avoid overwhelming users with too many clickable elements on a page. The heat mapping data may not always pinpoint clicks accurately, especially on mobile devices, so it’s important to interpret it with some caution and not make hasty assumptions based on every click recorded.
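The sample-size caution in the last point can be made concrete. Below is a minimal sketch, in Python, of a two-proportion z-test, a standard way to check whether an observed lift between control and variation could plausibly be noise. All figures are hypothetical, not taken from the episode:

```python
from statistics import NormalDist

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing variation B against control A.

    Returns (relative_lift, two_tailed_p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))        # two-tailed
    return (p_b - p_a) / p_a, p_value

# The same 50% relative lift, at two very different sample sizes:
small = ab_test_significance(10, 200, 15, 200)          # p > 0.05: could be noise
large = ab_test_significance(500, 10_000, 750, 10_000)  # p << 0.01: meaningful
```

With 200 sessions per arm, the "win" is indistinguishable from chance; with 10,000 per arm, the identical lift is overwhelmingly significant — which is why session counts matter before trusting a heat map or test readout.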
Video
Audio
Listen to the episode on Spotify here.
AI-Generated Transcript
Deborah O’Malley 0:00
For the most part, the hero section, the top section of the website before a new user has to scroll down, the above-the-fold section.
Most users do not go beyond the hero section on the homepage. So the hero section is absolutely your prime piece of real estate to capture the user. And as a result, the message has to be salient, it has to be very clear, it has to be compelling. It has to contain your value proposition, it has to explain what your product or service is and what its benefit is. And that’s an awful lot to do in a hero section. But most companies do not do it very well. They assume that users know what the product or service is, and they offer no background whatsoever on what the product or service is. And they have some cutesy phrase that, you know, whatever it is, doesn’t mean anything to the user. And then they expect the user to convert, and wonder why conversions are lagging.
Richard Joe 0:59
Hey folks, it’s Richard here from the Experiment Nation podcast. Today I’ve got a special guest. It’s Deborah O’Malley from GuessTheTest,
and Convert Experts. And she’s an adjunct professor at Queen’s University, with a master’s from Queen’s University specializing in eye tracking technology and behavioral psychology. So she’s got really deep insights that can help us out with qualitative data. And yeah, it’s really good to have her on the show. Hi, Deborah. Hi, there.
Deborah O’Malley 1:34
Yeah, it’s great to be on the show. Thanks so much for having me.
Richard Joe 1:37
No worries. So yeah, you’ve got an interesting background, and in particular, I’ve seen your segments on heat maps and recordings. So your background and knowledge in eye tracking and the human psychology of, you know, how people interact with websites, and how that plays into things like heat maps and recordings, and maybe even remote user tests. I think that’s a definite advantage, and I think our audience would really enjoy listening to, you know, your background experiences. So, could you tell our audience a bit more about your academic background? And, you know, why you chose your degree, how you got into that, and how you found the link from finishing your degree to getting into CRO and so forth?
Deborah O’Malley 2:27
Sure. Yeah, absolutely. So my story actually stems from a long time ago, back when I was in grade school. It was in about grade four, and we had to choose a science experiment. And we talked all about a hypothesis and formulating a hypothesis. And then we had to choose a topic to settle on. And while other kids were choosing, you know, what kind of soil is most absorbent, or what kind of diaper absorbs the most apple juice, I was really interested in seeing what people saw first. I cut out different colors and different shapes of construction paper and posted them on a board. And I would ask people, I’d say, what do you see first? That’s what I was interested in learning about. And fast forward many years later, and a very circuitous route, I ended up pursuing my master’s with a specialization in eye tracking technology, which is not all that far from heat maps. And I did two pretty major projects in eye tracking technology, where I looked at what people saw first in advertisements. They were print ads, but they were on a digital screen. And I found really interesting findings. I had one project that looked at: do people recall positive or negative messages more, and what do people pay more attention to? And I actually found that people initially pay more attention to negative messages. And then as they start to cognitively process that negative message, they sort of reject it, and they don’t actually recall it as well as a positive message. So positive messages, or, I used a theory called message framing, gain-framed messages, are actually better for getting people to recall and remember the message later. So that was one really interesting finding. And I also looked at: do people pay more attention to a text-only ad, an image-plus-text ad, like what we see in most ads, or an image-only ad? And interestingly, and this might shape, you know, marketers’ advertising strategies in the future: people actually pay more attention to a text-only ad.
They take more time to read it, and they recall it better. So despite all those ads that we see out there that are text and image, text-only actually may be the better way to go, if that text message really resonates with people. So that was my master’s. When I graduated from my master’s, I thought for sure I would go out into the world and bring eye tracking to the world. But I live in Canada. And I live in Ottawa, Canada. For those of you who don’t know, it’s a government town. It’s not very progressive. It’s very conservative. Very conservative. Yeah. And technology is about 20 years behind the rest of the world. Yeah. So trying to bring eye tracking technology to Ottawa was a very tough sell. People were like, what’s eye tracking? Do I need this? Are they spying
Richard Joe 5:24
on us?
Deborah O’Malley 5:25
Exactly. Yeah. So it was very challenging. But what ended up happening is I got a job doing user experience testing with a local firm that serves government clients, Government of Canada clients. And so I got really great experience learning about how users use websites. And it was very related to eye tracking technology, just a different facet of it. As part of a training exercise for the user experience company, one of the guys told me, he said, there’s this site out there, it’s called WhichTestWon, it’s a site on A/B testing, go and subscribe to it. That’s part of your training exercise: follow the weekly tests. And so I subscribed to WhichTestWon. And I was like, oh wow, this is really cool. Weekly A/B tests, and you get to guess the test. And I followed it for many, many years. And eventually they had a job posting; they were looking for a content writer and a content marketer. And it fit my skill set perfectly. So I applied, I ended up getting the job, and ended up getting on with WhichTestWon. WhichTestWon, if you haven’t heard of it, is sort of like the precursor to the company I founded, GuessTheTest. And it was full of hundreds of A/B test case studies, looking at what worked with A/B testing. It was very successful; it was acquired by a company out of New York, rebranded, and unfortunately it just didn’t work out. And they ended up dissolving the company completely, and taking down hundreds of case studies that I had written, and access to case studies dating back to 2009. So there was this incredible history of case study knowledge, and having been a key content creator, people were coming to me and saying, Deborah, what happened? I relied on these case studies for my teaching material or for validating test ideas. What am I gonna do now? And I thought, okay, well, with this gone, this is my chance to independently recreate WhichTestWon and make it even better than it was before. So I independently created GuessTheTest.
And it’s been going strong ever since, since 2018. So about five years now. And it’s a similar model to WhichTestWon, but over time I’ve developed it to really make it my own. I’ve added in a test trustworthiness section, which was always a criticism of WhichTestWon, because people would say, these lifts are totally exaggerated, how can I believe anything? And so with each test, I evaluate: is this lift actually something that you can rely on? Is it trustworthy? And why or why not? And then there’s also a sort of in-depth analysis that examines: what about this test worked, and why did it work? And how can you apply this to your own testing practice? There’s also tons of articles and resources as well.
Richard Joe 8:19
And I’ve browsed the site. It’s very, it’s pretty cool. I like the interactive element of it. I think that adds kind of a gamified approach to, you know, CRO and A/B testing. Because, you know, you see these case studies, and if you just read about them, you’re like, okay, whatever. But adding interactive elements, it’s like, okay, maybe I’ll vote for that test, because, you know, based on this heuristic, or based on that psychological principle. And sometimes, even on some of the tests that you’ve featured, you know, we bring our own biases into tests; it’s impossible to be purely objective. But yeah, even I’ve been kind of dumbfounded by some of the results of tests. And some of the results
Deborah O’Malley 9:11
are very surprising. And that’s exactly why we test is you never really do know what’s going to win until you test.
Richard Joe 9:18
Mm hmm. So it’s very, it’s very humbling. But yeah, about that, I think you said that you added an element where there’s a probability aspect, or a concept of a confidence interval.
Deborah O’Malley 9:34
Yeah. So I call it the test trustworthiness section. So I evaluate: how trustworthy is this test result? So for example, most of the case studies are submitted to me by companies, and let’s say a company reports a 237% uplift. Is that actually,
Richard Joe 9:53
I’m always, when I read these reports, like, you know, this brand earned $300 million, I’m always dubious about these. Well,
Deborah O’Malley 10:06
Basically, what I’ve learned over time is the bigger and the more wow-factor the test result is, the less trustworthy it actually is. So anything that looks unusual and surprising probably is, and therefore can’t really be trusted. That’s Twyman’s law.
Richard Joe 10:25
Twyman’s law, yeah. I was gonna say, I think it was Ronny Kohavi, I’ve heard him talk about, like, you know, okay, you’ve got to sift through the data, don’t accept any BS. You’ve just got to be, like, very stringent. Even Twyman’s law has taught me, you know, to not fully rely on the testing tool’s results, like Optimizely or whatever. Because, you know, sometimes Optimizely might say, hey, this has reached stat sig, and then when I go into GA, the numbers aren’t always adding up so much. So I’m always, like, sifting through things and segmenting data, you know, between mobile and desktop, and so forth. Sure. Yeah.
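Richard’s habit of cross-checking the testing tool’s numbers against GA, segmented by device, can be sketched as a simple reconciliation pass. The tool names come from the conversation, but every number and the 5% tolerance below are hypothetical:

```python
# Hypothetical session/conversion counts for one experiment, as reported
# by an A/B testing tool and by Google Analytics, split by device.
tool = {"desktop": {"sessions": 4200, "conversions": 210},
        "mobile":  {"sessions": 5100, "conversions": 180}}
ga   = {"desktop": {"sessions": 4350, "conversions": 205},
        "mobile":  {"sessions": 4700, "conversions": 150}}

def discrepancies(a, b, tolerance=0.05):
    """Flag (segment, metric) pairs where the two sources disagree
    by more than `tolerance` (relative difference)."""
    flagged = {}
    for segment in a:
        for metric in a[segment]:
            x, y = a[segment][metric], b[segment][metric]
            rel_diff = abs(x - y) / max(x, y)
            if rel_diff > tolerance:
                flagged[(segment, metric)] = round(rel_diff, 3)
    return flagged

suspect = discrepancies(tool, ga)  # here, only the mobile numbers don't line up
```

In this made-up example, desktop agrees within tolerance but the mobile counts diverge, which is exactly the kind of segment-level mismatch worth investigating before believing a "stat sig" banner.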
Rommil Santiago 11:12
This is Rommil Santiago from Experiment Nation. Every week we share interviews with, and conference sessions by, our favorite conversion rate optimizers from around the world. So if you like this video, smash that like button and consider subscribing; it helps us a bunch. Now back to the episode.
Richard Joe 11:26
Just want to touch on a point from your background. You talked about the experiment with negative messages. Sorry, the other experiment, where I think there was like a negativity bias aspect to the testing. Is that what you’re alluding to?
Deborah O’Malley 11:46
Yeah. So in my master’s, I specialized in a theoretical framework called message framing. And a message can be framed in a loss-framed or a gain-framed way. So loss-framed is essentially spinning the message in a negative way, and gain-framed is basically putting it in a positive way. And it’s typically used in health behavior prevention or promotion, when you want to get somebody to take action, for example, for cancer screening, or to stop smoking. You can frame it in a gain- or loss-framed way to stress the benefits of taking preventative action, or to talk about the risks of not taking action. It actually stems from behavioral economics. It was examined by Tversky and Kahneman in a gambling and economic context, and people will risk more if they feel like they have more to gain, even though they actually may end up losing more. And so it’s all about mindset and loss aversion, and what people feel they have to gain.
Richard Joe 12:59
In saying that, do you think that when we’re developing tests, say, with testing pricing, we should sort of bias towards loss-aversion-type tests? Where, you know, let’s say it’s an add-to-cart on e-comm, you’ve got those timers that I hate, but you’ve got the timer that’s like 10 minutes or five minutes these days, in the checkout. Would you bias towards testing that?
Deborah O’Malley 13:34
So urgency and scarcity are definitely powerful motivators. They can get people to convert, but they have to be used in a real context. People, I think, have become quite skeptical of, for example, countdown timers, or “hurry, there’s only five items left in stock.” And then they go back to the site the next day, and there are still just five items left in stock.
Richard Joe 13:56
You know what I do? I go incognito. I’m like, nah, you’re telling fibs. I mean, it goes back up to 10. Like, there’s no five items.
Deborah O’Malley 14:08
No. So there’s a lot of scams and manipulation out there. And I think a lot of users have become savvy to that over time. I think these tactics used to work better several years ago, before they were kind of known by the average user. But they still can motivate people, and they’re powerful motivators, because urgency and scarcity are biologically built into us. We have a protective mechanism from when we were, you know, cavemen with lizard brains, to try and find food, or find the berries on the tree and get the last berries, because that’s what we needed in order to eat. And we still have that mindset now; it helps us survive. And so when we see “five items left in stock, hurry, this deal will only last for another 24 hours,” it does, to some extent, you know, trick our brain into going, okay, I have to act.
Richard Joe 14:59
I mean, even for me as an experienced marketer, even though I get a bit tuned out by this stuff, and even if it’s fake, it does still affect me. Because I’m like, okay, there’s a special on, okay, they’ve only got a few items left. You know, because I’m like, okay, maybe they are being truthful and really honest there, maybe they’ve really only got a few items left. And I’m just like, I’ve got to get this discount now, otherwise I’ll miss out. It’s amazing how we’re just hardwired. Maybe it’s that reptilian brain aspect of us to survive, that preservation sort of mindset of, okay, I’ve got to do this to preserve the status quo, or to get what I want, to survive. Just another point on the text versus images. Because I’m thinking in terms of, you know, I’ve experimented with, say, an e-comm website, with Google text ads versus Google Shopping. And I found that I got more conversions at a lower CPA with Google Shopping, and a lower CPC with Google Shopping, than Google text ads, for me, and specifically for e-commerce. So would you say that your research is more generalized, as opposed to specific?
Deborah O’Malley 16:33
Yeah. So it was, yeah, specifically print ads, as opposed to, like, you know, web-based ads or Google ads. And it was basically: what do people attend to more? So “attend to” is attention, what are you paying more attention to. And in the eye tracking research I did, that was measured by eye fixations. So how long somebody is staring or looking at the screen and actually concentrating on the screen. The eye fixations are a metric of dwell time, so how long somebody’s dwelling or looking at the screen. And then I also looked at attention and recall of the advertisement, so how much you were able to remember once you saw it. So it’s pretty well known that you can process images much quicker than you can text. Text requires, you know, you have to process it visually and then internalize it to understand it; there’s cognition that has to take place, and that’s slower than just being able to immediately see something. So it’s not surprising that people would pay more attention to text, because you need to do that in order to cognitively process it. And in order to recall it, you actually need to have it go through your brain and process it. And people processed the text-based ads more because they were attending to them more, they were paying attention to them more.
Richard Joe 18:00
So would you say, you know, looking through that kind of lens, you know, kind of Kahneman’s principles: is text more System 2 based, and are images more System 1 based? Because, yes, yeah, I mean, with images, you’re just not thinking that much cognitively, generally speaking; it’s more intuitive. Whereas text, you’ve actually got to read the damn thing and understand it. I’m just saying, because you did these tests on print, would you say, to follow up in our digital age, that if you had your time again, you would follow it up in a digital context?
Deborah O’Malley 18:45
I think that’d be really interesting. Yeah. I think there’s so much that you can do with eye tracking; eye tracking never really fully caught on. So I’m dating myself now, but when I did eye tracking, it was in 2007, when I did my master’s, so quite a while ago now. And machines had evolved; several years beforehand, you actually needed to rest your head in a chin strap and sit there as still as you possibly could and look at the ad on the screen. When I did it, it was a standalone screen that sent an infrared signal into your pupil, which then bounced off and was able to map the x-y coordinate on the screen to see where the eye was looking. But you still had to be very still; you couldn’t move your head around.
Richard Joe 19:35
Out of interest, my wife has done one of those, you know, you can sign up for these research companies, and they give you a voucher if you come into their studio and do research for whatever. I think she did one. She was looking at something on the screen, I don’t know what it was, but she was like, okay, how well do they know where my eyeball is, up here or down here? So just to clarify, you’re saying that there’s some laser?
Deborah O’Malley 20:09
Yeah, an infrared signal. At least that was, you know, the technology I was using; it was a Tobii eye tracking machine. It may have evolved since then, but I think it’s still fairly similar. And there definitely still are eye trackers out there, and devices like the Microsoft Kinect, I think it is. And Tobii still exists as a company, and they have these very sleek eye trackers now; they’re about the size of a ruler. And they’re essentially a webcam, much less expensive than the $25,000 screens that I was using. Yeah, but I believe, don’t quote me on this, but I believe the technology is still the same. So it’s an infrared signal that is sent out from the camera, bounces off your eye, and back to the camera, where the x-y coordinate on the screen is mapped. So you can tell, okay, pixel, you know, 863, right there in the middle of the screen, that’s the x-y coordinate where it is, and that’s where the person is looking. And eye tracking, you know, has evolved even more; there’s now face tracking, there’s emotion tracking, to see how people are looking and feeling and expressing, emoting, with their faces. So it’s gone in a really interesting direction. But from an experimentation industry perspective, what really stemmed out of eye tracking that I think is really fascinating is heat mapping. There’s all sorts of heat mapping tools now. And I find it invaluable in doing conversion rate optimisation, A/B testing, and establishing really data-driven hypotheses. Because while you can look at analytics data and form a sense of what users are doing and how they’re behaving on your site, you don’t really know, because you don’t have that visual picture. And I’m biased, I’m very visually oriented. But as soon as I see heat mapping data, I go, oh, okay, that’s exactly what’s happening there. That button is being clicked, or not clicked; people are scrolling down the page, or they’re not scrolling down the page.
And you can really start to paint a very detailed picture of how users are behaving and what they’re doing, or potentially not doing, on the site, and what needs to be optimized as a result.
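Under the hood, the click heat maps Deborah describes work much like the eye-tracking setup: raw x-y coordinates are aggregated into cells so hot (and cold) spots stand out. A minimal sketch, with made-up click data and page dimensions:

```python
# Hypothetical click coordinates (x, y) in page pixels from session recordings.
clicks = [(120, 80), (130, 85), (640, 90), (125, 400), (128, 82), (620, 95)]
PAGE_W, PAGE_H = 1280, 800
GRID = 8  # divide the page into an 8x8 grid of cells

def click_heatmap(points, width, height, grid):
    """Count clicks per grid cell; (row, col) keys, counts as values."""
    cells = {}
    for x, y in points:
        col = min(int(x / width * grid), grid - 1)
        row = min(int(y / height * grid), grid - 1)
        cells[(row, col)] = cells.get((row, col), 0) + 1
    return cells

hot = click_heatmap(clicks, PAGE_W, PAGE_H, GRID)
hottest = max(hot, key=hot.get)  # here, the top-left cell dominates
```

A real tool would render these counts as colors, but the aggregation step — and the equally telling empty cells, the things nobody clicks — is the whole idea.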
Richard Joe 22:21
What are the kind of pros and cons of heat maps, you know, done remotely at home, versus in-person testing?
Deborah O’Malley 22:31
I think the pros and cons are pretty obvious. In a laboratory setting, people don’t necessarily act how they would in real life, when they’re actually browsing. So you don’t get a very true sense of how users are actually behaving. Whereas when they’re in their natural environment, on a desktop screen, or even on a mobile screen, that’s more likely how they really behave, because they don’t feel like they’re being watched. So as much as possible, a natural environment is definitely the way to go. People definitely change what they say and what they do as soon as they feel like they’re being monitored. And I know that firsthand, from doing user experience studies. I would run these Government of Canada user experience tests. Have you ever been on a Government of Canada site? They’re atrocious. Yeah, you can’t find any information.
Richard Joe 23:19
websites are pretty crap, you know? Yeah.
Deborah O’Malley 23:23
Yeah. And these are, you know, important services that are offered. Like, the Canada Revenue Agency deals with taxes, and Employment Insurance deals with getting money if you’ve been laid off from your job. These are services that people need, and they need to be able to navigate the website, but they’re just buried in bureaucracy, basically. So I would run user experience tests. First of all, I should back up and say their job would be to complete a top task, which would be to find a certain aspect on the site or navigate to an important page, that kind of thing. Most of the time, people would not be able to successfully complete the task, at least not within the allotted amount of time. And if they did, they would really kind of stumble around and, you know, not get to the right place very efficiently. And at the end of the study, I would often ask people, can you give me any feedback? What did you think? And people would say, oh, it was great, the site was really easy to navigate, it was very clear. And it wasn’t, because that’s just what they thought I wanted to hear. Yeah, yeah. And I really learned from that that people do not say what they mean. There’s a top UX researcher, his name is Gerry McGovern, and he gave a really, I think, precious example of that. He worked with an airline company, and the airline company asked people, what do you want on this airline? What would make you happy? And people said, well, we want fresh fruits and vegetables, and we want to eat healthy. And so the airline spent all this money acquiring fresh fruits and vegetables and trying to keep them fresh on an airplane. And every time, people would gravitate towards the candy and the chocolate bars and all the unhealthy food, and they’d have all these extra fruits and vegetables sitting around rotting. And the airline went, oh, what’s going on here? People told us they want fresh fruits and vegetables.
And yet nobody’s eating them. They’re all eating the junk food. What is happening? Yeah, and they realized people say one thing and they act another way. And I think that’s a really important UX principle to keep in mind. And that’s why, in my opinion, user experience testing is not nearly as powerful as A/B testing, or even heat mapping, where a researcher doesn’t get in the way. It’s an unmediated environment, and users are behaving naturally, as they would without any prompting, or without anything swaying the user in one specific direction. So when users do something, they more often tend to mean it, because they’re not performing for an individual; they’re behaving in a way that is in line with how they actually feel.
Richard Joe 26:11
Not to labor the point, but what’s the main advantage of in-person versus heat maps? Because in-person is still done by, you know, firms that can afford it. So yeah,
Deborah O’Malley 26:20
it’s still a valuable exercise. I think the main advantage, at least in my experience, is that you can really target and pinpoint questions to the user, and you can observe their browsing journey or completion of top tasks. And then you can ask them, okay, well, what did you find confusing there? Why did you end up going there? And you can do it in real time, and get really valuable feedback. And just forming that personal connection with your user, you start to understand your audience a little bit more. You know the user is part of your cohort or demographic, and you get a sense of who your users are, and why they’re behaving the way they do, in a way that’s much less anonymous than if it was, you know, just in a removed setting.
Richard Joe 27:07
Let’s just say you’ve got this cohort that needs to be, you know, female, between 20 and 40, and blah, blah, blah. And you can talk and have a conversation with them in a very in-depth way, and really understand those user personas a lot better than heat maps, which obviously don’t segment by personas.
Deborah O’Malley 27:26
Right, so it’s really the voice of the customer. You’re actually literally speaking with the customer, and you’re hearing from them, you’re getting their personal experiences. People will often start to tell you their stories, or, you know, what motivates them in certain ways. And you can really dig in and have a conversation and establish a personal relationship with, typically, a smaller subset of people; it’s a much smaller group. So it becomes a more qualitative exercise. But that can be really valuable in getting to know your audience and establishing that relationship with them, which you can’t really do in an anonymous way, when it’s larger scale and removed from a one-on-one setting.
Richard Joe 28:07
Would you say that the way to get around that downside of heat maps, where it’s very anonymous, is to integrate heat maps and recordings with user polls at specific key segments of the user journey?
Deborah O’Malley 28:21
Yeah, I think, you know, in my experience and opinion, the more data points that you can have and blend together, the better. So a lot of people sit in, you know, a CRO camp or a UX camp, or a qualitative camp or a quantitative camp. I think if you can blend those two, you can get really, really powerful insights and form a much more informed, data-driven hypothesis. An interesting example: this is a real-life client that I worked with. They came to me and said, Deborah, our site is not converting well, help us. I looked at the heat mapping data, and what I saw is people just weren’t clicking. It was a landing page, and the call to action was a free trial. People were not clicking on any of the free trial buttons. And I said, okay, well, I think we need to redesign the page here, bring out the value proposition in the copy. The heat mapping data indicated people were not scrolling; engagement was low. And we redesigned the site, and the client said, well, things still aren’t converting as well as we wanted, what’s going on here? And the answer was, I don’t know. I’ve applied all the best practices that I possibly can. Let’s now use a customer exit poll and find out from customers why they’re not converting. Yeah, that probably should have been done first; it’s easier. But, you know, at least we were able to then combine the heat mapping insights with CRO best practices, and now exit poll technologies. And what we found out is people just didn’t feel ready to start a free trial. It was too high stakes for them; even though it was a free trial, people just didn’t feel ready to take that step. Yeah. So there was more kind of nurturing that needed to be done, and more reassurance that needed to be given, before people were ready to convert. And so that was really valuable. And that’s something that analytics data didn’t tell us; heat mapping didn’t tell us.
It wasn’t until we actually spoke with the customer, and found out from them what was going on, that we were able to apply all those insights together and kind of crack the code, and then improve conversion rates from there. Yeah, that’s awesome.
Richard Joe 30:39
So I think the lesson here is, don’t just rely on purely quant or qualitative data only, we’ve got to kind of
Deborah O’Malley 30:47
merge the two. And different pieces of the data tell you different stories. Another really interesting example. And this is a real life test that’s featured on guests the test. It was submitted by McClatchy media, which is a large media company. They started off they said, Okay, we run a lot of classified ads, paid ads in newspapers. But classifieds just seems like such an antiquated word. People don’t really use classified anymore. Maybe we should change it to buy and sell, maybe that will be more meaningful for people. And so they started off doing user experience testing. And they said, which would you prefer? Which would you navigate to on the website, and they asked people, the large sample of people, and they said, just buy and sell mean more to you or just classified me and my team, and user experience testing revealed that buy and sell was actually the better way to go. Yeah, so the company said, okay, great, we have that user experience point of view. We also know that users say one thing and act another way. Yeah, let’s now AB test this and actually validate this insight with a B testing. And when they ran the A B test, what they found was actually the exact opposite, classified converted much better, because that’s what people are used to in a newspaper context. And so it was only one single word that they changed in this case in their top nav, they changed it from buy and sell to classify it and classified, I’d have to look at exactly what the conversion rate was, but converted much better at a statistically significant effect. So you can do different methodologies, and they don’t always add up. The choice then is, is the AP test, right? Or is the user experience test, right? I’m gonna say go with the quantitative data, go with the AP test, because as long as it’s run properly, and you’ve done statistically significant results, with a large data point like that, the data probably is going to be more accurate. 
But it shows the importance of blending modalities. If you just go with one single modality, you have a sort of myopic perspective, and you’re not necessarily tuning in to the broad swath of users; you’re just getting a small sample who potentially say one thing and act another way.
Richard Joe 33:05
Just getting back to heat maps: how do they help us develop qualitative data to guide our hypotheses in line with quant data? This is probably a good question for beginners.
Deborah O’Malley 33:23
Yeah, so for me, heat maps are absolutely essential. I will not audit a client site without heat mapping data. So anytime I talk with a client, I say, you probably already have analytics data — most people have Google Analytics installed — but before I start working with you, we need to install heat mapping. And nobody has an excuse anymore, because with Microsoft Clarity, it’s free. Hotjar is not too expensive, Lucky Orange is pretty minimal in terms of cost, and then there are other options like Crazy Egg — a whole bunch of heat mapping options. So that’s step number one. From there, I have a kind of data-driven process that I’ve developed over time, but to answer your question most specifically: I use the heat mapping data to look at both desktop and mobile. I look at heat mapping, scroll mapping, and session replays. And I use it to paint a picture of how users are behaving on the site — in particular, what they’re clicking, or what they’re not clicking; I think what they’re not clicking is just as important. Looking at top nav engagement, looking at key CTAs, looking at: are people clicking through reviews or scrolling through reviews? Are they engaging? Are they clicking videos or not? Are they getting further down the page or not? And over and over again, through analyzing at this point hundreds of sites, I’ve seen very specific trends. For the most part, the hero section — the top section of the website, before the user has to scroll down; the above-the-fold section — most users do not go beyond the hero section on the homepage. So the hero section is absolutely your prime piece of real estate to capture the user. And as a result, the message has to be salient, it has to be very clear, it has to be compelling, it has to contain your value proposition, it has to explain what your product or service is and its benefit. And that’s an awful lot to do in a hero section. But most companies do not do it very well.
They assume that users know what the product or service is, and they offer no background whatsoever on what the product or service is. And they have some cutesy phrase that doesn’t mean anything to the user. And then they expect the user to convert and wonder why conversions are low. So social proof is key in that section; you absolutely need to infuse a sense of credibility into the hero section, so that when the user lands on the site, they feel like this is a product or service that can be trusted. A very clear value proposition, and a clear, strong call-to-action button.
Richard Joe 36:23
You’re quite adamant about putting a CTA button in the above-the-fold region — most websites differ on that. Is that based on
Deborah O’Malley 36:31
evidence. So I’ve anecdotally seen people refer to case studies where there is no CTA in the hero section, and they say, oh, it doesn’t matter.
Richard Joe 36:45
There are up-to-date studies too — like, was it Neil Patel? — saying maybe it’s overrated, and users will scroll down, and if they’re engaged they’ll scroll down and then hit the CTA, wherever it is. I can’t remember if it was Neil Patel or someone else. But yeah, I’ve heard it debated.
Deborah O’Malley 37:01
Yeah, it’s definitely debated. My theory, my line of thinking, is: maybe the user will scroll. If you’re one of those few lucky websites where your users do scroll, fine. But if you’re like the majority of sites that I see, where users aren’t scrolling below the fold, then don’t waste that precious real estate by not giving a key call to action. Now, with that, make sure the call to action is the right one. Because your objective as a website owner might be to get the person to buy something or sign up for a demo — whatever your key conversion objective is — but the user may not be ready, at that entry point, at that early funnel stage, to take that high-friction action. So it may be that you’re better served to bring them deeper into the funnel: get them to explore, get them to engage. So exploring the right call to action is paramount. But I think it’s absolutely essential you put a CTA button there; otherwise there’s real estate that is not being utilized properly.
Richard Joe 38:11
No, I agree with you. I mean, to end the debate, I guess you could put several CTAs.
Deborah O’Malley 38:18
I think something that’s actually under-tested right now is the optimal number of CTAs. You see one — that’s common. Sometimes you see a CTA in the hero; that’s fairly common. You often do see two CTAs: a primary CTA, and then sort of a ghost CTA, a secondary CTA. But very, very few websites use three CTAs. And in some of the GuessTheTest case studies I’ve seen, three CTAs, when positioned properly and when offering the right choice to users, actually can perform better than just one or two CTAs. And presumably the reason is because you’re giving users the choice to find what they’re looking for specifically. And so it can be very powerful to do that. Now, choice can be overwhelming, and visitors who have too much choice are going to be confused or hesitate, and then won’t convert. But if you’re offering the right choice to bring people deeper into the funnel, it can be a very powerful strategy. So I think that’s a really interesting and often underutilized test: what is the optimal number of CTAs, as well as which CTAs should we be featuring.
Richard Joe 39:35
That’s a good idea. I’ll add it to my testing suite: V1, V2, V3, and we’ll just run them in parallel. I never really thought about testing the optimal number in that regard. Just on that — would you say this principle still holds across B2C and B2B? Because, you know, obviously B2B has longer sales cycles; maybe it’s what happens on a sales call, that sort of thing.
Deborah O’Malley 40:12
Yeah, so I’ve been fortunate to work across almost every industry vertical — B2C, B2B, D2C, you know, P2P if you will. Every vertical, every sector, I’ve either worked in it myself or I’ve seen GuessTheTest case studies in it. And although every industry and every website is indeed different, and therefore testing is required, there are certain trends and patterns that persist, because people at the end of the day are people. They want to be talked to like they’re humans and treated like they’re humans. And we all have the same cognitive biases, and we all have the same instincts and that kind of thing. And so those very primary and basic principles can be appealed to, and we can be persuaded as a result. So, yes, you need to tune into your audience; you need to know your audience and know what they’ll respond to. And that’s where, you know, B2B or B2C or whatever it is comes into play. But trends — I have seen it, so I can say this with confidence — trends do persist across sites and across verticals. Any other pointers? Yeah, so the process that I use, that I’ve developed over time and that works really well with clients, is I actually start off by looking at the analytics data. I get a sense of where the traffic is coming from; I try and form a picture of who the audience is. The exercise I give myself, looking at the analytics data, is: can I find even just a stock photo that identifies who the audience is in terms of age, gender, ethnicity, that kind of thing? And I take information from, you know, what browser, what device type are people coming from? Are they on iPhones? Back in the day, were they on Internet Explorer? Those types of things will give you good clues as to who the audience is. And then from there: what are the most popular pages? Where is traffic primarily coming to? Where is it exiting? Where is it bouncing?
What traffic sources and channels are users coming from? How are they interacting on those pages? And once I have that formulated in my head, and I have a sense of the audience and of how they’re behaving, I go, okay, now I understand the audience, now I understand where they’re going on the site. Let’s look at those pages and see what they’re doing on the site. And I’ll look at the heat map, and then the scroll map, and sometimes the session replays, to go: what are they clicking on? What are they not clicking on? How far down the page are they getting? What are they paying attention to? What seems to be important to them? And trends tend to jump out pretty quickly, maybe in part because I’m pretty adept at reading the mapping data, given my tracking background. But things just jump out, and I can go, okay, that’s exactly what’s happening here. People are clicking on the products in the nav, but they’re not scrolling down to see the products, because they’re too far down the page, for example. Or people don’t have a good sense of what the product is, because in the hero section it’s not well described, so people aren’t clicking to shop, because they don’t know what they’re shopping for — those kinds of things. So you still have to bring a little bit of conjecture and assumption into what people are doing. But the heat mapping, combined with the analytics data, starts to create a really good picture of how users are behaving and why they’re behaving that way. And then from there, you can start to form a hypothesis. In the experimentation talk that I gave last year, I talked about what a hypothesis is and the formula for a smart hypothesis. A hypothesis is more than just an educated guess; it’s a statement that you basically try to validate. And you really have to identify who you’re targeting, as well as what conversion objective you’re trying to measure out of it. You really can’t run a good A/B test unless you identify those points.
Richard Joe 44:43
That’s good. So you’re doing the quant data in GA — are you using it to also prioritize where in the funnel you’re going to focus your hypotheses and the resulting tests?
Deborah O’Malley 45:04
Yeah, so I look at a few different combinations, but typically I look at the top 10 most-viewed pages. I look at views by actual traffic or sessions on the page, exits, bounces, and conversions. So if you have, for example, a high-traffic page with very low conversions, that’s an opportunity right there that needs to be identified: why is traffic high, but nobody’s converting on this page? What’s going on? And so then, typically what I’ll do for clients is give them a top 10 recommendations report. Based on the pages that I identify as high priority from a conversion standpoint, here are the top 10 things you can do to test and optimize those pages. And so you look for clues like high traffic but low conversion rate, very high exit or bounce rate — and also high traffic and high converting, because if you can nudge the conversion rate even higher on those pages, you can bring in more revenue. So yeah, if you want to categorize it: the most popular pages are the pages that have the highest conversion value, in one sense or another, because the opportunity is greatest on those pages. Generally, if it’s an e-commerce or Shopify site, most Shopify sites tend to be relatively small in that they have the same structure: a homepage, a collection page, a PDP page, a cart page or cart drawer, a checkout page — which can easily be modified — and then a thank-you page. And so, you know, it’s pretty easy to pinpoint those pages and to start to be like, okay, you only have six pages on your site; let’s optimize all six of those pages. If it’s a large SaaS or B2B site, sometimes they have tens, twenties, thirty pages, and it starts to get harder to focus on exactly where you want to start and what you want to optimize. But you can look at traffic and conversions and use that to hone in on where you’re going to focus your energy for conversions.
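As an illustration of that prioritization logic — with entirely hypothetical page names and numbers, not any client’s data — one way to rank pages is by the extra conversions each would gain if it converted at the site average, which naturally surfaces the high-traffic, low-converting pages:

```python
# Hypothetical GA-style stats: sessions and conversions per page
pages = {
    "/home":       {"sessions": 50_000, "conversions": 250},
    "/collection": {"sessions": 30_000, "conversions": 900},
    "/pdp":        {"sessions": 20_000, "conversions": 1_000},
    "/cart":       {"sessions": 8_000,  "conversions": 1_600},
}

# Site-wide average conversion rate as the benchmark
site_cr = (sum(p["conversions"] for p in pages.values())
           / sum(p["sessions"] for p in pages.values()))

def opportunity(stats):
    # Extra conversions available if this page converted at the site average
    cr = stats["conversions"] / stats["sessions"]
    return stats["sessions"] * max(site_cr - cr, 0)

ranked = sorted(pages, key=lambda name: opportunity(pages[name]), reverse=True)
print(ranked[0])  # prints: /home — high traffic, low conversion: audit first
```

This is only a sketch of the idea; in practice you would weight in exit and bounce rates and revenue per conversion as Deborah describes, rather than conversion rate alone.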
Richard Joe 47:25
Yeah, just another thing for our audience: what common mistakes do people make when it comes to using these tools — these tools being heat maps? What sort of common mistakes do people make?
Deborah O’Malley 47:42
I think one really important thing to look at is the number of sessions the heat mapping tool has recorded over a certain time period or time span. Most heat mapping tools record and retain data for 30 days, so you have a 30-day window. But if your site is low-traffic, and you only get, let’s say, 100 visitors over those 30 days, well, then you don’t have a very large sample on which you’re making a lot of assumptions. You might see a certain button being clicked, or not clicked, an awful lot — but that’s only 100 people, and that’s not very representative, potentially, of how people may behave over a longer period of time and how most of your traffic actually behaves. So that’s one really important thing to look at, and also to report. If you’re forming a report for a client, I always put the number of sessions and the number of clicks, or percentages of clicks, over the time period reported, because you can really skew things if you report it over one day versus 30 days, for example. That would be the most important thing. Another important factor, as you said, is having that end goal in mind. You can look at the heat mapping data — there are all sorts of bright colors and kind of rainbows, and it’s entertaining to look at from that perspective — but if you don’t know what you’re trying to hone in on, you can get really lost. So you really do have to have a hypothesis that you’re going in with first, and use the heat mapping data to validate — or the opposite of validate, which I guess would be to reject — the hypothesis. And say, oh, no, that’s actually not what’s being indicated here; users are doing this instead. Or: I had a suspicion people were doing that, based on my knowledge of the site, and the analytics data and heat mapping data are indicating exactly that. Bingo.
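To make the small-sample caveat concrete, here is a quick sketch — with the hypothetical 100-visitor scenario from above — of the 95% margin of error on a "share of sessions that clicked" estimate:

```python
from math import sqrt

def click_share_margin(clicks, sessions, z=1.96):
    """95% margin of error on an observed click-share proportion."""
    p = clicks / sessions
    return z * sqrt(p * (1 - p) / sessions)  # normal-approximation interval

# 100 sessions recorded in the 30-day window, 40 of them clicked the hero CTA
moe = click_share_margin(40, 100)
print(f"40% ± {moe:.1%}")  # prints: 40% ± 9.6%
```

An observed 40% click share from 100 sessions could plausibly be anywhere from roughly 30% to 50%, which is exactly why reporting the session count alongside the percentages matters.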
So going in blindly, without a preconception, I think is dangerous, because then you’re just kind of wandering around, looking at lots of pretty colors and pictures; you don’t really have a metric by which you try to chase something down. So that would be the other thing. And then I think session replays can be very dangerous, because they are so time-consuming.
Richard Joe 50:21
I found that it just numbs my mind if I spend, like, two, three hours looking at people clicking.
Deborah O’Malley 50:29
I think what we forget, as optimizers and marketers, is that people are in the real world, right? People are on the metro, people are at home with kids screaming in the background. And so, you know, their mouse is going over here, and then there’s no activity for three minutes, and you’re like, what’s going on, you’re supposed to be looking at this! But there was a diaper they had to change in the background, or whatever it is. So keep that in mind, and don’t get too hung up on the session replays. A lot of the tools have it so that you can scrub through the replay at a higher speed; I think that’s very useful to do. But the other tip I have is that in a lot of the tools you can tag activities the user has taken in the session. And so it’s useful to look at, okay, all users who went to the homepage and then to the PDP page, but didn’t add to cart. If you can break it down by those specific behaviors, you narrow down the number of videos you end up watching. And you can start to say, okay, now I see that all these users who performed — or didn’t perform — this action, maybe not all of them, but a large swath of them, are doing this instead, or not doing this. And now I have a hypothesis. And my hypothesis is that the call to action isn’t clear enough, or there’s not enough trust, or whatever it is.
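The filtering idea described here — narrowing replays to sessions that reached one page but skipped an action — can be sketched like this, with hypothetical session data rather than any particular tool’s export format:

```python
# Hypothetical session event streams; filter to sessions that viewed the
# PDP but never added to cart, so you only watch those replays
sessions = {
    "s1": ["/home", "/pdp", "add_to_cart", "/cart"],
    "s2": ["/home", "/pdp"],
    "s3": ["/pdp", "/pdp"],
    "s4": ["/home"],
}

to_watch = [sid for sid, events in sessions.items()
            if "/pdp" in events and "add_to_cart" not in events]
print(to_watch)  # prints: ['s2', 's3']
```

Most replay tools expose this as saved segments or event filters in the UI; the point is the same either way — watch the handful of sessions that match the behavior in your hypothesis, not the whole recording backlog.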
Richard Joe 52:03
Just one more question, and this one’s about mistakes too. I mean, for me it’s like — how do you know? Let’s say you see all these dead clicks on these USP icons on a product page, or the homepage, or whatnot, and you develop a hypothesis that if we make those icons clickable — to either show a pop-up with information or go to a URL — the user will be more engaged and exit rates would decrease on this page. Do you think that could also be a dangerous thing to
Deborah O’Malley 52:48
do as well? Yeah, definitely. So I have a really interesting example from a real-life client site. They had a page where the end goal was to get somebody to sign up for a free consultation; a free consultation resulted in somebody actually calling you. It was for a health and beauty product, and the page where the signup was had a number of promotional sales listed — get 50% off this, get X percent off that. And the icons were not coupons, and they were not clickable, but they were formatted, from a design perspective, to look like they were clickable. And so what the heat mapping data very clearly showed is: people were clicking on them. And you can form a hypothesis from there: people are clicking on a non-clickable element that looks like a coupon, and likely getting frustrated. The other thing the heat mapping data showed us is that many more people were clicking on what appeared to be the coupons than on the actual form where they were supposed to submit their contact info and get a free consultation. And so that heat mapping data was super valuable in indicating: okay, we need to make this design element not look clickable. Because, number one, people are probably getting frustrated, and secondly — and most importantly — that presumed interaction is creating a distraction from our key performance indicator, which is increasing the number of people clicking to submit for a free consultation. And so I operate with the mindset that people have a limited amount of bandwidth, and they’re going to engage a limited amount of time on your page. Don’t waste that engagement on elements that don’t need to be interactive, or that look clickable but are not. Guide users to click only on key elements on the page that have meaningful interaction. And in some cases that may be a testimonial, or a testimonial slider, placed midway down the page and made clickable and interactive.
So you can go through and look at each testimonial, kind of arrow through. Sometimes that’s valuable because it gives the user a break, especially on a long-form page, to go through and see what people are saying, and, you know, it creates heightened engagement. But if every element on the page is clickable and interactive, then by the time they get to your CTA, you’ve probably created fatigue, and people aren’t going to want to click on your CTA. So limit the number of clickable activities on your page; tie them only to key performance indicators or conversion objectives.
Richard Joe 55:59
Yeah, I guess that’s a bit of an art and a science too, isn’t it? I won’t go into many of the details, obviously, because this is for the company I work for, but it does remind me of a test that we ran where there was a table, and people were clicking on it. I was like, why are they clicking on these elements on this table? I’m not sure — is it high engagement? Do they want to see what’s behind it? But one of the comments other people raised was, yeah, it does actually look clickable, because there’s a bit of shading around each cell, and you think, oh, it’s like a button — okay, we’ll click it. Particularly on mobile — the mobile data in particular showed they were clicking like crazy. And obviously all that clicking is probably draining their energy too, so when it comes down to the CTA, they’re like, ah, forget about it.
Deborah O’Malley 57:07
Yeah, I guess that’s another important point, too. You asked about, you know, sort of pitfalls of heat mapping data. The heat mapping data, number one, is not perfect, and number two, is not always 100% accurate in terms of where exactly on the page people are clicking. Especially with mobile, where people are tapping in a sort of larger section of the page, you can’t always pinpoint exactly — it will often be off by a small margin. And if you think, well, why are people clicking on that part of the page when there’s nothing there at all, those types of things can be ignored. Or sometimes there is a mark there because people are clicking to scroll down the page, or tapping to scroll down the page, even though there’s actually nothing on the page at that particular spot. So take it with a grain of salt; it’s not absolute truth.
Richard Joe 58:11
Yeah, very good point. And just to add to that: I’ve seen taps on mobile where I thought it was an actual user meaning to tap, and my colleague was like, maybe they’re just scrolling. Look, it’s been good having you on the show. We talked a lot about, you know, your background and how your degree led you to the world of CRO and all that. How can people contact you, Deborah?
Deborah O’Malley 58:41
Yeah, sure. So I’m quite active on LinkedIn; anyone can add me on LinkedIn — it’s slash Deborah O’Malley. I’m on Experiment Nation, and I’m one of the consultants available there, so people can get in touch that way. And anyone is welcome to go to GuessTheTest — guessthetest.com — and check it out, as well as my consulting site, Convert Experts. And if anyone wants to contact me by email, deborah at guessthetest.com is how they’d get a hold of me. I’m happy to answer any questions and just be a resource for people. I’m pretty passionate about CRO and A/B testing, so if people have questions, I’m happy to help and, you know, lend a hand where I can.
Richard Joe 59:25
Awesome. It’s been good having you on the show, Deborah. It’s been a long time coming. Thanks for being on the show, and we’ll catch you next time.
Deborah O’Malley 59:33
Okay, well, thanks so much.
If you liked this post, sign up for Experiment Nation’s newsletter to receive more great interviews like this, memes, editorials, and conference sessions in your inbox: https://bit.ly/3HOKCTK