Softtek’s Angeles Alonso Leverages UX Research to get the most out of Conversion Rate Optimization

A conversation with Softtek’s Angeles Alonso about Experimentation

Hello Angeles, how are you? Thanks for joining us today. Could you share with us what is it that you do and a brief overview of how you got to where you are today?

I am a digital analyst specialized in CRO strategies. I started many years ago as a front-end programmer and web designer, which led me to interaction design, usability, accessibility, and user research: the entire umbrella of UX (User Experience). Later I trained as a digital analyst to complement that work, so I could cover both qualitative and quantitative analysis and design and develop projects focused on conversion.

As someone who’s spent a good amount of time in UX as well as CRO, I was wondering what your thoughts were about how these two disciplines work together?

CRO is a discipline that draws on several traditional work areas, including user experience. The first step in optimizing a digital product is detecting where the product is failing. We need to take the time to investigate what is happening and why our users behave the way they do.

For this we have techniques that come from quantitative digital analytics, qualitative analytics and UX research.

For example, when we talk about quantitative digital analytics we talk about dimensions and metrics, measurement based on hits / sessions / users that collect tools such as Google Analytics or Adobe Analytics. We measure what is happening on our website or application.

To perform qualitative analysis we use any user research method that helps us understand the reasons for the problems people have with our website or application: 

  • Clickstream analysis: Clickstream data are a detailed log of how participants navigate through the website during a task. They provide tremendous insight into how easily the site can be navigated, which pages cause the greatest confusion, and which pages are critical in reaching a desired destination.
  • SSA (Site Search Analysis): a qualitative-quantitative hybrid research method in which the voice of users is analyzed with quantitative techniques.
  • User testing: users perform a series of tasks that we have organized in advance, so we can identify how they carry them out and what problems and needs they encounter while completing them.
  • Customer feedback analysis: analyzing the feedback users give through the tools used by customer or support departments. It takes some processing work to extract the main problems, but it is very useful information, since the problems are usually important enough to prompt the call or email.
  • Card sorting: asking users to group content and functionality into open or closed categories. It provides information on content hierarchy, organization, and flow. User interviews can be used to conduct this research.
  • User interviews: interviews with groups or individuals, in which UX researchers investigate the feelings, opinions, and even the language of users.
  • Heuristic evaluation: an expert analysis of a product in which its characteristics are measured in terms of usability, accessibility, and efficiency of the experience.
  • Brainstorming: a design meeting where project members generate ideas and solve problems together.
  • Intercept surveys: a survey launched through a "switch" on the site, which lets us target specific segments or run it whenever we want.

Now we have data and context, and we can formulate hypotheses that explain the inefficiencies we detected. A/B tests and multivariate tests are the main tools for demonstrating, with quantitative data, the suitability of a change within a context at a given time. They help us learn how to build a more efficient and optimal digital product.
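As a minimal sketch of the quantitative validation step described above (my own illustration, not from the interview, using hypothetical traffic numbers), here is how a two-proportion z-test can check whether a variant's conversion rate differs significantly from the control's:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B test.

    conv_*: number of conversions; n_*: number of visitors.
    Returns (z statistic, two-sided p-value).
    """
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant B converts at 5.5% vs the control's 5.0%
z, p = two_proportion_z_test(conv_a=500, n_a=10_000, conv_b=550, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these sample sizes the p-value stays above 0.05, which echoes the point about running tests with enough traffic: an apparently better variant may still not be statistically distinguishable from the control.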

What do you do when the results of an experiment don’t align with what the user research would suggest? How do you reconcile that difference?

I run A/B tests with enough traffic to get meaningful results, and I run them long enough to reach statistical significance. If a test does not match what the user research suggests, I still draw conclusions from it and apply them to further tests. The results are already giving me valuable insight and will help me rethink the hypothesis and plan my next test. We have to bear in mind that being wrong is normal, even after very thorough research. But if you don’t do that research, your success rate will always be lower.

What are your thoughts on heuristics? How much should we rely on them, and when should we decide to test them?

One of the techniques that I use the most is heuristic evaluation. As you know, it consists of having several expert evaluators examine the quality of use of an interface, based on compliance with recognized usability principles. It allows us to identify interaction, usability, and accessibility errors on our website or application. However, this technique is used less and less, despite the fact that running these evaluations and applying the resulting improvements can deliver great results in a short time: “quick wins”.


Changing gears. What are your thoughts about the term, “CRO”? Is it a title that effectively captures what the role entails?

When we talk about CRO, we are talking about a discipline and a methodology focused on improving the business efficiency of any digital product or service: identifying points for improvement, generating hypotheses, proposing specific corrective actions, and subsequently monitoring the effectiveness of those actions.

Currently, there is tremendous confusion around the term. It is confused with SEO, reduced to making changes in the purchase process, or assumed to be exclusively for ecommerce. When I started in the world of user experience a few years ago, part of our job was to evangelize what it meant to do UX, and now the same is true of CRO. A very important part of our job as professionals in this field is to teach and inform about the discipline, its methodology, and its techniques before starting a CRO strategy. It is important to understand from the beginning that doing CRO means working on a continuous improvement process: it is not only about increasing the conversion rate but also about optimizing the sales strategy of our business.

To achieve an improvement, it is necessary to tackle a problem or an inefficiency from several points of view in order to find the best possible solution.


Finally, it’s time for the Lightning round!

Bayesian or frequentist?

Well, I don’t know what to tell you; it’s a debate between two approaches, and does it really matter which one you use? In the end, I want fast, accurate results that are easy to understand and communicate to the business. Whichever approach you choose, the testing methodology and statistical rigor will determine the final results.
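To make the two approaches concrete (my own sketch, not part of the interview), the Bayesian counterpart of a significance test reports a probability that the variant beats the control. Assuming uniform Beta(1, 1) priors and the same hypothetical traffic numbers as a typical A/B test, a Monte Carlo estimate looks like this:

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=100_000, seed=42):
    """Bayesian A/B comparison with uniform Beta(1, 1) priors.

    Samples each variant's Beta posterior and estimates
    P(rate_B > rate_A) by Monte Carlo.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        # Posterior for a binomial rate under a uniform prior is
        # Beta(conversions + 1, non-conversions + 1)
        sample_a = rng.betavariate(conv_a + 1, n_a - conv_a + 1)
        sample_b = rng.betavariate(conv_b + 1, n_b - conv_b + 1)
        if sample_b > sample_a:
            wins += 1
    return wins / draws

# Hypothetical numbers: 5.0% conversion for A vs 5.5% for B
result = prob_b_beats_a(conv_a=500, n_a=10_000, conv_b=550, n_b=10_000)
print(f"P(B beats A) ≈ {result:.3f}")
```

A statement like "there is a ~94% chance B beats A" is often easier to communicate to the business than a p-value, which is one practical argument in this debate, even if, as noted above, rigor matters more than the camp you pick.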

What is the most overused research method, in your opinion?

I could not tell you which is the most used, but I can tell you the least used: the research methods that come from neuromarketing. It is another form of research, understanding how and in what way we react to marketing actions thanks to neuroscience. We ask users and observe how they use our web or app to create improvement hypotheses, but we still do not understand the real “why” behind those decisions and actions.

If you couldn’t work in Experimentation or UX, what would you do?

Data visualization, which is another of my great passions.

Describe Angeles in 5 words or less.

  • Creative
  • Analytical
  • Persistent
  • Flexible
  • Empathic

Amazing. Thank you for chatting with me!
