Polestar’s Oliver Wyman on building an Optimization team and Experimentation culture


Hi Oliver, thanks so much for taking the time to chat with us! How have you been?

Hi Rommil, it’s a pleasure! I’m not going to lie, it’s been an intense past 12 months with Covid, moving cities, beginning a new role, and becoming a father, but ultimately it’s all been incredibly humbling and rewarding.

I hear you. Congrats on becoming a father! Amazing! So that was the last 12 months – let’s hear a bit more. Could you share with us what it is that you do and a bit about your career journey up to this point?

Sure thing! I’m currently the experimentation lead at Polestar, an EV brand founded by Volvo and Geely. I work as part of an insights team tasked with socialising research and analysis throughout the organisation in order to drive digital customer touch-point design. Prior to becoming enamoured with experimentation I was a marketer working in renewable energies and finance.


Nice – sounds like a cool gig. As someone who’s built an Optimization team, can you share with us how you go about deciding what skills you need on the team and how many people you should hire?

Polestar is a direct-to-consumer business, and when I joined, the entire infrastructure for this model was still being built. As you can imagine, the digital product teams at the heart of this mission were at full capacity. For that reason, we decided to form a small team – myself, a front-end developer, a UX designer and an analyst – to establish an experimentation program and demonstrate to the rest of the organisation what was possible with this methodology. We were offered broader support, but this often brought diminishing returns. From this experience, I advocate balancing having as broad a capability as possible in as small a team as possible. We then adapted this approach as buy-in and uptake of experimentation grew within the organisation, to prevent expertise hoarding.


I like that. Keep it efficient. How do you go about measuring the performance of your team?

I would say we use both quantitative and qualitative measures. I like to use win rate as a proxy for the health of our process (including research, for example), aiming for between 30–40% to ensure our ideas aren’t just no-brainers. Velocity is important to me given that we’re trying to broadly advocate a methodology, and at low cost. The number of digital product teams assisted is also important, so that we can influence culture on an enterprise level. Implementation rate will be more important in the future; for now we’re more focused on the adoption of experimentation and optimisation practices. That process in itself will need to be optimised! In terms of qualitative measures, this is a bit less tangible, but you can observe the impact the global insights team has had on the organisation based on the questions and data requirements teams are posing to themselves and incorporating into their decision-making process.
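To make the win-rate proxy concrete, here is a minimal sketch (not Oliver’s actual tooling – the outcome labels and the health check are illustrative assumptions): win rate is simply wins divided by concluded experiments, and a rate well above ~40% can suggest the ideas being tested are too obvious, while one well below ~30% can point at weaknesses earlier in the process.

```python
def win_rate(outcomes):
    """Share of concluded experiments that were wins.

    Inconclusive tests are excluded from the denominator, since they
    say more about statistical power than about idea quality.
    """
    concluded = [o for o in outcomes if o in ("win", "loss")]
    if not concluded:
        return 0.0
    return sum(o == "win" for o in concluded) / len(concluded)

# Hypothetical quarter of experiment outcomes
results = ["win", "loss", "win", "inconclusive", "loss", "loss"]
rate = win_rate(results)
print(f"Win rate: {rate:.0%}")
print("Within healthy range" if 0.30 <= rate <= 0.40 else "Review ideation process")
```

With the sample data above, 2 of 5 concluded tests are wins, giving a 40% win rate – right at the top of the target band Oliver describes.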

That’s a great thing to see – when folks start making Experimentation a part of their process – it’s just beautiful.

Because Optimization is a team sport, how do you ensure that Optimization is connected with and leveraged across functions? Do you have any advice for new Optimization teams to nurture such connections?

We have a format for connecting with individual product teams, which includes an insights presentation followed by a hypothesis workshop to kickstart experimentation. What we found is that interest in our insights was much broader than the original product team. So, we take those insights on a small roadshow internally. More often than not this generates demand for the same process to be repeated with other teams. I’m very fortunate to be surrounded by highly capable analysts and researchers who make this possible. For those looking to nurture connections and build an optimisation program from scratch with fewer resources, I recommend focusing on a single product team. Form a relationship with the PM/PO in order to have resources directed towards experiments, and build out a use case of several tests before demonstrating the learnings to a wider audience. You can then use that case as the basis for lobbying for more dedicated resources, or stir up enough jealousy to be invited into another team.


I love that approach. Take one area, make them the example to copy for others. I agree, much better than trying to boil the ocean. Just so much more efficient in terms of adoption.

Speaking of adoption, data-led or evidence-based decision-making, for us in this field, is a no-brainer. Why do you think that approach isn’t as well adopted as we would hope and do you have any suggestions for bringing folks around?

A lot of the usual suspects come to mind: HiPPOs, centralised decision-making, stubbornness, lack of trust, and cultures that reward output as opposed to outcome. But, to be honest, I think fear is the biggest culprit. Fear that the direction of a product or a brand or a company is dictated by its customers and their behaviour. To bring folks around I recommend evidence and stamina. It can take years for a company to truly adopt experimentation methodologies at scale, and you will get knocked down more often than not at first. Adopt a Rocky Balboa mentality and prepare to go all twelve rounds.

Ding ding ding. Haha. Changing gears. As a former Growth Hacker, can you tell us what the difference is between Growth Hacking and CRO?

Currently, I’m situated in the CX department, and I think this lends itself well to CRO/experimentation. We conduct research and provide analysis to our digital product teams to enhance customer journeys. As a growth hacker (a title I’m not enamoured with), I was situated within marketing and largely funnel-focused. In short, empathy was not as large a part of the work as it is in my current role and organisation.

I dream of a world where growth hacking cares more about the customer experience. I feel that would help their cause a great deal – and that’s me, a former growth person, speaking. Haha

Finally, it’s time for the Lightning Round!

Are you a Bayesian or a Frequentist? 


If you couldn’t work in Experimentation, what would you do?

My COVID pastime has been pizza making, so I’ll say pizzaiolo.

Describe Oliver in 5 words or less.

Inspired by peers like Rommil.

Ha, and I’m inspired by conversations with leaders like yourself!

Who has inspired you the most in this field so far?

I’ve taken a lot of inspiration from product management practitioners/educators like Alex Cowan, Dan Olson and Gibson Biddle. Within the field itself, I’m a fan of people like Ton Wesseling who have been really tireless in supporting the community, and I’m really appreciative of the work you are doing with Experiment Nation. Listening to the podcast and connecting with your guests really makes me feel seen.

Who should Experiment Nation chat with next?

I don’t know them personally, but Spotify has an established CRO program and I know they are looking to grow this culture even more in the coming years. It would be interesting to hear about the journey to date and where they’re heading.

We’ll see what we can do. Oliver, thanks so much for chatting with me today!

My pleasure, and I look forward to bumping into you in the Experiment Nation Slack group!
