Autodesk’s Heather Voden on scaling marketing through Experimentation

Heather Voden

A Conversion Conversation with Autodesk’s Heather Voden

Heather and I go way back. I had the pleasure of working with her almost a decade ago at Autodesk. If you’ve never heard of Autodesk, I’m sure you’ve heard of their best-selling product, AutoCAD. In fact, one could comfortably say that most of the things around you were probably designed using their software.

Heather and I were early members of a small and scrappy team of marketers trying to grow a direct-to-customer eCommerce business from the ground up. It was new territory for Autodesk and we learned a lot and bonded through common suffering. But what I’ll always remember about Vodie (as I liked to call her) is her positivity – it was contagious.

I recently caught up with her to chat about how much Autodesk’s in-trial program has grown over the years, how they leverage Experimentation to deliver results, and her love of wrestling. Oooooh Yeeeah!


Rommil: Heather! It’s so good to catch up with you – it’s been far too long! I miss you and the eStore gang over at the ‘desk. How’ve you been?

Heather: We miss you too! All is good over here. Hard to believe I’ve been at Autodesk 10 years now – time flies when you’re having fun!

“Our program has grown a lot since we worked together. We now manage about 40 different product trials in 25+ countries.”

Let’s start off with you reminding me: what do you do over at Autodesk?

I’m a Marketing Manager working as a part of the Digital & eCommerce group. My team’s focus is trial marketing – more specifically, marketing to those people who are “in trial” or recently had a trial expire, with the goal of converting them to paid subscribers. And as you might have guessed, we’re pushing for these conversions to happen directly through our eStore. Our program has grown a lot since we worked together. We now manage about 40 different product trials in 25+ countries.

Autodesk makes AutoCAD. And knowing is half the battle.

Wow! That’s incredible! Clearly that success is all due to your efforts.

Ha! Maybe at the beginning when it was just me I could claim that. Now there are 8 of us and it truly is a collaborative effort.

So, more importantly, do you miss me “a lot” or just “immensely”?

Ummmmmm…a lot? Errr, I mean most definitely immensely!

“We like to ensure we see the same winning experience – or at least ensure we’re not harming the experience – before any global changes are made.”

All joking aside, can you share with us how you leverage Experimentation in your day-to-day in terms of in-product as well as outbound comms?

Our team primarily focuses on marketing to trialers through web-based messaging within the product (the in-trial marketing window) as well as through emails. We see a positive correlation between trial usage and conversion, so we focus on providing the right content to trialers at the right time, with the goal of maximizing product usage during the trial period. Right now, the majority of our experimentation efforts focus on our emails. We’re always working to optimize our program through A/B testing, including subject lines, content, design, promotions, and drop timing. Our program is worldwide, so we often run the same test in a few key markets before rolling it out to other countries. We like to ensure we see the same winning experience – or at least ensure we’re not harming the experience – before any global changes are made.
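
A quick aside for readers who like to peek under the hood: here’s a minimal sketch of how a subject-line test like the ones Heather describes might be read out. The send and open counts below are invented purely for illustration, and this is a generic two-proportion z-test sketch, not Autodesk’s actual tooling or data.

```python
# A minimal sketch of reading out an email A/B test on open rates.
# All counts are hypothetical; this is not Autodesk's tooling or data.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(opens_a, sends_a, opens_b, sends_b):
    """Return (absolute lift, two-sided p-value) for variant B vs. variant A."""
    rate_a = opens_a / sends_a
    rate_b = opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return rate_b - rate_a, p_value

# Hypothetical subject-line test: variant B vs. control A
lift, p = two_proportion_z_test(opens_a=1800, sends_a=10000,
                                opens_b=1975, sends_b=10000)
print(f"Open-rate lift: {lift:.1%}, p-value: {p:.3f}")
```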

With such a diverse audience, what are some of the challenges you face marketing in-trial and what kinds of workarounds do you leverage to overcome them?

As I mentioned earlier, we primarily communicate to our trialers through emails and the in-trial marketing window. A few years ago, our trials moved over to a new licensing platform which brought the in-trial window inside the product. This was really exciting for a number of reasons as it allowed us to better track trial usage and engage with a trialer directly in the product. Initially, this new platform didn’t support cookies so we weren’t able to do any A/B testing or personalization. We’re still working through some technical challenges with testing, so we work around this by doing launch-and-learn tests. It’s not ideal, but it still allows us to test and iterate.

As I always say, running a test is better than not running one, any day. Tell me more.

For example, we recently started testing how we communicate promo offers to trialers. We have an automated placement and recently started testing a more specialized, but manual, process. Unfortunately, we weren’t able to A/B test the experience, so we had to look at year-over-year (YOY) and month-over-month (MOM) figures with supporting data points. It’s a bit more work and sometimes the results are not 100% conclusive, but for the most part, we can get a good sense of what’s working and what’s not.
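
For the curious, here’s a rough sketch of the kind of pre/post read a launch-and-learn test leans on when a true A/B split isn’t possible. Every monthly conversion figure below is made up for illustration.

```python
# Hypothetical launch-and-learn read: compare the period after a change
# against the prior month (MOM) and the same month last year (YOY).
monthly_conversions = {
    "2019-01": 410, "2019-02": 395, "2019-03": 430,
    "2020-01": 455, "2020-02": 470, "2020-03": 512,  # new placement live in March
}

def pct_change(new, old):
    """Percentage change from old to new."""
    return (new - old) / old

# Month-over-month: March 2020 vs. February 2020
mom = pct_change(monthly_conversions["2020-03"], monthly_conversions["2020-02"])

# Year-over-year: March 2020 vs. March 2019 (helps control for seasonality)
yoy = pct_change(monthly_conversions["2020-03"], monthly_conversions["2019-03"])

print(f"MOM change: {mom:+.1%}")
print(f"YOY change: {yoy:+.1%}")
```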

I hear you. Doing pre/post analysis is always rife with challenges. But you gotta do what you gotta do.

“If there is anything we have learned from experimentation, it’s that we should never assume we know the answer.”

Has Experimentation answered any key questions for you?

If there is anything we have learned from experimentation, it’s that we should never assume we know the answer. Before we launch any test, we have a hypothesis on the outcome, and many times the results don’t prove it correct. We recently added a promotional offer to our trial email series and noted it in our subject lines. We figured, “They’re going to love this… open rates are going to soar!” Instead, they dropped across the board. That being said, some tests come out clear winners, like our drop timing test that increased our open rates by 35% when we changed our email deployment to local commuting times.

At the end of the day, we’re not our own target market, so it keeps us humble and always ready to find our next winner.
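
Side note for readers wondering what deploying at “local commuting times” can look like in practice: below is a simplified sketch of scheduling a send at a fixed local hour per recipient time zone. The 8:00 a.m. target, the time zones, and the recipient list are assumptions for illustration, not Autodesk’s actual configuration.

```python
# Sketch: schedule an email at a fixed local time for recipients in
# different time zones. Details here are illustrative assumptions.
from datetime import datetime, time, date
from zoneinfo import ZoneInfo

UTC = ZoneInfo("UTC")
LOCAL_SEND_TIME = time(8, 0)  # assumed target local send time

recipients = [
    {"email": "a@example.com", "tz": "America/Toronto"},
    {"email": "b@example.com", "tz": "Europe/Berlin"},
    {"email": "c@example.com", "tz": "Asia/Tokyo"},
]

send_date = date(2020, 3, 2)
for r in recipients:
    # Combine the date and target local time in the recipient's zone,
    # then convert to UTC for the sending system.
    local_dt = datetime.combine(send_date, LOCAL_SEND_TIME, tzinfo=ZoneInfo(r["tz"]))
    print(f"{r['email']}: send at {local_dt.astimezone(UTC):%Y-%m-%d %H:%M} UTC")
```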

A common question in this field is, in terms of tech-stack, what are you folks leveraging to deliver these tests?

For emails, we utilize Marketo. To assess our launch and learn tests, we look at the data in Adobe Analytics.

As a publicly-traded company, are you ever concerned that a test may negatively impact your quarterly goals? If so, how do you manage expectations?

Sure, that’s something we consider, and since everything we work on is digital, we have the flexibility to easily turn things on and off. Whenever a new test is launched, we closely monitor it in case we see a negative impact to sales. But in the grand scheme of things, our volume is still relatively small, so the chances of one of our tests having a big negative impact are low.

Your volume is small “for now”, I’d say. 10 years ago, who would have thought you guys would be running such an impressive operation? In another 10 years – who knows how much bigger it will get!

Finally, now it’s time for my favourite segment – the Lightning Round!

Only OG fans of the WWF remember this guy. Source: https://en.wikipedia.org/wiki/Brutus_Beefcake

Hogan or Macho Man?

Let’s go a little deeper – Brutus Beefcake (before he became “The Barber”). OMG, did I really just admit this?

AutoCAD or AutoCAD LT?

AutoCAD! No one knows Autodesk until you say we make AutoCAD.

Who’s the better team: The Raptors or the Chiefs?

Seriously??? I’m not a big 49ers fan so I’ll default to the Chiefs, but you are cruel!

Describe Vodie in 3 words.

Laid back, problem-solver, humble

Vodie, so great catching up with you. Thank you for joining the conversation!




Rommil Santiago