How Intuit’s Mike Loveridge continuously runs experiments without wasting traffic


A Conversation with Intuit’s Mike Loveridge about Experimentation

I recently spoke with Intuit’s Mike Loveridge about how he maximizes site traffic to run Experiments non-stop, the importance of quality over quantity, and how most of us are looking to make decisions faster, rather than trying to measure the effectiveness of new medications.


Rommil: Hey Mike, how are you doing?

Mike: Doing great. Thanks for asking.

For our readers, could you share with us what it is that you do, and a bit about your career journey so far?

My training is in statistics and math. I was doing marketing research for all the movie studios in Hollywood, and then political campaigns. Handled one of the US Presidential races, which was pretty cool. Got an MBA, then did pricing for Intel on their server and workstation computer chip lines for a few years. I then moved into managing analytics departments, heard about conversion rate optimization testing, and was hooked. I’ve been in the testing space ever since, running large programs in airlines/travel, SaaS, healthcare, and consumer products.

Very cool. I’ve read that you’ve dramatically increased the testing velocity at Intuit. That’s amazing! How’d you pull that off? Intuit is quite a large company.

You’re right, haha. It’s big. That bullet actually relates to one of Intuit’s verticals, the TSheets business. I wish I could have done that across the whole company because the concept is amazing. But, at TSheets, I did the math on maximum testing velocity based on traffic, conversion rates, and how long tests would likely need to run for each web page of interest. I set up the testing pipeline so that we could do ‘always on’ testing. This means that as one test ends, we have the next one ready to turn on, so no traffic is wasted on any of our top pages. If possible, we’d also test orthogonally, running more than one test per page. The last part of this strategy relied on setting up rich data capture for each test. We’d track every clickable item on the pages and utilize heatmaps. We’d look at this data while the test was running. If we saw obvious behavioural shifts, we wouldn’t always wait for statistical significance. We’d just make the call, turn off the test early, and move on. My thought is that we’re testing for marketing organizations, NOT pharmaceutical companies. So, it’s okay to move more quickly and with boldness.

“My thought is that we’re testing for marketing organizations, NOT pharmaceutical companies. So, it’s okay to move more quickly and with boldness.”
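Editor’s note: for readers who want to try the ‘always on’ duration math themselves, here is a minimal sketch built on the standard two-proportion power calculation. The daily traffic, baseline conversion rate, and minimum detectable lift below are hypothetical, not TSheets figures, and this isn’t Mike’s actual spreadsheet logic.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(baseline_cr, rel_lift, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect a relative conversion-rate lift."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + rel_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired power
    n = ((z_alpha + z_beta) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))) / (p2 - p1) ** 2
    return ceil(n)

def days_on_page(daily_visitors, variants, n_per_variant):
    """How long one test occupies a page before the next can be turned on."""
    return ceil(variants * n_per_variant / daily_visitors)

# Hypothetical page: 8,000 visitors/day, 3% baseline CR, 10% relative lift target.
n = sample_size_per_variant(baseline_cr=0.03, rel_lift=0.10)
print(n, days_on_page(daily_visitors=8000, variants=2, n_per_variant=n))
# ~53,000 visitors per variant, so roughly two weeks on this page per test.
```

Run that calculation for every page of interest and you get a schedule: the moment one test’s two weeks are up, the next test in the pipeline turns on.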

I’m a very big proponent of having valid data to make decisions. I love your practical approach. That said, how do you go about handling so many tests simultaneously, namely the overlap in traffic?

Planning. A ton of spreadsheets. Haha. Digital air-traffic controlling, if you will.

Because a lot of Intuit’s business is seasonal, how do you account for such dramatic changes in traffic, and how do learnings generated from tax season impact the rest of the year?

I didn’t interface much with the TurboTax group, so I’m not sure how they handle it, for certain. The simple answer for other areas of the business: seasonality is taken into account. Test duration changes with the seasons. Purchase intent, the length of the sales funnel, etc. change, and this is accounted for. I wish it were all science, but we’re always working on figuring it out more completely. The dynamics are intertwined with ever-changing macroeconomic factors.

During the off-peak season, I imagine you have to deal with low-traffic situations. What’s your approach to handling those?

Fewer tests. More time to plan for peak.

For companies looking to get into optimization/Experimentation, what advice would you give them in terms of team structure?

I like dedicated teams. They can be much more efficient. Get the best resources you can. Quality is much better than quantity. One great developer or designer is better than 3 or 4 marginal ones. Also, conversion rate optimization specialists are hard to come by. REALLY good ones are unicorns. So, take your time. Find the unicorn. Build around that. A great conversion rate talent is like a quarterback on an NFL team. They can make or break things. Interview. Be patient. Spend the extra money for top talent. You’ll be glad you did.

“Also, conversion rate optimization specialists are hard to come by. REALLY good ones are unicorns.”

100%. Changing gears a bit, tell us about your favourite Experiment. At least one you can share with us.

There are so many examples I’d love to mention. One simple one, in particular, had to do with testing the value prop in the hero banner on the homepage. We tested systematically and relentlessly for months to get it right. Made a HUGE impact on sales. One point, in particular, was around the use of the words ‘easy’ and ‘simple’ within that value prop. They seem identical, right? As it turned out, users wanted a product that was ‘easy’ but not ‘simple’. ‘Simple’ felt like it was not a rich-featured product. ‘Easy’ made them feel like the product had all the bells and whistles but they could handle using it. Subtle. Never would have figured that out without testing.
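Editor’s note: the readout for a copy test like this usually comes down to comparing two conversion rates. Below is a minimal sketch of a two-proportion z-test; the visitor and conversion counts are invented for illustration and are not Intuit’s data.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic and two-sided p-value comparing two variants' conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical readout: 'simple' headline (A) vs 'easy' headline (B).
z, p = two_proportion_z(conv_a=540, n_a=20000, conv_b=620, n_b=20000)
print(f"z = {z:.2f}, p = {p:.4f}")  # z = 2.38, p ≈ 0.02: 'easy' wins
```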

As the old saying goes, “Words matter.” Finally, it’s time for the Lightning Round!

Favourite Experimentation tool?

Optimizely.

Frequentist or Bayesian?

Both.

If you couldn’t work in Experimentation, what would you do?

Adventure travel, photography, and writing books.

Describe Mike in 5 words or less.

I’ll use 4. “Bite. Electrical cord. Baby.” (Haha. That could explain a lot.)

Mike, thank you for joining the conversation!

Rommil, thanks for the fantastic questions. This has been a lot of fun. Keep up the great work.


