Published January 21, 2013

Kyle Rush on Optimizely, Testing, and Obama [Part One]

We interviewed Kyle Rush of the 2012 Obama for America Digital Team about A/B testing at the campaign. 

by Cara Harshman

What has your relationship been like with A/B testing? Did you do any testing before joining the Obama campaign?

Before the campaign I was working at Blue State Digital, where I kind of got my feet wet with A/B testing. I had done a few tests with client work there, which was really interesting for me. I think that really sparked my interest in A/B testing. Once I got to the campaign we had so much traffic that you could basically test whatever you wanted, however you wanted to do it.

How was the campaign’s digital team structured? How many people were actually testing day-to-day?

Total on the digital team was about 165 people. I was part of the 20-person frontend engineering team. There was an analytics team, an email team, a blog team, a video team and an optimization team made up of members from each digital team. We used A/B testing on almost every team’s content. The frontend engineering team basically took care of implementing most of the A/B tests just because it required some knowledge of JavaScript. The cool thing about Optimizely is it’s so easy to use that the non-technical guys on the analytics team could use it also. After a few hours of teaching them a little bit about JavaScript, they were off writing their own copy tests in Optimizely. That was huge for us because it allowed us to focus on bigger tests.
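As a rough sketch of the kind of copy test those analysts could put together after a short JavaScript lesson, a variation snippet might look like the following. The selector and headline text here are hypothetical illustrations, not the campaign's actual code.

```javascript
// Hypothetical variation snippet for a headline copy test.
// The testing tool runs this only for visitors bucketed into the variation,
// so the control group keeps the original headline.
var headline = document.querySelector('.splash-headline'); // made-up selector
if (headline) {
  headline.textContent = 'Be part of this campaign: chip in today';
}
```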

How did the insights and results from the testing Dan Siroker did at the 2008 Obama campaign inform your work in 2012?

Rush greeting the President.

I think the biggest lesson we learned from 2008 was how important A/B testing was and how intense of a program we needed to have. I am really into Obama and digital so I have been following the campaign since 2008 and I definitely remember reading Siroker’s blog post on the homepage testing and how it raised a huge amount of money. If we didn’t have that information published from 2008, then it’s possible we would not have known just how mission-critical A/B testing was. 

We know you ran about 500 A/B tests on the campaign website. How far did those tests extend across the site?

There really wasn’t a long stretch of time where we didn’t have an A/B test running on a page. About halfway through the campaign we had to deliver some strategic goals for A/B testing to our superiors. A/B testing is so humbling that it’s very difficult to say something like, “our goal is to raise conversions by 50% by this date.” You just don’t know. So one of the strategic goals we came up with was to always have an A/B test running somewhere on the site. We actually even divided that up later into sub-sections.

Always be testing, that’s what they say.

Yes, totally. There is really no reason not to. One of the coolest traffic days we had was when the Supreme Court decision came down on Obamacare. None of us had ever seen a traffic spike like that. Since we worked to always have a queue of tests running or lined up for the donation pages, we had five A/B tests ready to go that day when we walked into the office. All we had to do was start the experiment in Optimizely and be off. Once the decision came down, the traffic surge was just crazy. Normally it would take us at least a few days to go through that many A/B tests to get statistically significant results, but that day it happened in minutes. We blew through our whole queue. We ran over to one of the developers who was implementing A/B tests that day and just started thinking up tests on the fly – whatever we could think of, we would test it. And so we changed the background color on the donate form to like 80 different things, which ended up not making a difference. I am very happy that we learned early on that there wasn’t a lot of ROI in changing the background color.
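To see why a surge like that collapses days of waiting into minutes, a back-of-the-envelope sample-size calculation helps. The sketch below uses the standard two-proportion approximation; the baseline rate and lift are illustrative assumptions, not campaign figures.

```javascript
// Visitors needed per variation to detect a relative lift at roughly
// 95% confidence and 80% power: n ≈ 2 * (zAlpha + zBeta)^2 * p(1-p) / delta^2
// All inputs are made-up, illustrative numbers.
var zAlpha = 1.96, zBeta = 0.84;
var p = 0.05;               // assumed baseline donation conversion rate
var delta = p * 0.10;       // smallest lift worth detecting: 10% relative
var n = Math.ceil(2 * Math.pow(zAlpha + zBeta, 2) * p * (1 - p) / (delta * delta));
console.log(n);             // ≈ 30,000 visitors per variation
// At a few thousand visitors per hour that is days of traffic;
// during a spike of hundreds of thousands per hour it is minutes.
```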

What are a couple best practices in A/B testing that you’d pass on to someone?

I’d say number one, use ROI (return on investment) to prioritize your tests. That’s one of the biggest things I’ve learned in my career. Make decisions on your tests based on your previous results and what you’ve learned from other people on the Internet. At the campaign, we could have started severely altering the design of our pages very early on, but that wouldn’t have produced the highest ROI. We found out that’s something you want to do after your page is already super-optimized.
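One way to read that advice is to score each test idea by the value it could plausibly add relative to the effort to build it. The sketch below is a hypothetical illustration of that kind of ROI ranking; the ideas, traffic numbers, lifts, and dollar values are all made up.

```javascript
// Hypothetical ROI scoring for a test backlog: expected extra conversion value
// per month divided by build effort, then sorted highest first.
var ideas = [
  { name: 'donate form copy tweak', visitors: 500000, baseRate: 0.05, lift: 0.05, valuePerConv: 50, effortDays: 1 },
  { name: 'full page redesign',     visitors: 500000, baseRate: 0.05, lift: 0.10, valuePerConv: 50, effortDays: 15 }
];
ideas.forEach(function (t) {
  var extraValue = t.visitors * t.baseRate * t.lift * t.valuePerConv;
  t.score = extraValue / t.effortDays;
});
ideas.sort(function (a, b) { return b.score - a.score; });
ideas.forEach(function (t) { console.log(t.name + ': ' + Math.round(t.score)); });
// The small copy tweak scores far higher per day of effort than the redesign,
// which matches the point about saving big design changes for later.
```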

Number two is always be running an A/B test. If you are really serious about your A/B testing operation, then there’s no reason why you wouldn’t. You might have some instances where a variation bombs and reduces conversion rates, but you have to be willing to take that risk.

For a third one I would say stick with solid hypotheses. What a lot of people will find when they first get into A/B testing is that they just want to test everything, even if it’s not going to be beneficial. There were several instances when people would ask us to test things that wouldn’t tell us anything in the long run. So it was super-important to have a scientific approach: come up with a hypothesis that’s very clearly defined and then figure out how that hypothesis can help us. I think that fleshes out people’s ideas a little bit better. Although, sometimes you might need to explain what science is to people and help them understand how a hypothesis works.

Obviously a recommendation of mine is to use Optimizely. I don’t know if that’s a best practice, but Optimizely is so good at improving your ROI because it’s so easy to use. I know I sound like an Optimizely fanboy, but when you’re a developer in the thick of it and you really just need a tool that works, Optimizely is just where it’s at. We could have spent a lot of time changing our architecture to fit other tools’ needs, but Optimizely made it super easy.

Read part two of the interview where Kyle talks about the future, surprising test results and big data.

Interview by Cara Harshman

INTERVIEW HAS BEEN CONDENSED AND EDITED.