Growth #10: How We Think About Product Testing At Drift

This is a podcast episode titled, Growth #10: How We Think About Product Testing At Drift. The summary for this episode is: Today's episode of Growth is all about testing. Matt Bilotti is joined by Drift's growth tech lead Vignesh Mohankumar. Together Matt and Vig discuss the ins and outs of testing products. How should you test the products you're building? When should you test? And equally important – when shouldn't you test? Matt and Vig dig into the considerations you need to keep in mind when testing products and they share real-life examples from building and testing products at Drift. Don't miss out.

Matt Bilotti: Hello and welcome to another episode of hashtag Growth. I'm your host, Matt Bilotti, and today I'm super excited to be joined by Vig Mohankumar, who is the tech lead on the Growth team here at Drift. I've realized that he and I have what could be podcast episodes basically every single day-

Vig Mohankumar: They should be recording us. [crosstalk] Put these mics next to our desks-

Matt Bilotti: Yes.

Vig Mohankumar: And then have it live streamed.

Matt Bilotti: I actually think that would be good.

Vig Mohankumar: That would be good. I think people would watch.

Matt Bilotti: Yeah, so I figured why not just have Vig on the show and we'll have one of these conversations about a topic that we've been going back and forth on a lot lately.

Vig Mohankumar: Right. So this is my first podcast, so I thought it would be good to have a piece of paper and a pen to make it look like I'm a veteran of the podcast world, but there's no notes on here, so I won't be using-

Matt Bilotti: There's no notes at all. So, especially for all the audio listeners out there, he does have a-

Vig Mohankumar: Really have a pen.

Matt Bilotti: There's no notes.

Vig Mohankumar: No notes.

Matt Bilotti: Probably not going to write anything.

Vig Mohankumar: No.

Matt Bilotti: That's fine.

Vig Mohankumar: Handwriting's real bad.

Matt Bilotti: Okay. So let me go ahead and jump into the topic. We've been having a lot of discussions, call them arguments if you will, around: should we test this thing or should we not? Right? It's, if we're going to test it, it's going to take a week, but we're pretty sure that it's going to work, so why are we going to test it in the first place? And so I just want to toss that out there as a starting point.

Vig Mohankumar: Yeah. This is definitely the hardest part of getting started with growth, especially if you come from a product background, and especially in an early-stage company, where, like, we were here for a good amount of time, you're usually measured on, okay, let's just get this thing out.

Matt Bilotti: Yep.

Vig Mohankumar: We have a channel called Shipyard where every engineer, once they put something out on production, you just post in the channel saying, "Here's what I shipped. And here's how it works. And here's where it is." And especially for me, I would just measure myself on, "Okay, I got five ships this week, six ships this week." That would be like my high score count kind of thing.

Matt Bilotti: Yeah. That's it, I'm the best engineer, look at all this value.

Vig Mohankumar: Yeah.

Matt Bilotti: It's all working.

Vig Mohankumar: Yeah, some of my friends, I'd be like, "Yo, I got five more ships than you this week." Very odd diss. Yeah so-

Matt Bilotti: As the company grew and we started moving into this growth stuff.

Vig Mohankumar: Yeah. We started working on this about a year and a half now? A year-ish now.

Matt Bilotti: Yeah.

Vig Mohankumar: It's really tricky because you get to this place where you're like, "The things I'm putting out, are they actually doing anything? Are they actually working?" And that's fundamentally the reason to AB test, that's why. But yeah, there's kind of a big question around when should you do it and when should you not? So it's a good point that we're talking about.

Matt Bilotti: Yeah. Let's dig into that. When should we test and when should we not? Should we not test when we sit around the room and say we're 90% sure, right before we even run anything, we're just, our intuition tells us we're so certain that this thing is right.

Vig Mohankumar: So intuition is good-

Matt Bilotti: Do you test it?

Vig Mohankumar: So you have to start with a hypothesis, like the scientific method, so you have to start with, okay, why am I doing this? And what do I think it will actually do? So if you ship a new button, you have to have some reason for what impact it's going to have, so any good experiment starts with that, so that's good.

Matt Bilotti: Yeah.

Vig Mohankumar: So the thing you're talking about after this is like your intuition of," Okay. I believe that this thing is probably going to work because I'm a smart person and I'm not going to be dumb enough to put something out there that's not going to work."

Matt Bilotti: Yeah. [crosstalk] How could we be that dumb?

Vig Mohankumar: How stupid are we? There's some level of ego here.

Matt Bilotti: Yeah.

Vig Mohankumar: Because it's like, I've been doing this for a while, I'm probably not going to make a mistake.

Matt Bilotti: Yeah.

Vig Mohankumar: But I think AB testing has been like a tool for humility for me-

Matt Bilotti: That's funny-

Vig Mohankumar: Fereid from Slack told me this one, he was like, "It's a tool for humility, to make sure that you actually realize what you're doing, and that those things are actually going to work."

Matt Bilotti: Yeah.

Vig Mohankumar: So basically you can think about it that way: sometimes you need to make sure that what you're thinking, your intuition, your hypothesis, is fundamentally right.

Matt Bilotti: Yeah.

Vig Mohankumar: But that affects what you're thinking about going forward too.

Matt Bilotti: So for me, there's a point at which you have to start AB testing, which is when you obviously have product-market fit.

Vig Mohankumar: Right.

Matt Bilotti: You have enough volume that you can get results in a reasonable amount of time. There are parts of our funnel that we would love to test stuff on.

Vig Mohankumar: Yeah, it'd be awesome.

Matt Bilotti: We would love to, but if we ran something, it would take us two and a half months to actually get significance.

Vig Mohankumar: Yeah.

Matt Bilotti: So that is a [crosstalk] scenario. Right, and that's if it's even working. So that's a scenario where we could probably look at this and say, it's going to take us three and a half weeks to get a result on this. Is the potential result actually going to impact the bottom line enough that that thing is worth testing? Or do we have enough background and understanding of the customer that this is clearly better customer value?

Vig Mohankumar: It's worth explaining that. So I think if you're so far down the funnel, or if you don't have enough volume, to the point where an AB test isn't going to reach significance, it probably means you shouldn't be working much in that part of the funnel, because people aren't getting there. If people aren't getting that far, is that really where you should be working, a place where only five people see it a month? I think it's sometimes easy to get trapped in, "Oh, I really need to ship stuff here."

Matt Bilotti: Yeah.

Vig Mohankumar: But if no one's getting there and they're getting stuck at your signup flow, like activating at the very top, for us that's installing Drift.

Matt Bilotti: Yep.

Vig Mohankumar: Like putting Drift on your website, having a conversation. There's no point in fixing the settings page to make it clearer how to change your color if no one's even getting through to the point of seeing the settings page.

Matt Bilotti: Right.

Vig Mohankumar: So that's kind of the first thing to look at: how many people are getting here? What's the volume here? So I think that's definitely the first question to ask.
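(An aside for readers: if you want to put a number on "enough volume," the standard two-proportion power calculation gives a rough per-variant sample size. The sketch below is illustrative, not Drift's tooling, and the baseline rate, lift, and traffic figures are made up.)

```python
import math

def required_sample_size(baseline_rate, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Rough per-variant sample size for an AB test on a conversion rate.

    baseline_rate: current conversion rate, e.g. 0.10 for 10%
    relative_lift: smallest relative improvement you care to detect, e.g. 0.10
    z_alpha / z_beta: defaults give 95% confidence (two-sided) and 80% power
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Detecting a +10% relative lift on a 10% baseline needs roughly 15,000
# users per variant; at low traffic that's months of waiting, which is
# the "we'd never get significance there" situation described above.
print(required_sample_size(0.10, 0.10))
```

Note how sensitive this is to the effect size: hoping for a +50% lift instead of +10% cuts the required sample by more than an order of magnitude, which is one reason low-traffic teams lean toward big swings.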

Matt Bilotti: Yeah. And that's to make a decision about testing or not, you still might have a core product team-

Vig Mohankumar: Right.

Matt Bilotti: That's going to fix the settings-

Vig Mohankumar: Yeah right.

Matt Bilotti: Because your 20 customers, if you're still a young company, your 20 customers-

Vig Mohankumar: Yeah. "I can't change the color."

Matt Bilotti: Right.

Vig Mohankumar: This is more for like a growth team.

Matt Bilotti: Yeah.

Vig Mohankumar: If you're on a team that's trying to really work on distribution or getting volume through the funnel, you got to think about like how many people are getting there in the first place.

Matt Bilotti: Yeah.

Vig Mohankumar: That's the first question.

Matt Bilotti: Okay. So let's take an example that we'd recently been discussing. So we're doing some experimentation on the onboarding right now, and the designer on our team was looking at one of the steps and said, "All right, this step could be much easier if we..." Just for simplicity's sake, let's say that there were two options of things to do and they were side by side, and she said, "Why don't we put these on top of each other? Because that'll just make it easier, and it's clear people work from top to bottom when they're kind of filling-

Vig Mohankumar: And I won't lie, I looked at the thing and I was like, "This is so much better." Amanda's a great designer, so I looked at it and I was like, "How did we ever not have it this way in the first place?"

Matt Bilotti: Yeah. So do we test that?

Vig Mohankumar: It's a tricky one, because in some cases I would say no, because how could this possibly be worse? But I think for us, we needed to test it, because we didn't know if there was actually a bottleneck in that first step. Even if the new step was simpler and better, that's totally fine and good, but it may not actually move the needle, it may not get people further through your signup flow. So for us, it was important to test it, to try to understand, okay, is this better at all? And if it is better, how much better is it? That's kind of the reason we did it, so then we can use that learning going forward. So I think a good way to think about this is: what would you do based on different scenarios? Let's say we learn that after two weeks the results are the same, what would we do there? In this case, we would probably still ship it regardless, because even though it means we're not getting more people through, at the end it's still a better experience, so people are going to be less annoyed, that's fine. If it's better, if it's 10% better-
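(For readers: the "is it better at all, and how much better" question is typically answered with a significance test on the two conversion rates. A minimal two-proportion z-test sketch follows; the function name and the conversion numbers are hypothetical.)

```python
import math

def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """z statistic for the difference between two conversion rates.

    conversions_a / visitors_a: the control arm
    conversions_b / visitors_b: the variant arm
    |z| > 1.96 is significant at the 95% level (two-sided).
    """
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se

# 10% vs 11% conversion over 5,000 visitors each looks like a win,
# but z is about 1.63, below 1.96, so it is not yet significant:
# keep the test running before declaring "10% better."
z = two_proportion_z(500, 5000, 550, 5000)
```

The point of the scenario planning above still stands: run the test only if the "same," "better," and "worse" outcomes would actually lead you to different decisions.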

Matt Bilotti: Great.

Vig Mohankumar: Sick. We're doing our job-

Matt Bilotti: Nice job.

Vig Mohankumar: Awesome. Now, but then there's like, if it's 10% worse, what are we learning? What's the learning there?

Matt Bilotti: Yep.

Vig Mohankumar: Okay. Then the question comes, was the 10% drop worth that better experience for those other people?

Matt Bilotti: Right.

Vig Mohankumar: So to me it kind of comes down to how much volume is there at that part of the funnel? I mean, this was like step two of the onboarding, so most of the people giving us their email on the website were seeing the step, and a very small change there has a pretty big impact down the line. Whereas a 10% change when there are two people a day seeing something, what? That's like a handful of people per month.

Matt Bilotti: Yeah.

Vig Mohankumar: Maybe it's worth the 10% hit to give a better experience to those people that-

Matt Bilotti: There's the other side of the learning too, which is, we learned that it's the same.

Vig Mohankumar: Yeah. I think the learning there is, even if it's a better experience, we only have so many people. We're not like Facebook, we don't have like a million engineers-

Matt Bilotti: Yeah.

Vig Mohankumar: Where we can do whatever we want. So we really have to think about, is this the best type of experiment we should run going forward? So if we learn that merging steps together isn't getting people further in the funnel, we probably shouldn't do another merged step-

Matt Bilotti: Right.

Vig Mohankumar: For a different part of the funnel later on, so that's a good learning to have. So depending on how many types of experiments you're doing and what types of experiments you're running, if you're going to try to do similar things across your funnel, it's worth experimenting to get the learning out of it. So that's why I think about: okay, what if it's worse, what if it's the same, what if it's better? Just generally, what would you do? But there are cases where even if it's worse or better or the same, you don't care and you're going to ship it anyways. Let's say you have a bug, right?

Matt Bilotti: Yeah. Great. Should you AB test fixing the bug?

Vig Mohankumar: Please don't do that. Please don't AB test it, because it just ruins the name of AB testing if we get to the point where it's like, I have to AB test this bug fix.

Matt Bilotti: Right. The thing was broken, but maybe because it was broken it made it harder and it qualified-

Vig Mohankumar: Right, yeah.

Matt Bilotti: The people better.

Vig Mohankumar: Yeah. Or it made them mad and then by making them mad, they like really rushed through the rest of the-

Matt Bilotti: Yeah.

Vig Mohankumar: I don't know.

Matt Bilotti: This is where you've gone too far.

Vig Mohankumar: Gone too far.

Matt Bilotti: You've taken the concept and just-

Vig Mohankumar: You're in fifth order of thinking-

Matt Bilotti: Yeah.

Vig Mohankumar: You're in this meta world.

Matt Bilotti: Right. At the end of the day, don't make a bad experience for customers-

Vig Mohankumar: Yeah, your [inaudible] sucks if your thing is broken, you know what I mean?

Matt Bilotti: Yeah.

Vig Mohankumar: What about an example like this? You sign up with an email and it gets you through the funnel, but it actually changes your email in the middle. So say Matt at Drift signs up, and then someone else is like, "I'm also Matt at Drift," or something, and then it changes the email. So you get through the rest of the funnel, but you don't realize that your email wasn't the one you signed up with-

Matt Bilotti: Yep.

Vig Mohankumar: For some reason.

Matt Bilotti: And now it's a different-

Vig Mohankumar: Now it's-

Matt Bilotti: Then the next time he tries to sign up-

Vig Mohankumar: Yeah, I'm just trying to explain, that's a case where people are getting through the funnel, so it's a broken experience.

Matt Bilotti: Right.

Vig Mohankumar: So regardless of what the end result is, it's wrong. So you should not AB test that, you just fix the problem. Maybe not the best example, it's hard to think of stuff off the top of your head.

Matt Bilotti: Yeah. Metaphors are tough.

Vig Mohankumar: Metaphors are tricky.

Matt Bilotti: So it's interesting, it begs the question of what we were talking about at the very beginning of the podcast, which is: when you're early on and you're building product, and maybe you've been building products for 10, 15 years, you build an intuition.

Vig Mohankumar: Sure.

Matt Bilotti: And now what we're saying is, you can be running a lot of tests and you learn from those tests and then you learn that, okay maybe merging the steps isn't helpful. Now, do you have an intuition or should you go test that again the next time?

Vig Mohankumar: I think it depends on your scenario.

Matt Bilotti: Yeah.

Vig Mohankumar: You probably should just test it again, because you don't... I guess this goes back to what we were talking about at the beginning. If it's going to take two weeks to test this, and you're pretty certain that this is going to make an improvement in your funnel, you can use the learning you had last time and say, "Okay, merging two steps together is probably not going to move it, so let's not make that change, because it's not going to make a difference." So you can use the learning that you had in the higher part of the funnel at a later stage, I would say. That's how I would think about it. But sometimes it's not completely transferable.

Matt Bilotti: Yeah.

Vig Mohankumar: That's kind of the problem, you have to use some amount of common sense to figure out, okay, the learning I had here, does it apply somewhere else?

Matt Bilotti: Yeah.

Vig Mohankumar: An example is, let's say you have a product that has an onboarding flow, and your company also has a different product with a different onboarding flow. That's the kind of case where, if you have similar traffic to both, you can kind of use the learnings from one for the other, I would say.

Matt Bilotti: Yeah. Yep.

Vig Mohankumar: That's kind of the example I would use at least.

Matt Bilotti: Makes sense. So I want to change tack a little bit on this. One of the other things that we've run into is, let's say we're working on the onboarding, we want to help people get through, more people get through. Do we test a bunch of changes at once?

Vig Mohankumar: Oh yeah.

Matt Bilotti: Or, yeah, this is another one that we talk about a lot. Let's say we're going to do a full pass on onboarding: we're going to change the color of this button, we're going to merge these steps, we're going to move this option here, we're going to change the way the fields look. Is that the test? Changing all these things at once?

Vig Mohankumar: Yeah.

Matt Bilotti: Or do you change each individual thing and wait for results on each of those?

Vig Mohankumar: So for context, our old onboarding was built July 2016.

Matt Bilotti: Long time ago.

Vig Mohankumar: Long time ago. No one was working on the onboarding for a while, and then we were like, "Wow, we should really make this thing... we should just redo it, honestly." There's no way it made sense to copy and paste it and AB test each part out, it would have taken us two years and then we would have been fired and we wouldn't have jobs and we wouldn't be talking about this here.

Matt Bilotti: Nope.

Vig Mohankumar: So I think that's the first step: if you AB test this, will you get fired? Maybe that could be the first thing you think about. But yeah, no, it's a fair point, because we had a designer on the team, and her point was basically that we have consistency in design across the product. We have certain buttons that we use, we have certain colors and themes and font sizes and inputs. Should we AB test every single one of these things out?

Matt Bilotti: Individually.

Vig Mohankumar: Right.

Matt Bilotti: Yeah.

Vig Mohankumar: To make sure it doesn't have a negative impact. And the answer there was no. We ended up just saying, "Let's just combine it and put it out. Let's try to do a one-to-one transfer, let's do the best we can, but let's make the upgrades that we need to make." So if we're using an old input style, let's use the new input style; if we're using a green that doesn't exist in our color palette anymore, let's change the green. But we tried not to do anything that would affect the flow.

Matt Bilotti: Yep.

Vig Mohankumar: So we didn't say like," Okay, let's just get rid of these two steps." Lets change-

Matt Bilotti: We didn't change the text on the CTA.

Vig Mohankumar: Right. We kept the CTA text the same, because otherwise you get into a really tricky situation, which is: is this going up or down because I changed the style?

Matt Bilotti: Right.

Vig Mohankumar: Or because-

Matt Bilotti: Is it the-

Vig Mohankumar: I changed-

Matt Bilotti: Text?

Vig Mohankumar: The text?

Matt Bilotti: Or is it the way that the page is structured?

Vig Mohankumar: Right.

Matt Bilotti: Yeah.

Vig Mohankumar: So you've got to do your best to keep that. When you have situations where you can't AB test every single thing, do your best to do everything as a one-to-one-

Matt Bilotti: Yeah.

Vig Mohankumar: AB test the whole thing, just to make sure that nothing's gotten worse. And then after that, the question comes up of, if I have one step where I change the CTA, and then I want to merge two steps together after that.

Matt Bilotti: Yeah.

Vig Mohankumar: Does me changing the CTA have an impact on what they do-

Matt Bilotti: Right.

Vig Mohankumar: After that?

Matt Bilotti: Right.

Vig Mohankumar: Becomes a problem.

Matt Bilotti: Yeah.

Vig Mohankumar: We at Drift have kind of avoided, we've tried to avoid this, basically. We have parts of the funnel, like, "Okay, people sign up." So all the website experiences, we isolate that, I would say; anything that happens on the website is isolated. Then they come to the onboarding; anything from onboarding to finishing it, installing and then getting to the dashboard, is a second layer. We try not to do two tests there.

Matt Bilotti: Yep. Yeah.

Vig Mohankumar: At once. And then once they get into the dashboard, the dashboard experience would be a third layer. So we can usually have three experiments at once, basically, and you can have lots of different website pages, so we usually have at least four or five experiments running at once.
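(For readers: the layer isolation described here, one live test per funnel stage, is commonly implemented by hashing the user ID together with a layer name so that each layer randomizes independently. The sketch below is a generic illustration, not Drift's actual system; the layer names and function are made up.)

```python
import hashlib

# Illustrative layer names mirroring the funnel stages described above.
LAYERS = ("website", "onboarding", "dashboard")

def assign_variant(user_id, layer, variants=("control", "treatment")):
    """Deterministically bucket a user within one experiment layer.

    Hashing (layer, user_id) together means each layer randomizes
    independently: a user's onboarding variant tells you nothing about
    their dashboard variant, so one live test per layer stays clean.
    """
    digest = hashlib.md5(f"{layer}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same bucket for a given layer,
# but may land in different buckets across layers.
assignments = {layer: assign_variant("user-42", layer) for layer in LAYERS}
```

Deterministic hashing (rather than storing a random assignment) also means a user who clears cookies or returns on another day sees the same variant, as long as the user ID is stable.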

Matt Bilotti: But those are on very distinct parts [crosstalk] of the funnel. Yeah.

Vig Mohankumar: Yeah. But once you get much more scale, you can randomize it.

Matt Bilotti: Right.

Vig Mohankumar: And then you can-

Matt Bilotti: Like a Facebook can do or a Pinterest can do?

Vig Mohankumar: Yeah. And then you can kind of, it's called normalizing it. So you would just try to understand what impact this improvement has on other AB tests that are running.

Matt Bilotti: Right.

Vig Mohankumar: It gets a little complicated, so we're just trying to avoid it, because we're not at the stage where we're changing this copy and that copy. We're trying to do big bets anyways, usually, so it hasn't really been a problem for us, at least, yeah.

Matt Bilotti: Yeah, and it comes back to the whole concept of bite-sized changes and big swings, which I talked about a couple of episodes ago. Do you make a big change, where we haven't worked on onboarding in a while, do we make a big change to it and see if that's the new normal? Or do we make the small incremental changes and see how those add up or return?

Vig Mohankumar: Right.

Matt Bilotti: Cool. I think that's it for this episode.

Vig Mohankumar: Sounds good.

Matt Bilotti: Yeah. How many stars should the listeners and viewers rate-

Vig Mohankumar: Is that out of 10?

Matt Bilotti: This episode? No, it's out of 5.

Vig Mohankumar: Out of 5? Oh.

Matt Bilotti: Yeah.

Vig Mohankumar: Probably a 10 still-

Matt Bilotti: 10 star?

Vig Mohankumar: Yeah. Just rate it twice.

Matt Bilotti: Yeah.

Vig Mohankumar: Yeah.

Matt Bilotti: Yeah, that's good. Probably not allowed, but at least 5 stars.

Vig Mohankumar: At least five, yeah.

Matt Bilotti: Cool. All right, Vig, thanks for joining today, and thank you for listening. If you have any feedback, thoughts, ideas, whatever it might be, send me an email at matt@drift.com, we'd love to hear it and-

Vig Mohankumar: You respond to those, right?

Matt Bilotti: Yeah, I respond to those.

Vig Mohankumar: Nice.

Matt Bilotti: Yeah.

Vig Mohankumar: You don't have a hired person that responds?

Matt Bilotti: No.

Vig Mohankumar: Oh, nice, cool.

Matt Bilotti: I'm not that important. All right, thanks for listening. See you on the next episode.

