02: Sprints & Features

We dive into what our sprint cycles look like at our companies and how we plan and build features. Aaron talks about GatherUp's new sprint format for 2019 and Darren updates us on a free review research tool they are almost ready to launch.

Read the full show transcript below as Aaron and Darren talk about how they run their sprints and release features in their products.

Helpful links from the episode:

FULL SHOW NOTES

[music]

00:11 Aaron Weiche: We are back with episode two, Sprints and Features.

00:16 Speaker 2: Welcome to the SaaS Venture podcast, sharing the adventure of leading and growing a bootstrapped SaaS company. Hear the experiences, challenges, wins, and losses shared in each episode from Aaron Weiche of GatherUp and Darren Shaw of Whitespark. Let's go.

[music]

00:44 AW: Welcome to the SaaS Venture podcast, episode two. I'm Aaron.

00:50 Darren Shaw: And I'm Darren.

00:51 AW: And we are absolutely 100% excited about what we're doing, because we're going from one to two episodes, and that's pretty exciting. I felt good about getting one episode in the books. I don't know how you felt about it.

01:05 DS: Yeah. I think that, okay, one in the books and now we're recording two, that tells me this is a thing. It's happening, we're doing it.

01:12 AW: Everything from here is hockey-sticking up and to the right.

01:17 DS: Yeah, it's amazing. Our growth, from zero to where we've come, has been pretty impressive.

01:22 AW: I've actually had some fun figuring out all these little wrinkles of podcast recording, and getting your feed submitted everywhere, and finding an editor and a voice-over talent, and all that kind of stuff, that was an interesting process to get to learn so much about a new medium. It's been a while since I've had to... It was like learning WordPress all over.

01:44 DS: And you've really done it. I've basically been sitting back and you just tell me when to show up. So, thank you, Aaron, for managing all that stuff.

01:51 AW: Yeah. Well, like I mentioned, that's a personal issue. I need to delegate more and do less, but I love to learn, so it's definitely exciting. But I probably need to hand some of it off to you to get you to feel involved.

02:03 DS: Sometimes you're gonna be busy, and I'll take the reins of organizing everything.

02:08 AW: There you go. So, what have you been up to the last couple of weeks since we talked last? 

02:11 DS: Well, we're trying to get this one tool out the door. We've been building this new tool, which we call Review Checker, and we were so confident. After last Friday, we're like, "We're definitely gonna launch this thing." And then on Monday, we start stress testing it and it's like, "Oh, yeah, oh right, there's this problem." That's the trouble, you always think you're done and then you just have a bunch of people poke at it, and you're like, "Wow, there's still so much more that we need to do before this thing is ready for prime time." 

That's been taking up all of our time, and I'm really excited about it. I think it's gonna be a great project, a nice little tool that will bring lead gen into Whitespark's stuff. We built that review link generator. Do you know that one, on our website?

02:54 AW: Yep.

02:54 DS: That thing gets so many mentions and tweets and links to it. It's a really great tool that drives a lot of attention to Whitespark, and so this will be something kind of similar.

03:05 AW: Now, do you guys have a formalized process as you're getting ready to release a feature or a new little tool like this that you follow very closely, or is it just like every day, see where you're at, see if it's done, next step kind of deal? 

03:20 DS: It's the latter. We're always just picking away at it. Dmitri will finish the last round of edits, the requests that I made about how I want it to look, and then I'll take a look at it, or Jesse will take a look at it. It's basically been Jesse and I testing the tool out and writing up a list of things that we want different. Then he'll finish that round, and we do it again, take another pass at it, and think about it some more. So, it's been growing like that over the last little while. And then we had a bunch of our team members take a look at it and we found some problems. The biggest problem... First, I'll explain how the tool works. Basically, it just scans your business name. It'll search Google, it'll find any sites that have review schema with the little stars that show up in Google search results, and then it will track that and give you a report, saying, "Hey, you have this many reviews on Google, and this is your rating. You have this many on Yelp, Facebook, TripAdvisor, etcetera." So, it just gives you this nice little report of how many reviews you have.

04:23 DS: And so if you were an agency on a budget, or a small business, and you just wanna keep track of this, you could just come back to the tool once a month or once a week, run it, and get a little report back, and you can say, "Okay, this is how our reviews are going." It's this free little one-off, check-your-reviews tool. In terms of the development, yeah, it's basically just been that we keep plugging away at it. And one of the problems we're finding is, let's say you put in a really generic name; well, Google returns a whole bunch of stuff that's actually not your business. It's someone else's business, it's some weird business. So, if your name is really generic, you can get a whole bunch of what we're calling false positives in the tool, so we're trying to figure out a way to fix that before we launch it.
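For the curious, the schema-scanning step Darren describes, finding pages that carry review markup (the JSON-LD that produces the stars in search results) and reading off the counts, can be sketched roughly like this. The function names and the parsing heuristic are illustrative assumptions, not Whitespark's actual code:

```python
import json
import re

# Find <script type="application/ld+json"> blocks, the usual carrier
# of review schema on business listing pages.
JSONLD_RE = re.compile(
    r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

def extract_review_stats(page_html):
    """Return (review_count, rating) from the first aggregateRating found,
    or None if the page carries no review schema."""
    for block in JSONLD_RE.findall(page_html):
        try:
            data = json.loads(block)
        except ValueError:
            continue  # malformed JSON-LD is common in the wild; skip it
        items = data if isinstance(data, list) else [data]
        for item in items:
            agg = item.get("aggregateRating") if isinstance(item, dict) else None
            if agg:
                return int(agg.get("reviewCount", 0)), float(agg.get("ratingValue", 0.0))
    return None

sample = '''<html><head><script type="application/ld+json">
{"@type": "LocalBusiness", "name": "Acme Salon",
 "aggregateRating": {"ratingValue": "4.6", "reviewCount": "128"}}
</script></head></html>'''

print(extract_review_stats(sample))  # (128, 4.6)
```

Run per result URL, this yields the "you have this many on Yelp, Facebook, TripAdvisor" tally the report shows.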

05:02 AW: Nice. How far into the results does it go? Is it mostly looking at directories that would rank on page one, or does it go further than that? What does that look like?

05:10 DS: Yeah, we do the first two pages.

05:13 AW: All right. So, you're looking at page one, and then page two, where the SEO joke is that the best place to hide a dead body is page two of Google search results.

05:21 DS: Exactly. And we run about 10 different searches, and then we combine all the results, so we do a whole bunch of variations to try and find everything we could.
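Combining roughly ten query variations and de-duplicating by URL, as Darren describes, might look something like this sketch (the query templates and helper names are made up for illustration):

```python
# Sketch: run several search variations and merge the results,
# keeping only the first occurrence of each URL.
def combine_serps(business, city, search_fn):
    """search_fn(query) is assumed to return a list of result dicts
    with a "url" key; in the real tool it would wrap a SERP fetch."""
    variations = [
        f'"{business}" {city} reviews',
        f'"{business}" {city}',
        f"{business} reviews",
    ]
    seen, merged = set(), []
    for query in variations:
        for result in search_fn(query):
            url = result["url"].rstrip("/")  # treat trailing-slash twins as one
            if url not in seen:
                seen.add(url)
                merged.append(result)
    return merged

# Toy stand-in for real search results:
fake_serp = {
    '"Acme Salon" Edmonton reviews': [{"url": "https://yelp.com/biz/acme"}],
    '"Acme Salon" Edmonton': [{"url": "https://yelp.com/biz/acme/"},
                              {"url": "https://facebook.com/acme"}],
    "Acme Salon reviews": [{"url": "https://google.com/maps/acme"}],
}
merged = combine_serps("Acme Salon", "Edmonton", lambda q: fake_serp.get(q, []))
print(len(merged))  # 3 unique sites across the variations
```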

05:29 AW: Yeah, awesome. You're looking at this as more of a marketing and lead gen tool, at the same time you're building it out? Are other people on the team taking care of the marketing pieces, and how you're gonna promote it? 

05:41 DS: Well, promoting is pretty simple. We don't have a huge promotion plan. Basically it's a free tool, so it's this new little free tool. We're going to tweet about it, obviously, put it on our social stuff, we'll put it in our newsletter, and that will mostly be the extent of it. And we think, generally, my experience is, if you launch a free tool that is cool and useful, the marketing does itself. People will just share it because it's free, it's like, "Oh, this is a great new free tool." And we might do a little poking around people that talk about free tools and have free SEO tools. We'll make sure we get listed on those, so just a little prospecting and outreach for that. And then that's about the extent of it.

06:18 DS: The thing is interesting though, because we have a number of clients... We have this one client in particular who is using Reputation Builder, which is our white-labeled version of GatherUp, and they've got 70 of their locations in there, but across the US they have 120 or something like that. For some reason, there's this other faction of the business where Reputation Builder is overkill; they don't wanna put those locations in Reputation Builder, the whole process doesn't make sense for them, but they still wanna be able to track reviews. And so they were just asking me, "Do you recommend any inexpensive review tracking tool?" And I was like, "Well, we have this little thing." And she's like, "Oh, well, if you could give me that in a monthly report, then we'd pay for it." So, I might actually put a subscription model on this thing. I might say, "For an inexpensive $5 a location, we'll give you this report every month, and then we'll track it over time and give you a nice interface where you can see pretty charts and all that kind of stuff, and a multi-location view." I'm actually thinking about launching it as a paid tool, but we definitely can't do that until we fix some of these false positives and other problems with it.

07:23 AW: Maybe you look at something like you're allowed one scan per email address or something, right? 

07:29 DS: Yeah.

07:30 AW: You can limit their use; if they wanna run it more, then they would need to switch to the paid version.

07:34 DS: Yeah. And we already have that actually, we limit it to three searches per day.

07:38 AW: Okay.

07:39 DS: It also does this cool thing: it calculates a review score for you. It's specifically a Google review score. We look at your primary category, and we compare it with all the other businesses, how many reviews they have and what their ratings are, and we combine that to generate a score based on what the average is and where you fit on that average, and we report that back to you as well. It's pretty handy to get a sense of, "How am I doing compared to my competition?"

08:06 AW: How are you pulling competitors? Is that from others in the same category, regardless of location, others that have run it in the tool, or are you actually finding a way to pull out others that might be in the Map Pack against them? What does that look like?

08:20 DS: Yeah. What it looks like is we pull their primary category from Google My Business, so basically... They'll select their business, we grab what we can find for them, and then we look at their GMB business profile, which used to be called the Knowledge Panel, and figure out what the primary category is. And then we take that primary category, and we search it in the local finder results and grab the top 60 businesses ranking for that term. And we parse all their data out, and then that's how we determine the average in your city. So, your average in your city... There might be 60 other hair salons, here's the top 60, these are their reviews and ratings, and this is where you fall in that.
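The competitor-average scoring Darren walks through, your counts and ratings against the top 60 businesses ranking for your primary category, could be sketched like this. The exact weighting isn't public, so the formula below is purely illustrative:

```python
def review_score(my_count, my_rating, competitors):
    """Score a business against top-ranking competitors in its category.

    competitors: list of (review_count, rating) tuples, e.g. the top 60
    businesses ranking for the primary category in the local finder.
    The blend and caps below are illustrative assumptions only.
    """
    avg_count = sum(c for c, _ in competitors) / len(competitors)
    avg_rating = sum(r for _, r in competitors) / len(competitors)
    # Ratio to the category average, capped so one outlier can't dominate,
    # then normalized to 0..1 and blended 50/50 into a 0-100 score.
    count_part = min(my_count / avg_count, 2.0) / 2.0
    rating_part = min(my_rating / avg_rating, 1.25) / 1.25
    return round(100 * (0.5 * count_part + 0.5 * rating_part))

# Toy stand-in for the parsed top-60 data:
top_60 = [(40, 4.2), (25, 4.8), (60, 3.9), (10, 4.5)]
print(review_score(50, 4.6, top_60))  # 79
```

A business with more reviews and a higher rating than its city's category average lands above the midpoint; one below average lands under it.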

08:58 AW: Got it. That sounds really awesome and really helpful. Can I send you a few people to run a test on? 

09:05 DS: Yeah, totally. I'll just give you access to the tool.

09:07 AW: Hey, I like that even better. Do you guys often do that with features? Do you do an open beta with any of your customers? Is that part of a feature release for you? 

09:16 DS: It is, and we will do that for our Local Citation Finder. For a free little tool like this, we probably wouldn't do it. But we are launching a new version of the Local Citation Finder. It's currently getting put through its paces by our internal team, and there's lots of stuff we still have to do before we would open it up to some of our customers. But, yeah, I would imagine we'll have a couple of weeks where we collect feedback from customers, and once we get it to the stage that we're happy with, and then that will probably come up with another few weeks or month of feature updates, and then we'll launch it officially.

09:47 AW: Nice.

09:48 DS: How about you? What do you guys do? Do you invite your top clients to stress test some of your new features?

09:53 AW: For a majority... There are certain things that are a must, so we go through internal dev servers. Then what we actually do is turn every feature into a flag so it can be turned on and off in our admin panel internally. And then we'll do what we refer to... We just call it private beta, and it's usually us just turning it on inside of our own accounts where we maintain some data or businesses. So many things make so much more sense when you're looking at real data, instead of just stuff you're inputting or trying to use without a business case.
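Aaron's pattern of shipping every feature dark behind a per-account flag is a common one; a minimal sketch (all names hypothetical, not GatherUp's implementation) might look like:

```python
# Minimal per-account feature-flag sketch. Every feature ships off by
# default and is switched on per account from an internal admin panel;
# "private beta" is just flipping it on for the company's own accounts.

class FeatureFlags:
    def __init__(self):
        # feature name -> set of account ids with the flag on
        self._enabled = {}

    def enable(self, feature, account_id):
        self._enabled.setdefault(feature, set()).add(account_id)

    def disable(self, feature, account_id):
        self._enabled.get(feature, set()).discard(account_id)

    def is_on(self, feature, account_id):
        return account_id in self._enabled.get(feature, set())

flags = FeatureFlags()
flags.enable("gmb_reply", "internal-account-1")        # private beta
print(flags.is_on("gmb_reply", "internal-account-1"))  # True
print(flags.is_on("gmb_reply", "customer-42"))         # False
```

Feature code then branches on `flags.is_on(...)`, so a launch (or a rollback) is a data change, not a deploy.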

10:26 DS: Yeah, definitely.

10:27 AW: Yeah. And then a normal process that we've moved to is we just do an open invite on our monthly webinars. We just did our January webinar last Tuesday, and we actually have two different features going into beta in the next few weeks. And so we just put it out there to send an email to our support team with "beta invite" in the subject line, and then we invite those people into the beta when it's ready and send them instructions on how it works and things to be aware of. And then if it's a larger feature, we keep track of all the feedback and comments that come in, and sometimes we'll even do a survey at the end of it, anywhere from eight to 10 questions: Is it intuitive? Is there something you'd add? Just trying to find out more about their experience with it. And that process... We really solidified our process around releasing a feature last year, and it's still a work in progress. We're always trying to optimize it, but it has really served us well, and we've come leaps and bounds, especially in the last 18 months of that being a more formalized process.

11:29 DS: Man, that's nice. That's smart. We are not organized like that at all, so I think I will take some notes on that. And when we do do our proper release for the Local Citation Finder update, we'll follow that process exactly. It's a huge release; it's like we rebuilt the whole software. That release is massive, and that'll certainly be a topic of... It'll come up many times in this podcast as we continue to push forward on it.

11:56 AW: Yeah. Those big ones are the hardest. We had the same thing last year when we rebuilt... When we added our client level and introduced an agency dashboard for all of our agency resellers. There were just all kinds of reasons behind it. It was one of those things where, if we didn't get it structurally in, we were gonna wind up with some issues in how we delivered features to our agency resellers, which is a very important segment to us. We have three buckets of customers. One is the single-location small business. The next bucket is the multi-location business, anywhere from five physical locations in their business to 10,000-15,000 locations. And then the third is the white-label version for agencies and resellers; we have hundreds upon hundreds of agencies and resellers that resell our software.

12:45 DS: In terms of number of locations, that middle tier is the biggest? Right? Like multi-location brands that are working directly with you? 

12:52 AW: Yeah. As far as locations go, that definitely brings the biggest share, because you can have one client that has tens of thousands or one client that has a thousand. Location-wise, which is what our pricing model is built off of, that's definitely the biggest. But our agency reseller segment is so important because it has so much expansion in it. You land an agency and they start selling it; in their first month they get four or five customers on, and then they can just continue to organically do that. They're out selling it and pitching it to their customer base. And so we have resellers who have north of a thousand locations all by themselves, that are all their customer locations.

13:33 AW: As we found, some of the thinking behind the client level and the Agency Dashboard we put into play was really helping a lot of our resellers sell to more multi-location businesses. There were just some challenges in how structured the system was. That was one of those, for us, where it was a three, four-month feature, and it held up a bunch of other things that we couldn't release because they would have worked for only our direct customers and not our agency resellers, and we never want to do that.

And a lot of learning happened in the process. So, even though we have a pretty refined process now and things detailed out, and we're still working on it, that's only because we've learned the hard way through operating loosey-goosey, as you go, just one foot in front of the other. It usually leads to frustration and some missed things, which is where we started looking at, "How do we use process to shore more of these things up?"

14:26 DS: Yeah. I need to learn from that, because we basically do that one foot in front of the other until it's like, "Hey, we have something, here it is, try it." And I think that refining our processes would be very helpful.

14:36 AW: We're always willing to try things, too. Last week, we had our exec team summit, and one of our big topics was how our sprints work. And in 2018...

14:48 DS: I wanna hear this.

14:49 AW: Yeah. In 2018, for the most part, we'd put a sprint together with X amount of features and say, "All right, this is what we think we can bite off," mostly looking at it in about a four to six-week cycle. And, of course, some went a little quicker, most a little longer, depending on whether you weighted them out the right way. And then we had that one really big one that took a big chunk out of the time. So, in our rear view of 2018, we completed eight sprints in total, with one being literally a three to four-month sprint all by itself.

15:24 AW: This year, one of the things that we met on was, "How do we have some more predictability, and are we open to trying something a little bit different?" Because we could definitely continue on the way we are going, but we wanted to see, based on predictability and time management and the marketing aspects, and the things that help that way in your business, whether we should do something else. What we're trying for this year is a combination of a couple different things.

One, we pretty much looked at the calendar for the entire year, weighed it out, and said, "Okay, let's have a feature of the month. This feature is something that we are absolutely going to deliver in this month." So, we need to work backward from that in our process so that we know we can launch this feature in March, and we know we can launch this feature in April.

16:16 DS: Do you ever put some features in the bag, where you're like, "Okay, we finished both our March and April features, and so we're just gonna not release those until March and April?"

16:25 AW: Yeah, we'll be willing to do that, if that happens. Because the other thing, after that high level, is we basically have this huge cascading list of customer requests, and other ideas, and medium and smaller things, and we treat that as a running list. If we're sitting good on when we need to deliver the bigger "feature of the month," then we'll just start grabbing the prioritized items from that list, and there are literally hundreds, so there's no shortage there.

16:56 DS: And some of them can be banged out in a couple of days, so you're like, "Okay, well, we'll have no problem meeting that for our March feature of the month."

17:02 AW: Yeah, right. That's what we're going to try for this year and see how it goes, instead of just really defined sprints in a four-week or six-week window... This is something I definitely pay a lot of attention to, listening to and reading about how other companies do their sprints, and you hear the opinions on what's too short and what's too long. For us, we have a really productive engineering team, and I don't want to ruin that, 'cause I know some of that comes from not having a ton of rigid guidelines on how and when it's going to happen. And so it's like, "How do we get some of the predictability to hit some of our other goals and some of our other planning?" The nice thing for me is it allowed me to lay out every month of 2019 and say, "All right, this is when we're going to deliver these things." And I know, if I needed to swap something based on a really big need or some type of vision or whatever else, I'm going to have to pull something else out of that feature-of-the-month category and then hope it can maybe find its way to the top of the other list to get handled. It helped me, vision-wise, for the entire year as a planning exercise.

18:11 DS: Yeah. If I think about our process, if I say, "Okay, here's our six-week sprint, these are the features we're going to have done in that six-week sprint." Let's say there are 14 features in there; I know by the time we get to the end of it, we've only got nine of them done. How do you deal with that, where you're like, "Oh, well, we were overly optimistic..."? 'Cause everything always tends to take longer than you expect, and there are always things that come up that you couldn't expect. I couldn't predict that this was gonna slow me down for three days while I was trying to build that feature. So is there disappointment at the end of it, or do you just take those extra five features and throw them into the next sprint? What do you do when you're laying out these sprints?

18:53 AW: Yes. Sometimes they do get tacked onto the next one; we have to shift them over. And sometimes some of them... We had a couple that just would float and float and float, for different reasons. It's like we just released, this week, our Google My Business authorization, which allows customers to reply to Google reviews right out of GatherUp, and that also basically enables the monitoring straight...

19:17 DS: Yeah, love that.

19:18 AW: From their API. And that's something that we've had in beta and working for customers that asked for it for probably a year. But it was one of those that just never got tied off because of a lot of quirky things with it. And then Google updated their API in the middle of when we were working on it. So, it's one of those that stretched on and on forever, and it just finally ended up being something where I was tired of seeing it on our to-do list. It was like, "All right, this is it... We're almost to the goal line, we have to get this across," and so we had to buckle down.

And it's a much harder thing to test, too, because you need people with access to Google My Business accounts to be able to authorize it, and then to see all the different errors it could throw off if it doesn't go through. There are just a lot of edge cases with that kind of stuff, so it was a much harder one to deal with than when you build a feature where the data is in your control, the functionality is in your control, and everything sits on your side. It was one that lingered on for a long time. Yeah, I don't have an exact "this is what we do" when those things fall off, because we've had ones that kinda trickle along forever, and we have other ones that just get buttoned into the next one and then get tied up.

20:27 DS: Sure. Well, I'm only asking because I don't really structure these six-week sprints, I don't really do anything called a sprint. It's just like these are the projects, we have a massive to-do list, we drag things around and prioritize them, and just things are getting picked off here and there. Features are getting completed, and eventually the project makes its way up to launch. We're like, "Okay, we're good, we're gonna launch this thing." But I don't plan it out in advance into these sprints. And I feel like if I did, I would be disappointed. I'd be at the end of every one, I'd be like, "Oh, well, I guess, we were overly optimistic." And so I'm just wondering how you handle that.

21:01 AW: Yeah. Well, don't be afraid there. Don't let fear control you.

21:04 DS: I'm not afraid, I'm all right.

[chuckle]

21:07 AW: Yeah. And that's where... Who knows with this process we're doing. We could, after this year or even mid-year, if it ends up being something that really isn't working for us, go back to how we were doing it before: a more traditional, "Here's a sprint, here are the six things included, and here's the six-week timeframe we're gonna try to get that done in."

21:26 DS: Yeah. One thing with our loose way is that there's never a deadline, and so no one's got a fire under their ass to get it done by X, right?

21:36 AW: Yeah.

21:36 DS: And so that's what the sprints would give you. In one sense, we do have deadlines, because I have a weekly call with the different dev teams that are working on different projects. And in that call we always say, "Okay, what's realistic that we could have done for us to review on our next call?" And so these weekly calls are what keep us pushing forward. And sometimes we're too optimistic on that, too. We might pick four things and say, "Okay, we'll have these four things working for demo." But when it comes to it, it's like, "Wow, we were only able to get these two or three done, because we got roadblocked on whatever." And so that's kinda...

We set these weekly deadlines. And that's interesting; it's something I started doing based off a conversation I had with Dudley Carr, who used to be with Moz, on the Moz Local team. He's building a new product now and directing a team, and he said that that's the way he organizes his sprints: he has weekly sprints, and every week they check in: "Did you get these things done? If not, why not? What happened?"

22:35 DS: And he's also really... This is something I have to get better at, because it keeps coming up. He's got this laser focus. Everything that comes in that could potentially distract the team, he's like, "Can it wait till next week?" 'Cause then he just puts it on the next week's sprint. Or, "Can it wait till next month?" He really does not want to take people off the track of the four things that were supposed to get done in that sprint. And that's one of the problems we have here. It's like, "Okay, well, the team is working on the Local Citation Finder, but we have these clients that keep chirping." They're like, "Oh, hey, we'd like to have this in the Rank Tracker, or we'd like to have this somewhere else." Someone's asking for things all the time, and every once in a while, we're just like, "Okay, let's just get that done." And so it takes us off track.

23:20 DS: And sometimes it works out well. Just last week, or just this week actually, we launched a white-labeling feature for our rank tracking platform, so a business can now get our rank tracking on their own subdomain. I'm pretty excited about that. We now have that feature; we didn't have it before. But it's the kind of thing where Troy is like, "Okay, I guess I'll do it." And it takes him an afternoon or two to get it done, but it's also a little "ugh" for me, too, because now I know I've pulled him off of the Local Citation Finder, and when we have our call next week, it'll be like, "Well, I could have done more, but you asked me to do that rank tracker thing."

24:00 AW: Yeah. What you're getting at is probably the biggest thing I know I struggle with, and I know it's on me to always really drive, and that's prioritization. What is a priority, and then how do you set it up for the things that are behind it? 'Cause everything you're talking about, we face no different. Compared to what you're sharing, a one-week sprint would never ever work for us; our product's too complex now. We have too many dependencies, we have multiple user types across our segments, and agencies use things a little differently than multi-locations do, and we need to build it so it works for everybody. So, that shorter timeframe is definitely out for us. But we have the same things, in that when we're working on something, yeah, there are customer needs, the sales team has needs based on something they're trying to close, and if we had this feature, then we would win that deal, and where do you put that into place? And it's hard. You have a product manager who is trying to protect the team and make sure that those things stay within reasonable planning and are reasonable to do.

25:06 AW: I think that's hard for every SaaS company. I think evolution is important. To me, it's just being real with what can be done, what that list is. That's the one thing that I really liked about... And again, all of it seems like a great idea now, until we put it into practice. But the feature-of-the-month thing made it so you could only get one big feature each month, and you had to lay them out. So, it gave me some prioritization and some discipline, 'cause there are certain things I'm looking at, like, "Man, I'm not going to get that feature until August." But the other things in front of it absolutely make sense to be there.
The one thing, to your comment earlier: I think it's absolutely important for people to have a finish line. When you don't have a date that the team is working towards together, with the peer accountability that comes into play and a well-communicated deadline, where you're able to cheer people on and even hold their feet to the fire and everything else, to me that's a really slippery slope to not getting where you want to. I just think that's really important.

26:08 DS: For sure. It just becomes too easy to float around doing all kinds of little things that maybe aren't the top priority. Prioritization is really the key. And part of what I always look at is, when I get these feature requests, I'm always trying to assess: is this a feature that's only going to benefit this one customer, or is it a feature that will benefit all of our customers, make our product better, help us sell more? And then that's kind of how I... I sort of put them on the scale of where they fall. So, if they're really valuable, they'll get prioritized higher. If they're really obscure, then they get put to the bottom of the barrel, and we don't tell the client that requested it a deadline. We're just like, "Yeah, we'll put that on the roadmap," but in the back of my mind, I'm like, "I don't know if we'll ever get to that."
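Darren's triage, weighing how broadly a request benefits customers against what it costs to build, can be written down as a toy scoring heuristic. The inputs and weights here are entirely illustrative, not any company's real formula:

```python
def request_priority(reach, product_fit, sales_impact, effort_days):
    """Toy prioritization score for a feature request.

    reach:        rough share of customers who'd benefit (0..1)
    product_fit:  does it make the whole product better? (0..1)
    sales_impact: would it help close deals? (0..1)
    effort_days:  estimated build time
    """
    value = 3 * reach + 2 * product_fit + 1 * sales_impact
    return value / max(effort_days, 1)  # value per day of effort

# One obscure single-customer ask vs. a broadly useful feature:
print(request_priority(reach=0.02, product_fit=0.1, sales_impact=0.0, effort_days=5))
print(request_priority(reach=0.8, product_fit=0.9, sales_impact=0.5, effort_days=10))
```

Even a crude scale like this makes the "bottom of the barrel" calls explicit instead of gut-feel.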

26:57 AW: Totally. I think the important thing with customer requests is actually peeling back the layers to what they're actually getting at, right? Because a lot of times their request is just the hack or the easiest way to get there that they can see in the product, instead of actually pinning down what they're trying to learn, what they're trying to execute. And then I look at it like, if we built a feature that could do that, it would solve this for them. But then, are there other things this would also solve, or value and benefits it would deliver to the rest of our customers?

27:28 DS: Yeah. And you can help them see a different way of getting at the same result, but with more benefit, and a broader feature set also benefits your whole platform.

27:38 AW: That's what I mean. Especially the client side is really, really hard, because you'll have all these different voices and different use cases, and based on whether they're the loudest or they're paying the most, so many different things can factor into who and what you have to listen to. And even with your own internal team, you look at, "Who's bringing this to me? Is it our customer success team? Is it our sales team? Is it our management team?" All those things factor in, and that's where you really have to weigh through them, like, "All right, which one of these am I actually gonna grab a hold of and take and do something with?" 'Cause when you look at all of it... We have the same thing, probably, as you: we have a stand-up all-team meeting every Monday, and in each department section there are feature requests. And so it's kind of picking out where I see it. I don't know if the team always sees it like, "Hey, guess what? There were actually 13 feature requests in our meeting today." They saw the three from their section, and maybe not so much the cumulative of all of them from every aspect.

28:41 DS: Right. Yeah, I noticed that Ahrefs actually uses a really interesting product called Canny.io, and they're using this to keep track of all of those customer requests that are coming in. And then there's a voting system. So, if you have a feature request, first you can go and see if it already exists, and if it does, you upvote it as a customer. And so they really use this to help guide their features and understand what's important. And it pops up within the tool, and it prompts you to request stuff. And I think it's really smart. I've been looking at it, and I have our customer success team testing it out and getting a sense for it, and we might actually roll that out into our product. It just seems like a really good way to get all of that, 'cause there are so many customers that aren't gonna tell you. They're just like, "Oh, I wish it had this feature," but they don't say anything about it. Prompting them, I think, could be a great way to drive that customer feedback.

29:36 AW: We have definitely... We've talked about that in the past. A couple of things that we do do. One, we use a product on the product management side called Productboard, and we use that to collect all the requests and everything else, so we can see when some of these requests are doubling up. We use it on the capture side. It has a lot of other features as well. I don't think we're maximizing it, by any means, but it definitely helps us in certain areas. The other thing we've done for the last two years is we usually send out in the summer a customer survey that's probably only 10 or 12 questions, and we ask, "What feature in our product could you not live without?" We want to understand what's really important to them, we want to understand additional ideas, understand where their head is at.

30:21 AW: Really, at the end of the day, for me, it's a combination of those things, because, one, your customers aren't always gonna get what your long-term vision is or where you're trying to evolve to. I have that in my head and my gut, as well as other members of our team that come up and throw ideas into the hopper that we elaborate on. Then you have what customers are asking for. Then you also have what competitors are building and what they're doing, because sometimes you have to build things just to keep up with the Joneses in certain areas, and you have to decide, "Am I fine with this being a differentiator between our two products based on what our vision is and maybe what they're trying to be? Or do I need to have this because this has become the expected feature in this product category?"

31:01 DS: Yeah. And your sales team can help you vet that, too. So, they're like, "Oh, we lost this deal to X competitor because they have this feature and we don't," and so that's where you really have to prioritize those things.

31:11 AW: Yes. And sometimes it works out really nicely. I deal a lot with our multi-location sales, and I've been really fortunate lately that there's the same one or two features that they're all talking about and mentioning, and that makes it very easy for me when I go to build our use case for why we should do it, prioritize it, have it be one of the features of the month. This is something that customers of this size, that would mean this dollar volume to us, actually want and wanna see. And then we go and marry it against what our vision is and what we're trying to accomplish. If it checks out there, then you can move on those things. But it's a moving target. It's very challenging. As I mentioned, prioritization is really hard and, yeah, we've developed a lot more processes around features... Our own internal feature set and how we do those things, and I feel like we get better and better every time we optimize it, and tweak it, and try something, whether it works or not.

32:05 DS: Right. We were inspired actually by your customer survey. We got that customer survey, and we thought we definitely need to do something like this for our Local Citation Finder. And we started putting it together, but I realized... Some of the questions, like, "What is the one feature you would really want," or, "What is the one thing that really annoys you about the product that you wish it could do, but it doesn't do?" All these questions. I already know all the answers to those, because they're the same things I want in the tool, the same things that bug me about our current tool. And so we actually scrapped the idea of sending it out because we're in the middle of rebuilding it to my dreams. 

We're building my dream version of this software right now, and so there's no point asking everybody, because they're all going to answer the things that I already know. And I also know it because of our cancellation form. When someone cancels, we ask them why they're canceling. And so we've been collecting that data for a long time, and it's all right there. It's like I know exactly why people are cancelling, what they want that we don't provide, and so now our vision is to build all of that and we will... Once we get the new version up and running, and everyone's using it, then we're gonna survey them and see, "Okay, well what's next? What are the things that we didn't catch?"

33:12 AW: Yeah. Well, the only thing I would caution you... There's a couple of other second level benefits. One, you're reaching out and getting a touch point with your customers...

33:20 DS: Right. That's helpful.

33:21 AW: In one that's structured that says, "We care what you have to say." And to me, that's a really important thing, even though if you have a great idea on what they already might say, now you're giving them a chance to be heard. And then when you go to create it, even if it's all the things you already heard, now that customer feels like you listened.

33:37 DS: That's such a smart point. We're creating that connection. What you just said makes so much sense, I'm gonna do it. We're going to launch that thing, because then even though everything might be what they asked for, we're going to deliver it. "You asked for it, we deliver it. We're listening to you, we care about you." Oh, my God, yes, we definitely have to send it.

33:55 AW: And it helps them realize, too, that you've queried the crowd. So, it's not just what Darren decided or the team decided or whatever else. They understand, "Hey, to some extent, they put this out for a vote. They let everybody have a voice, whether or not we took the time to fill it out. And even if it's not exactly in alignment with what I put out there, I know they were at least listening to us. We were one of the data points they considered when they made those choices." I think there's a ton of wins. Even though you might be exactly right, you might 100% know what's all on there, there's a lot of validation and just good karma, I think, in engaging with your customers that way.

34:32 AW: Cool. Well, hey, speaking of that, as we wrap up, next week, when this airs, you're probably going to be engaging with some of your customers face-to-face, because you're going to be at LocalU Advanced in Santa Monica.

34:44 DS: Yeah, looking forward to that. There's not very many local search people, other than you, that won't be there. Pretty much everyone's gonna be there, so I'm looking forward to seeing everybody. I can't wait to see Susan Staupe. Susan is gonna be there.

35:00 AW: Nice.

35:00 DS: Yeah. She won one of our tickets to the event. I'm looking forward to hanging out with everyone there.

35:05 AW: She's going to be super pumped.

35:07 DS: And you're going to SaaStr. Right? 

35:08 AW: Yeah. I'm very bummed to miss LocalU. I'll be just north of you in San Jose at SaaStr, which is, obviously, the biggest conference in the SaaS industry. I'm really excited for that. I'm bringing one of the guys on our team that's only been with us a year. He heads up all of our customer experience and interface design stuff, and I'm just excited for him to see how big our industry is and absorb a lot of the industry talks.

35:35 DS: Nice.

35:35 AW: Things that we're after all the time, related to what we're talking about now. There's tracks on product, there's tracks on churn and sales, just so many of those aspects. That's really exciting. And hopefully, by the next time we talk as well, your new little free tool will be out. We can talk about how you launched it and what the uptake looks like.

35:56 DS: Definitely, yeah. It'll definitely be out within the next couple of weeks. [laughter] But you know what, two weeks is this magic number. I always say it's going to be out in two weeks, but...

36:06 AW: Well, maybe you'll take some of what we talked about, and you will set a deadline and you will make sure you hit it, just to have some content for the next time we talk. Right? 

36:14 DS: I'll definitely have content. We'll be two weeks further ahead anyways.

36:18 AW: Nice. That's awesome. Well, it's been another great episode. We now have two under the belt. Hopefully, you guys enjoyed our talk about how we approach some of our sprints and releasing features, and the things that all go along with that. I don't know about you, Darren. I was really excited to see the amount of people that shared what we're doing socially, some of the comments, starting to see a few reviews trickle in. We would love more reviews on iTunes, especially, and thanks, everyone, for sharing what we're doing socially. And don't hesitate to reach out to either one of us if there's a topic you'd like to see us tackle or share about what we're doing inside of our company. That's the whole point of this, is sharing things that people might want to know. It's not always easy to get a look inside other companies and what they're doing. 

With that, episode two of The SaaS Venture is a wrap. Thanks, everyone, for listening, and we'll see you next time.

37:11 DS: We did it. Thanks, Aaron. Thanks everybody for listening.

[music]


The SaaS Venture Podcast from Aaron Weiche and Darren Shaw