Dave Masters | How I Tested Buying and Selling Homes

David J Bland (0:14.903)

All right, in this episode, we have Dave Masters, who is the Director of Product at Realtor.com and a Product Management Mentor. Thanks for joining us, Dave.

Dave Masters (0:26.027)

Oh David, so good to see you. So glad to be here. Thanks for having me on.

David J Bland (0:30.271)

Yeah, I was thinking back when we first met, it was quite a bit before the book, actually, and we were working on testing some stuff and you were gracious enough to even be a case study in the Testing Business Ideas book.

Dave Masters (0:42.410)

Yeah, good times. It's fun to go back and sort of relive those times. It feels like a lifetime ago, but it's only been a bit.

David J Bland (0:52.871)

Yeah, I know. And so I guess maybe we can start there with, you know, your thinking around, you know, how do we do this testing in a big company? How do we break out of this kind of, you know, I think there's this perception of a feature factory mindset at every big company in the world, that we're just a bunch of feature factories. But I'm just really curious, coming back to that case study, there were only a couple of paragraphs we could include, you know, in the book. I was just really curious.

What came about? What was that nugget of, oh man, I really feel like I need to test this before we build this out. Maybe you could clue our audience in on that.

Dave Masters (1:28.530)

Yeah, sure. So if it's helpful, maybe I'll set up with just a little bit of context, David, on the story. So we had a new product that was aimed at homeowners. At Realtor.com, you know, we basically think about the entire lifecycle of home shopping, ownership, selling, etc. And one of the products that my team was working on was an owner dashboard. And when we talked to homeowners, a large pool of the people that we had within this dashboard were also

selling. So we started to talk to that population, and we heard a lot from them about

the juggling between buying their next home and selling their existing home, and being able to time the market, and things of that nature, which we literally heard over and over and over again. So we were pretty confident that was a real problem. But one of the things that we had, so, we're also a pretty mature company, so we have tools all around Realtor.com. And we had just this really simple insight that said, we have this feature over here

and this feature over there: what if we could sort of pull those two things together and bring this feature to a different audience? And so as part of the test case, you know, one of the things that we really wanted to do was just go, all right, we heard this; does the value proposition of bringing these features together in the system actually

matter to these people? And so that's all I did. I literally went and took the words they said, created a quick little guide, and boom, we were off to the races. I'll expand on that story whenever we get to that moment. I would say it was pretty low risk in the sense that we weren't having to build something brand new from scratch. It was also on a platform where the audience was growing. It wasn't one of our most important pages by any stretch of the imagination.

Dave Masters (3:29.504)

As I've evolved my career here, and as I've evolved my thinking around experimentation, it's like those inputs to your process really matter. A page with high traffic versus a page with a limited audience: those risks really vary as you get into it.

David J Bland (3:52.251)

Yeah, I think we have listeners that are startup entrepreneurs. We also have listeners that are innovators at bigger companies. And I think when you come back to where these tests come from, where do these ideas for these tests come from, it sounds as if you were already doing customer interviews and this is something that was coming up during those customer interviews. Is that correct?

Dave Masters (4:14.646)

Yep, definitely.

David J Bland (4:17.503)

And so from there, it's...

Pretty light evidence. I mean, people are saying something, but you want to get to, okay, if I was going to build a solution for this, would they really use it and everything? So it sounds as if you and your team got together and started thinking through, okay, how would we go test that? And yeah, if you could just dive in a little bit of, it's in the book, but I thought maybe expanding upon it a bit here on a podcast together would be fun for the listeners. So what are some things that you did after you heard those insights from the interviews? Like kind of walk us through your step-by-step of how you began.

to design and run a test on that.

Dave Masters (4:52.402)

Yeah, so, you know, I'd really encourage anybody who hasn't done one of the business model canvases or the lean canvas, whether you're on an existing product or an entrepreneur, whatever sort of level, I think those really help set a nice foundation that kind of gives you a sense of where your opportunities lie. So our team put together a quick canvas, and I was actually

Dave Masters (5:22.856)

digging up the screenshots this morning and kind of chatting through this. We liked to use the words that they used verbatim in some of those interviews, to say: these are the words that we heard over and over again, we're going to put that there and ask, do you have this problem? Yes. And then what we did is, as we started to look at the different problems that we had heard from these folks, we said, all right, look, we could build something new, but we also have a couple of features that already live across Realtor and are

able to satisfy some of these needs, but they're used for something totally different today. And the feature was basically being able to compare different market insights. What I mean by that is, if you punch in a zip code on Realtor.com, in certain areas it'll tell you how many homes are available in that market, how long they last on site, the median days on market,

and some other stuff. So if you take those insights for the market that you're looking to buy in, then do the same thing and compare it to the market that you're looking to sell in, now you can get some quasi-insights that basically say:

Yep, I should sell here first, because houses here take a little longer to sell versus how quickly the market moves in the other place. Those were the little things we had thought about: maybe just repurposing this for this particular audience and combining those insights could be a useful first pass to solve that problem for people.
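The sell-first-or-buy-first reasoning Dave describes can be sketched in a few lines. This is a hypothetical illustration, not Realtor.com code: the field names, zip codes, and numbers are all invented for the example.

```python
# Hypothetical sketch of the "which market moves faster?" comparison
# described above. Field names, zip codes, and numbers are invented;
# this is not Realtor.com code or data.

from dataclasses import dataclass

@dataclass
class MarketSnapshot:
    zip_code: str
    active_listings: int
    median_days_on_market: int

def sell_or_buy_first(selling: MarketSnapshot, buying: MarketSnapshot) -> str:
    """Suggest which transaction to start first.

    If homes in the market you're selling in sit longer than homes in the
    market you're buying in, list early so both deals can close together.
    """
    if selling.median_days_on_market > buying.median_days_on_market:
        return "sell first"
    if selling.median_days_on_market < buying.median_days_on_market:
        return "buy first"
    return "either"

# Slow selling market, fast buying market -> list the current home early.
selling = MarketSnapshot("94103", active_listings=220, median_days_on_market=45)
buying = MarketSnapshot("30305", active_listings=140, median_days_on_market=21)
print(sell_or_buy_first(selling, buying))  # -> sell first
```

The point of the experiment was that even this crude comparison of two public market snapshots was enough to answer the homeowner's timing question.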

Dave Masters (7:04.912)

We had instrumented a product called Pendo, and within Pendo they have these features called guides. So with really limited engineering support at the time, and limited design support, we basically comped up a little button within the experience. That button then opened up, quote-unquote, one of these guides, and we used that guide to just say:

Tell us which market you're selling in, great. Tell us which market you're buying in, great. And then effectively a little promise that said, hey, within the next 24 hours, we'll send you a report to help you understand these sorts of market insights. And that was the gist of the test. Now, the promise of that was ultimately me having to...

you know, package together some of these insights for people. So I had hoped for a number that would develop some evidence, and I realized that I was going to have to complete those reports myself.

But I didn't expect the signal we got to be as high as it was, which ultimately meant I had to spend quite a bit of time packaging together those reports for these users. So it was a good signal, good evidence, and ultimately I think it was the right path for us: getting that detail, merging that feature that already existed, and bringing it into the dashboard.

David J Bland (8:37.299)

A couple of things I want to unpack there. One is, I think, very insightful: you took almost verbatim the words you were hearing in your customer interviews and put that language in the pop-up. Even though you made that connection, I find a lot of people don't. We take great notes, we look at our interview scripts, we review everything, but then I don't see teams take the quotes from there, which I do think sometimes...

we err on the side of, well, we've got to make this feature better and faster and add more things and all this. And really, the problem is that people don't understand what we're saying. So I think that's very insightful, taking quotes from the actual interviews and using them in your experiment. I love that. And the other thing here: it seemed like what we would call a concierge test, what you did. And so that doesn't scale. But I think

you're showing why we don't want it to scale. Because it sounds like you had an influx of demand that maybe you didn't anticipate, and therefore you had to live up to your promise by manually creating these, which took, I don't know, somewhere between what, 10 to 15 minutes to create each one?

Dave Masters (9:52.258)

Yeah, that's about right. Yeah, and look, I'd say, you know, it felt natural, because we had heard the same sort of quotes over and over again. It was like...

just, are you looking for this? Because here we've got a tool that helps you do this thing. So that language felt natural. I think you're 100% right, because oftentimes we're looking for the quote-unquote value prop that we want to deliver with whatever the feature is, and we market that with marketing language and try to be as concise as possible. By the way, I looked back at the words that I used, and I'm like, all right, this is probably a little long; I don't know if this was such a great idea.

But, you know, it clearly resonated, because it was visible for people and it took quite a bit of time. And yeah, then, look, on the concierge experiment: this was not my first time doing this type of experiment, but I find them super valuable because it's a direct connection, as a product person, to your customer, right? So in the test, it's like, we said that we had this value proposition for you as a customer, and

I'm going to deliver this thing to you. Granted, it's going to be manual and it's not going to be perfect, but it's going to be the promise that we made, and I'm going to aim to deliver that. And then we're going to talk about it, or I'm going to at least have access to you to say, was this helpful, was this not helpful, and now you could be my next round of customers who I talk to to continue to refine this feature set. So that connectivity between yourself as a product

person and your customer is just embedded in these types of concierge experiments and so I really love that model and that kind of experience.

David J Bland (11:47.575)

I was thinking about something you mentioned there. So if I remember correctly, you sent them an email with the report that you had handcrafted or pulled from internal tools and data sets. And then I believe you had a Calendly link or something where they could reach out to you directly and give you feedback. Is that how you structured that experiment? Is that correct?

Dave Masters (12:15.350)

Yeah, so basically, you know, we had these market insights, and you could literally just go and punch in a zip code. And so I just went and punched in the zip code for the market that they were selling in.

I screenshotted it, then took the market that they were buying in and screenshotted it. I mean, the feature, like I said, existed already across Realtor. And so I just copied and pasted that all into a PDF for them and then attached the PDF to the email that I sent directly to them. Yeah, with the Calendly link, saying, hey, I'm here to help you; we understand this is a big problem, so let us know if this helps. And I'd be happy to learn more about what you're trying to achieve, to see if there are other things that we can do to

build out a product set for you. So yeah, and then, just having that link, they had my email, so I got emails back from a couple of other people, both thanking me and letting me engage with them a little bit: is this helpful, let me know if your markets change or if you need anything, right? You can just kind of start that. It's very simple and practical, and again, it's direct access that doesn't scale.
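For readers who want the mechanics, the manual loop Dave describes (a hand-assembled PDF, a direct email, a scheduling link) can be sketched with the Python standard library. This is a hypothetical reconstruction, not the actual workflow or tooling: the addresses, the Calendly URL, and the placeholder PDF bytes are all invented, and the message is built but never sent.

```python
# Hypothetical sketch of the manual concierge step: packaging two market
# screenshots into an email with a PDF attachment and a scheduling link.
# All names, addresses, and links are placeholders; nothing is sent.

from email.message import EmailMessage

def build_report_email(to_addr: str, selling_zip: str, buying_zip: str,
                       report_pdf: bytes) -> EmailMessage:
    msg = EmailMessage()
    msg["Subject"] = f"Your market comparison: {selling_zip} vs {buying_zip}"
    msg["From"] = "product@example.com"  # placeholder sender
    msg["To"] = to_addr
    msg.set_content(
        "Hi,\n\n"
        "Attached is the market report you requested, comparing the market\n"
        f"you're selling in ({selling_zip}) with the one you're buying in\n"
        f"({buying_zip}).\n\n"
        "If it's helpful (or not!), grab time with me here:\n"
        "https://calendly.com/example  (placeholder link)\n"
    )
    # Attach the hand-assembled PDF of screenshots.
    msg.add_attachment(report_pdf, maintype="application", subtype="pdf",
                       filename=f"market-report-{selling_zip}-{buying_zip}.pdf")
    return msg

msg = build_report_email("homeowner@example.com", "94103", "30305",
                         b"%PDF-1.4 placeholder bytes")
print(msg["Subject"])  # -> Your market comparison: 94103 vs 30305
```

The point is how little machinery the test needed: a guide to collect two zip codes, a human assembling the report, and an email with a reply path back to the product manager.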

David J Bland (13:26.495)

Yeah, it doesn't scale, but it sounds as if you learned quite a bit, and you were closing the loop there: you would send something out, people would reach back out to you, and you could talk to them about how they used it, what they liked, what they didn't like about it. Quite often we put stuff out there and don't know how it's received or how people use it, so that's a really important loop to close. I'm curious, did you use any of those learnings to inform where you took the design of the solution? What came after concierge?

Dave Masters (13:32.248)

Yeah.

David J Bland (13:56.389)

and enlighten our listeners about what happened after you started getting those insights back from people, after manually delivering PDF dashboards to them.

Dave Masters (14:07.363)

Yeah, so what we did was...

I'm not going to remember exactly what happened, but it was something along the lines of: okay, we know this feature exists over here; how do we now bring that into their owner dashboard and compile it with a market insights thing? So that's literally what we did. We just brought the feature directly in there, so now people could go into their dashboard when they're looking at their home's value or whatever, and they could simply compare the two different markets. Well, they only had to add a second market, because we already knew enough about their home and what was

there inside the system. So it was just, what zip codes would you like to add here, sort of thing. And you could add them on and compare the two different sets of insights. So yeah, that feature went live. And then we rebuilt the whole owner dashboard after that, and that feature still exists in the new version. So, you know, it's like...

Yeah, that little insight just led us to repackaging and repurposing a tool that existed across RDC but for a more focused and niche part of the audience.

David J Bland (15:16.567)

So I love that journey. You start with customer interviews, learning about a specific problem or unmet need. You use the words of the people you're interviewing in the pop-up. You concierge: deliver what you think could be a solution for them, but you don't really build anything. You get feedback on that. And then you realize there's another part of the system, of the platform, that already does this, that you could pull over and give to people in their time of need. So it really doesn't sound like you had to build much of anything

to deliver value to your customers.

Dave Masters (15:49.622)

Definitely, yeah, that's right. And those insights still inform some of the way that we think about the product. We know that just adding some insights doesn't magically make that problem disappear; it just gives them one additional tool to start to work through it. So we still have to continue to evolve our own product offering to say, hey, this problem is really acute and a lot of people feel it, so are there other things that we could do to take those learnings and

bake them into our offering overall. So yeah, I think the way you laid it out is far more concise, and it probably didn't feel as simple as that at the time, but it's a repeatable process. And as I mentioned, it wasn't the only concierge experiment that I had done. I had done one

Dave Masters (16:49.476)

manually, daily, for some users. And that took a real toll, because me and one other person on my team were packaging together these emails every day for like 10 days. It was, again, a good signal and a good relationship with our customers, but boy, that one was time-consuming. This one wasn't as bad.

David J Bland (17:06.756)

That's it.

David J Bland (17:15.059)

It sounds as if... I'm wondering, is there a threshold? Well, I have a couple of things about concierge. It's one of my favorite experiments out of the library, and I feel as if there are a couple of things here. One is, yes, it doesn't scale, and it can be really insightful, but it's somewhat time-consuming to put things together. I'm curious.

Have you ever maybe internally said, oh, if I spend more than n number of hours on this, I'm going to automate it? Or how do you know when it ends? You came up with a 10-day time period. I'm just wondering, how do you not get stuck in concierge forever with some of these?

Dave Masters (17:51.786)

Yeah, it's a...

Glutton for punishment, David, I guess. No, I actually never figured it out, to be honest. I think, when you're in the thick of it... there's maybe something there in your experimentation template, where you could put how much time is too much time to spend on a concierge. Because, yeah, look, it can be pretty time-consuming. But ultimately, I actually think spending that time up front is way better than building

something and not hitting the mark either. So, you know, you're either spending a little more time up front, getting those insights, getting the learnings, and sort of learning what to build, versus all the other paths where you spend a whole slew of people's time building something and then go, well, this didn't really work. So I'd much rather bite the bullet myself, or have someone on the team bite the bullet, and do some of that concierge work versus, you know...

Go on the opposite path.

David J Bland (18:58.343)

Yeah, it's interesting. I'm wondering, you've been around in bigger companies and you've been around other product managers. I know you mentor folks as well. I'm curious, what do you think are the pushbacks on doing something like a concierge? What do you think prevents product managers from signing up to do something like this and gather insights on whether they should build or not?

Dave Masters (19:21.194)

You know, like...

I think especially at a company like ours, where we hire product managers that have experience, we have a really mature product group who are all great and super sharp. And I think it takes you a bit to get to, well, why would we not do this? It's kind of a mentality change, I think. So...

There was someone on my team who was feeling a little stuck, and we had to figure out how we could move forward. I remember us talking through this idea of a concierge, and she was like, really, we could do that? And I was like, yeah, why could we not do that? So I think it's that sort of mindset shift, the why-not model versus the why-do-it model.

Dave Masters (20:21.450)

I think that there's no reason not to. I would say, though, that the thing I mentioned earlier, this idea about risk and what areas of the experience you're testing into,

really can blow out a concierge model quickly. So the pages that I was working on for some of these tests are a lot less trafficked. If you were to light something up on one of our higher-trafficked, more valuable pages, I think it's a little harder...

to do that, because there are a lot of people there, the risk becomes a little bit higher, and the volume that you might have to handle might blow out real quick. So I think you've got to bake that into your model about where and how you introduce a feature like this, too. So...

You know, I mean, that's not to say that you can't do it on those pages. I just think you have to be a little bit more thoughtful, and people are probably a little more nervous about testing on some of those higher-traffic pages. I don't know, I feel a little less concerned about that, but I think those concerns probably prevent people from even starting down that path.

David J Bland (21:44.671)

Yeah, I could see that, where you'd have to segment in a way where maybe you don't run it for very long and you don't run it for everybody, but there's a very specific target. But then, you know, some companies I work with really struggle with segmentation. And if you don't have that segmentation, it can be really difficult to control that kind of fire hose of traffic coming into your experiments. So I think it's easy maybe sometimes for me to say, oh yeah, just keep that experiment small.

But at bigger companies, it's actually somewhat challenging to keep that experiment small if you can't segment

to a specific sub-segment of your traffic and toggle things on and off easily. I could see where that might be very intimidating for people. So yeah, I could see that. I'm curious, we're talking about concierge quite a bit today: what other experiments have you really enjoyed over the years? What are some of your favorite ones outside of concierge that you've had fun running, trying to test some of the different ideas that you've had?

Dave Masters (22:33.282)

Mm-hmm.

Dave Masters (22:44.206)

Well, you know, as in the book, customer interviews are sort of a staple. I think those are critical. The more face time you can have with customers or your users, the better. So I think that should just be an ongoing rhythm; you should always be doing that.

Let's see, we have done Wizard of Oz experiments, where the end result felt like a magical product, though that was also handcrafted and curated by someone on the back end. So I think that's the delineation, right? A concierge can feel scrappy, can feel like there's someone on the other end, whereas Wizard of Oz is basically the same thing, but it feels magical because everything happens behind the scenes.

David J Bland (24:16.511)

Yeah, it's like all the AI startups I mentor coming through Silicon Valley that are doing it all by hand behind the scenes. They're like, but eventually it'll be AI. And the end customer kind of doesn't care, as long as they get the value. So I'm curious, did you have some of those concerns with Wizard of Oz, where, oh, we can't do this, the customer is going to get value but maybe they don't know how they're getting the value? You're nodding your head, so it doesn't sound like you had many of those concerns.

Dave Masters (24:23.163)

Yeah, yeah.

Dave Masters (24:46.310)

Well, I think the reason that I would do a Wizard of Oz experience versus a concierge experience is...

you know, the Wizard of Oz will feel a little closer to what the end product might be. And so I think there's another validation point there versus the concierge one. Me sending a personal email to 80 people with their own personalized dashboard doesn't feel like a magical experience, but it does solve the problem. Whereas, hey, if all of a sudden we could build this beautifully branded experience that

sort of felt really elegant versus a pop-up modal that didn't have a whole lot of style going with it, right? I think it takes your experience from that early adopter, I'm-willing-to-test-the-alpha-version-of-a-product model to a nope, this is ready for all consumers and all types model, where you might have a little bit more risk. And so we've tried that.

I've done some of those Wizard of Oz ones where the onboarding of the experience feels really good and professional, and it's like Typeform in the background, right? And then a Google Sheet, and then something's kind of pulled together, and the delivery of what it looks like in the end looks really polished and nice. But that's because we probably had concerns, or risk, that we wouldn't get those learnings through a concierge. Or we had the learnings through a concierge,

and now we were trying to validate that the product was ready for a bigger audience. Those would be some ways to differentiate which of those experiences to deliver.

David J Bland (26:35.847)

Yeah, I think what you're describing, if I'm hearing this correctly, is that with concierge you almost bias the customer a bit, because it's obvious a person is involved, and it might put off people who aren't early adopters, who maybe don't want a person involved. But with Wizard of Oz, it's less about biasing people with the presence of a person, even though you're doing it by hand behind the scenes. It feels more magical

in a way, so potentially, and I haven't really thought about it this way, but potentially that could be closer to the real-world experience: if we had a product that was doing all this behind the scenes, this is what it would act like. So that's a very interesting take on things, where with some of these you almost have to say, well, if we're going to do a concierge or a Wizard of Oz, if I'm biasing people with my presence...

is that okay at this scale and for what I'm trying to learn? Let's say you're testing out a vending machine: well, instead of putting a vending machine there, we're going to put a person there just selling stuff. It's not exactly the same as a vending machine, and it might bias some people. Some people might not want to walk up to somebody and buy something, and maybe they'd feel more comfortable with a machine. So I do think, when we're talking about the differences between concierge and Wizard of Oz, just the...

Dave Masters (27:28.843)

Yeah.

Dave Masters (27:38.154)

Yeah, yeah.

David J Bland (27:55.467)

presence of a person involved does tend to change the dynamics a bit of what you can learn in the customer experience.

Dave Masters (28:03.426)

Yeah.

100%, I totally agree with that. And I think it was probably you that posted this, that's where I saw it: that idea of someone going to a coffee vending machine and getting an espresso, and behind the machine there's just literally a guy making the espresso drink and then serving it out. So it kind of feels magical, but it's not really. And because there's somebody else doing that people aspect, the

connection bias between a human and a non-human definitely is going to have an impact. But I'd say, especially for early-stage ideas...

If you're really trying to get those learnings and start to understand if the value prop is there, before spending all of the time on designing this thing and that experience, I'd probably use a concierge as a step before a Wizard of Oz, in a lot of cases, because I'd want to get those earlier-stage insights before spending too much time. Well, frankly, that was probably before products were so easy to build; now you can light up some pretty quick stuff pretty fast with no-code tools these days.

So maybe it's really kind of thinking about that. Is this going to be harmful if I'm involved in this process versus it all feeling magical and not having a face associated with it?

David J Bland (29:30.771)

Yeah, I think that's a good way to look at it. And you're right about the cost of

development. I used to say, with a lot of confidence, that building is the most expensive way to learn. And now I have to say, well, it's one of the most expensive, but it's not always the most anymore, because you can build so quickly. But I do think that leads to situations where you're building things and throwing them out, and when people don't use them, you don't know why, and you have to reverse-engineer all that anyway. So I still think, even though the cost of building has come down,

it still makes sense to do some of this discovery and understand the value prop. And a lot of the words you're using today in this interview, about understanding the voice of the customer, understanding the pains they're experiencing and what they're looking for: I mean, yeah, you could skip all that and jump to build, but then if they don't use it, you're kind of stuck guessing why they're not using it. So I'm just curious, switching gears maybe a little bit here.

So you're doing this inside of a big company. And I know a lot of our listeners are also at big companies, trying to champion experimentation. I'm just curious, what could you offer your fellow product managers in big companies who want to test different things but feel that maybe they're not empowered to do so, or who run into that why-can't-we-do-this mentality you mentioned? Do you have any kind of...

tips or tricks you could give them on how to get started with, you know, even small tests inside their company.

Dave Masters (31:06.578)

Yeah, I mean, you know, and I'm probably preaching to the choir a little bit on this with you, David, but like, I really do think the more that you can identify the kind of...

the de-risking of an idea that's going to cost a lot of money and a lot of investment, I think that's a great way to highlight it. So propose a way to test something where we want to get this learning, and use things like assumptions mapping or some of these other canvases that really highlight the risks associated with a particular thing.

Dave Masters (31:53.712)

I like that because it's such a simple way to kind of think about it. This week we need to learn X insight. The way that we can do that is we're going to run a quick experiment that sort of gives us that insight. I just think if you kind of package it that way, it's hard to sort of push against that.

Hey, rather than taking our time and energy and our focus off of the things that we're currently working on, what if I, as a product manager, spend the next two days working on a concierge to get more access to my customers and learn if the value prop of this seed of an idea that we have is worthwhile? It makes it start to feel like that just

is the natural path forward. So, hey, let's not lose focus on some of these other things, but we've got to start planting seeds for what's next. And in order to plant seeds for what's next, let's take the insights we've heard from people over and over again. Let's start to think: all right, we have these five ideas. Okay, let's pick one of them and figure out what is the riskiest part about it. Let's find a way to test that and get those learnings, so we can then report back on whether this is the right thing to be

building or not. And I think the more that you can become a champion of that sort of

de-risking mentality, the better. And I would say the other thing, too, especially at a big company, when you're thinking about what you're testing: each of these tests has some cost with it, right? Whether that's people's time, whether that's a trade-off against another feature or something that you're going to take away from other people, whether that's a little bit of engineering time. So really think about the upside that what you're learning is going to deliver versus...

Dave Masters (33:51.912)

You know,

what it's going to cost to actually run this thing, and what insights you're going to get from it. So I'd say you've just got to build out your own model for how to champion that internally. And if you work on a growth team, frankly, you're probably doing a lot more of this type of stuff, right? Because you're really rapidly figuring out what's working and what's not working, versus, hey, we're moving this metric a half percent and that really matters a lot.

So those types of experiments would be a little bit different versus: we've got some new ideas, we've planted some seeds for some bigger things that carry a lot of risk in swinging the bat. So I don't know, as much as you can do that, that's probably the advice I'd give. Really thinking about what it costs, and what it costs to not do it, is probably worth balancing out.

David J Bland (34:47.359)

I like that. It feels as if we always like to talk about how much money we're going to make, but potentially you could be talking about how much money you're going to save the company too, by not going down a path of building something that nobody is really going to adopt. So this idea of, hey, let's just test and see if

there's anything there, anything directional that points to the sign that this could be big. I'd love to see more of my clients also keep a running tally of how much money they've saved the company by running experiments and killing things. It may not be the sexiest side of business, but let's face it, we've all worked on something that we spent way too much time on and felt, wow, I wish we'd stopped earlier. And I think having some kind of process where you can stop earlier is also as important

as generating revenue.

Dave Masters (35:38.910)

I mean, yeah, it's that kind of old idiom where, basically,

deciding what to do is also deciding what not to do. So as part of what you're building, you're making a conscious choice to spend time on something. And the more evidence you can collect to say, this is the right thing we should be doing, the better it feels to be going down a path that you have that much more conviction and certainty around. So the more you can build that case, I think the better overall.

David J Bland (36:12.055)

Thank you. Yeah, this is great. I loved having you expand a bit on the case study that was in the book and add a little more color and context around it. I love that you do other concierge experiments and Wizard of Oz experiments and that you're constantly talking to customers. I wish more people would see that as something constant, and not a phase where you talk to customers and then move on. And so, yeah, I just want you to keep testing, keep sharing what you're learning.

There are some folks like you in other big companies who feel very alone, you know; they don't feel as if they can run tests and go off and do what you're doing. So I appreciate you sharing your stories. I want to know, where can people find out more about you if they want to reach out, maybe if they're working at a big company or looking for a mentor? Where can they find you?

Dave Masters (37:05.042)

I mean, I would have said X, but I'm not really on that too much anymore. So LinkedIn is probably the best place to get me. And yeah, I'm always happy to chat about ideas and give guidance, but that's probably the best place to get me.

David J Bland (37:23.423)

All right, so find Dave Masters on LinkedIn. He is available if you have some questions about anything we chatted about today. Thanks again so much, Dave, for joining us for this conversation. I really enjoyed it.

Dave Masters (37:36.853)

Oh, yeah, my pleasure, David. Great to be here with you. Great to see you as always.
