
The Sixth Sense of Real Estate: An Intro to Super Forecasting | S4E5

In this episode of the A.CRE Audio Series, Spencer, Sam, and Michael explore the concept of "super forecasting," drawn from the book "Superforecasting" by Philip Tetlock and Dan Gardner. The guys discuss the skills of super forecasters, people with a knack for prediction, and how those skills can be valuable in CRE. Through a practical forecasting exercise, they highlight the significance of the methodology behind super forecasting.

Watch, listen, or read this episode for a deeper exploration of super forecasting principles in the context of commercial real estate.


Episode Transcript

Sam Carlson (00:00):

All right. What's up, and welcome to this episode of the season four series. Spencer and I have been friends for a very long time, and I remember I went down to Colorado Springs recently. We went on a drive and we were talking about a lot of things, and there was this concept that he was talking about, we're going to get into it today, but it stuck in my head and it was very interesting. So Spencer, what does the US intelligence community have in common with commercial real estate? Please expound upon that prospect.

Spencer Burton (00:38):

Yeah, so this entire season is about what's next in commercial real estate. Right. And I think prediction is an important element, I mean, it's a key element to what we do in commercial real estate on a day-to-day basis. And so one of my colleagues at work loves this book, Superforecasting. It's written by Philip Tetlock, a professor at Wharton, and Dan Gardner. And the book is really about this phenomenon that there are certain people who possess certain skills, some innate but most learned, that make them better at forecasting than others. That's, in essence, the concept of the book. And so the authors go through what makes these people, whom they call super forecasters, special. And I think about that from our perspective. It's like, okay, I can learn something from that.

(01:37):

And if you look at the book, it's a few years old. It's based on a program that they began in 2011, which came on the heels of a landmark study they did over a 20-year period, from the early 80s to the early 2000s, that found, and this is incredible, that the forecasts of a team of experts were about as accurate as a chimpanzee. No, I'm not even joking. I mean, it was quite controversial. A chimpanzee throwing a dart at a board. But they did find that there was a subset that was slightly more accurate than the chimpanzee. And they asked what made that subset more accurate. And then they asked, is there a way in which we could train or create a group that's even more accurate than that group that was slightly more accurate than the chimpanzee?

(02:38):

And in collaboration with the Director of National Intelligence, which is, call it, the office that oversees all of the intelligence agencies in the United States, they created this program to try to, A, find out if the average person could be as accurate as these highly trained intelligence agents who had confidential, top secret information. And if they could, what could they learn from that exercise to then make their intelligence teams, their intelligence agents, more accurate? And so yeah. I mean, I find this fascinating because how can we employ that in commercial real estate? So let me start with an exercise. It's kind of fun. And then I'll get into some of the principles. I think the first point is go buy the book, Superforecasting. It's one of the more valuable books that I've read in my career thus far. But let's start with this exercise. So Michael, how many privately owned dogs do you think there are in Los Angeles?

Sam Carlson (03:48):

In Los Angeles?

Michael Belasco (03:49):

Los Angeles.

Spencer Burton (03:49):

Yeah.

Michael Belasco (03:50):

Okay.

Sam Carlson (03:50):

How many people live in Los Angeles?

Michael Belasco (03:52):

The metro or the city?

Spencer Burton (03:55):

In the city, in the,

Michael Belasco (03:57):

Proper city boundary.

Spencer Burton (03:58):

In the Los Angeles metro area. Yeah.

Michael Belasco (04:00):

Okay. In the metro. Okay. I’m guessing there’s about 12 million people in the metro.

Spencer Burton (04:05):

Okay.

Michael Belasco (04:05):

I’m just guessing. I think it’s roughly that maybe in Los Angeles metro. I’m going to say the average household’s three.

Spencer Burton (04:14):

Okay.

Michael Belasco (04:14):

So 4 million households. From there, I’m going to say, let’s say one point, you asked me how many dogs?

Spencer Burton (04:26):

Yeah, how many privately owned dogs? Yeah.

Michael Belasco (04:27):

1.6 million privately owned.

Spencer Burton (04:31):

Yeah, that’s interesting. Sam, how would you go about calculating that?

Sam Carlson (04:35):

Can I know how many, what the population of the,

Spencer Burton (04:39):

So it's less about getting the right answer. It's more about how you got there. So the first question, both of you are on the right track, which is, and by the way, this is just one methodology of super forecasting. This particular methodology is, I think they call it, to make sure I get it right, the Fermi method. So Fermi was Enrico Fermi, a professor at the University of Chicago who worked on the Manhattan Project, and he was put to work trying to estimate the strength of an atomic bomb that had never been set off. And so he did some back-of-the-envelope analysis to estimate this, right?

(05:17):

And so in the book, they actually use the example of, okay, how many piano tuners are there in Chicago? And so I use the example of dogs in Los Angeles, but what's interesting is you both started right. And so when you do super forecasting, say using the Fermi method, you start big, with the things that you can know, and then you do some back of the envelope to try to get to an answer that's some close approximation. And the very first question is how many people are in the Los Angeles metro area? How many households are there in the Los Angeles metro area? How many dogs per household? And when you do that three-step process, what you get is a reasonable approximation. Now you're off, probably because of your very first assumption around people and households.

Michael Belasco (06:08):

Any of those assumptions could have been wrong.

Spencer Burton (06:08):

Yeah and,

Michael Belasco (06:08):

Multiplying effect. Yeah.

Spencer Burton (06:13):

But the point is you get to, I think it's 5.2 million privately owned dogs, but I think where you were off was,

Michael Belasco (06:22):

Maybe households or,

Spencer Burton (06:23):

Dogs per household.

Michael Belasco (06:25):

Yeah. That's surprising. Well, I guess I landed at 0.4. If the households were 4 million, I said 1.6, so.

Spencer Burton (06:31):

Yeah. So you were assuming one out of every three households had a dog or it was one dog for every three households. It’s probably two dogs or 1.2 dogs per household, whatever the number is. The point is, right, so this is one example of how to do this super forecasting. I find it really fascinating because in commercial real estate, we so often get in the weeds, but that back of the envelope is the very first step where we can get some approximation that is,
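(To make that back-of-the-envelope chain concrete, here is a minimal sketch of the three-step Fermi estimate using the rough numbers from the conversation. The two dogs-per-household figures are purely illustrative assumptions that reproduce Michael's 1.6 million and the roughly 5.2 million Spencer cites.)

```python
# A minimal sketch of the Fermi-style estimate discussed above.
# Inputs are the rough numbers from the conversation; the dogs-per-household
# figure is the assumption that moves the answer.

metro_population = 12_000_000          # rough guess for the LA metro
people_per_household = 3               # rough average household size
households = metro_population / people_per_household   # ~4 million households

for dogs_per_household in (0.4, 1.3):
    dogs = households * dogs_per_household
    print(f"{dogs_per_household} dogs per household -> ~{dogs:,.0f} dogs")

# 0.4 dogs per household -> ~1,600,000 dogs (Michael's implicit assumption)
# 1.3 dogs per household -> ~5,200,000 dogs (close to the figure Spencer cites)
```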

Michael Belasco (07:01):

Well, here’s the real question.

Spencer Burton (07:02):

Yeah.

Michael Belasco (07:04):

Are we in the category of those slightly above average or?

Spencer Burton (07:09):

Well, we hope, right? I think all of us are working on that. But yeah. So we can go a couple different directions with this discussion. We can talk about what makes a super forecaster, what are some of the characteristics, some of the techniques.

Sam Carlson (07:29):

Maybe I can jump in here for just a minute. So going back to where I started this podcast, we were driving in the car and you were talking to me about what the core objective of real estate financial modeling is, and you had a very interesting way that you would share this information with a body of people. Could you maybe go there and then talk about just forecasting in general?

Spencer Burton (07:59):

Oh yeah, sure. And I mentioned this on a previous episode, but I’ll get into it a little bit more. So when I speak to students, I describe what we do in commercial real estate. We’re really forecasting the future. And I use the example of a meteorologist because those are forecasts that we look at every single day. So I was flying here, we’re in Washington state just outside of Seattle right now, right? I’m flying from Denver, and there were thunderstorms that night and I was concerned that I wasn’t going to be able to fly out of Denver.

(08:29):

And so I went to the forecast and there was some forecast for when the thunderstorms would be around and what sort of delay that would have. And then I did, by the way, my own forecast of what's the probability that my plane would leave that night or if I'd have to stay the night. Long story short, the weather forecast was wrong. The thunderstorms were supposed to end at 10:00 PM. They didn't. In fact, they got more extreme to the point that there was this wave of planes that were canceled. And I had actually made a forecast that my plane was likely going to be canceled. And so I actually lucked out and booked a hotel and changed my flight about 15 minutes before this mass wave of cancellations. But anyway, so in real estate though, it's a similar process.

Sam Carlson (09:15):

Can I stop you one second?

Spencer Burton (09:15):

Sure.

Sam Carlson (09:15):

Did you actually do a forecast?

Spencer Burton (09:17):

In my mind, yeah. So I'm trying to be better at super forecasting in everything I do, and real estate's one of them. It's like, let me think probabilistically.

Michael Belasco (09:27):

So walk through that process then. When you were forecasting in that moment in time,

Sam Carlson (09:31):

Well, I think he just did. I think he was just walking through what he did. I’m thinking, wait, did you, because you and I were waiting here for him, and I’m thinking, wait, was he actually running a super forecasting analysis on whether or not he was going to miss that plane? And he was. He was doing an analysis on it.

Spencer Burton (09:50):

So here's the analysis I did, and this isn't necessarily right or wrong, but this is the analysis I did. So one of the things I learned from the book, or at least one of the takeaways, and there's a lot of things in the book, is start at the macro. And so when I arrived at the airport, my plane had already been delayed an hour. Now, from experience and some understanding of how these things work, well, first off, even before it got delayed, what I did is I looked at where's my plane coming from? And it was coming from Newark. So I'm in Denver, it's coming from Newark. It's about a four-hour flight. The plane had not taken off yet. Because the plane had not taken off yet, there's a higher probability that the plane doesn't take off and therefore a higher probability that the plane either doesn't arrive or arrives late.

(10:39):

Now, before my flight was delayed, the plane took off. As soon as that happened, I went, okay, there is a lower probability, and see how this is constantly adjusting, there's a lower probability that my flight gets delayed because the plane has already taken off. However, it was scheduled to arrive at our gate two minutes before I was scheduled to board. And even though they hadn't delayed my flight yet, I instantly knew, and I said to my wife, okay, I'm going to be at least 30 minutes delayed. Right. And now, I live an hour from the airport. And so I immediately changed my forecast of when I would likely leave the house.

(11:20):

And sure enough, five, 10 minutes later, I get a ping, your flight's been delayed. So as soon as a flight begins to get delayed, and I coupled that with the fact that there was this forecast for thunderstorms in the region, I went, okay, even though it's saying I'm going to be delayed, I think it was 30 minutes at first, I said, there's a high likelihood that that 30 minutes is going to extend out further. The question is, is that plane going to land in Denver or not? That's the question. Because if it lands in Denver, then I have a higher probability that I'll be able to take off. If the plane doesn't land in Denver. So that was the calculus that was going through my mind. Ultimately, I concluded, okay, I'm going to head to the airport.

(12:04):

As I'm driving to the airport, a ping comes through. Now I'm an hour delayed. I was 30 minutes. And the reason why, I looked, it said that my flight, which was supposed to arrive like 20 minutes late, now was scheduled to arrive an hour late. And as soon as I saw that, I thought, okay, there's some sort of weather pattern that makes their models or their insights conclude that my flight, I don't know, it's flying around some weather patterns or something to that extent. And so this process, when I arrived at the airport, it was still an hour delayed. However, the plane hadn't arrived yet. So I got in line, and then soon after that it said it would be an hour and a half delayed. And I went, well, hold on a second. It's flying across the, so something's going on. Immediately in my mind, I said, there is a relatively high, oh, and the other thing was the weather forecast was saying that the thunderstorms were supposed to clear up by 10:00 PM. This flight from Newark was supposed to land by nine.

(13:05):

And that was with the one-hour delay, which made me go, okay, there's a higher probability now that it's not going to be able to land, which means a higher probability it has to reroute to another city, and if it reroutes to another city, that's immediately now another couple hours of delay. And we're already at nine o'clock at night. I was supposed to take off, I think, at 9:30. So now I'm pushing to 11, 11:30, and there's a couple things going on. Number one, I had a 6:00 AM meeting the next morning. And so the probability of me just simply choosing a different flight was going up and up and up as I continued to recalibrate my thinking as I received new information.

(13:43):

Nevertheless, I got in line, got through security, and as I get through security and I arrive at, anyway, I went to grab a bite to eat. I look and it says that the flight had landed in Grand Junction, Colorado, and soon thereafter it pushed my delay till 12:30. At the same time, we were past 10, and these thunderstorms were not letting up. In fact, they'd gotten much worse. And I looked at the new weather forecast and it had extended. I mean, this isn't rocket science, but what I'm describing is this iterative process where you start at the macro and as you receive more information, you digest that information and you adjust your thinking.
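(A toy illustration of that iterative updating, with made-up numbers: Spencer only mentions landing on a "70-something percent" chance, so the base rate and each nudge below are hypothetical, meant just to show how a probability estimate gets revised as each new signal arrives.)

```python
# A toy illustration (not Spencer's actual figures) of revising a forecast
# as new information arrives: start from a rough base rate, then nudge the
# probability of a serious delay up or down with each signal.

signals = [
    ("base rate for an evening departure out of Denver",              0.20),
    ("inbound plane from Newark has not taken off yet",               +0.15),
    ("inbound plane takes off",                                        -0.10),
    ("inbound scheduled to reach the gate 2 minutes before boarding",  +0.15),
    ("official delay extended from 30 to 60 minutes",                  +0.20),
    ("thunderstorms forecast to last past the arrival window",         +0.15),
]

p_delay = 0.0
for description, delta in signals:
    # clamp so the running estimate stays a valid probability
    p_delay = min(max(p_delay + delta, 0.0), 1.0)
    print(f"{description:<64} P(serious delay) ~ {p_delay:.2f}")

# Ends around 0.75, i.e. the "70-something percent" Spencer describes.
```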

(14:25):

Now, what's interesting, one of the characteristics of a super forecaster, in the book, he describes it as hedgehog thinking versus fox thinking. The hedgehog pops its head up and comes up with one of two conclusions, or actually one of three conclusions: yes, no, or maybe. And too many of us live in that black and white. It's either/or, and if I don't know, it's maybe: 0%, 100%, or 50%. What super forecasters do is live in the 60% probability, 75% probability, 90% probability. Super forecasters think in degrees. So rather than 70%, they think 73%. You go, okay, that's false precision. That may be true, but they're at least trying to be more precise based on the information that they're receiving, right?

Sam Carlson (15:25):

So not yes, no, maybe. Not yes, no, maybe. It’s what instead then?

Spencer Burton (15:32):

It is maybe, but higher probability of yes than maybe, right? It's running on a spectrum rather than,

Sam Carlson (15:41):

Ah, got it.

Michael Belasco (15:41):

So,

Sam Carlson (15:41):

Okay.

Michael Belasco (15:45):

So all of this analysis and this identification of what we're calling a super forecaster, from what you said, has created a marginally improved identification profile, I guess, of a person who can predict the future better than the monkey throwing a dart and better than the average person. Is it statistically significant and worth the work? They said, okay, well, this is marginally better. We have these conversations a lot about the amount of data and the amount of things you ingest, and how much it actually improves the decision making. I heard marginal, and I'm curious to hear more about that marginal difference and what the real value is in that marginal improvement.

Spencer Burton (16:34):

So the marginal first was referring to the first study they did, where they concluded that the experts weren't much better than the chimpanzee. There was a subset of those experts that was better to some degree that was significant. Okay.

Michael Belasco (16:55):

Okay.

Spencer Burton (16:55):

I think it was 30%, but I might be misspeaking, marginally better. But then they said, what if, is it possible, and again, this was just a group of intelligence analysts, what if you broadened it to a larger set of people and gave them the same questions, but not the top secret information that the intelligence analysts had, just publicly available information? 1,000 individuals were in this latter study. And there was a small group of them that was significantly better than the chimpanzee. And so absolutely. Now it's not perfect by any stretch, but they were asked questions like what is the probability? Remember Gaddafi?

Michael Belasco (17:42):

Yeah. Yeah. Libya.

Spencer Burton (17:44):

Libya. So he apparently had some toxin in his blood, and,

Michael Belasco (17:44):

Oh, yeah.

Spencer Burton (17:50):

Now I'm out over my skis on this, but he apparently had some toxin in his blood. But before they knew that, they asked these super forecasters, by X date, will it be discovered that Gaddafi has this chemical in his body? And so the first question is, are they even going to test for it? Right.

Michael Belasco (17:50):

Yeah.

Spencer Burton (18:13):

So if you’re going to do that sort of probability in order to find something, you have to search for it. And then if they do search for it, what’s the probability that he has it? And then there’s a question around, and again, the question is will they find it, even if he had it after some period of time, will they be able to find it? And if they do find it, what does it take for him to have been poisoned with this thing? And so those are the sorts of forecasts that they were given or by X, Y, Z date will a coup happen in this country?

Michael Belasco (18:46):

And then they were given a set of information, maybe not all like,

Spencer Burton (18:49):

No, just a question.

Michael Belasco (18:50):

Really?

Spencer Burton (18:51):

And then what they had at their disposal was the internet,

Michael Belasco (18:53):

Okay.

Spencer Burton (18:53):

Books and anything that they could search and find. And there was this group of people that were exceptional at forecasting these seemingly, how many dogs are there in Los Angeles sort of questions. And as it relates to commercial real estate, I mean, you think about what we do, and now coming back to Sam’s point about why we’re into this, is like I always talk about what we do in commercial real estate is forecast the future. The better we are at forecasting the future, the better investors we are. And so how can we become super forecasters?

Michael Belasco (19:35):

So is it about, it seems to be around any question. It’s not a setup. And I’m assuming here, and I’m curious to get clarification, it’s about creating some sort of framework to then be able to hone in and have a fluid gauge around a certain outcome. And just, it’s almost like you’re guiding in a plane in a way.

Sam Carlson (19:59):

Well, can I jump in here real quick? Only because, again, going back to our car ride, you gave me like an interesting metaphor, well, it wasn't a metaphor, it was just an example. So today I could walk out this morning and it was beautiful and sunny. I could observe that. Right. Chances of it being beautiful and sunny at noon are probably good. The further out we go, the harder it gets to predict the future. So you were using an example like that. So how did that correlate to financial modeling and being able to forecast the future?

Spencer Burton (20:41):

Yeah, so and that’s absolutely true. If you’re a meteorologist and you’re forecasting the temperature 10 days from now, the confidence that you have in that forecast will be much lower than if you’re forecasting the temperature 10 seconds from now, right? I mean, 10 seconds from now, you have an almost perfect forecast. And so it is in commercial real estate. I mean, imagine I said this to you, Michael, as we were talking about this subject. Imagine if with perfect confidence, you could say this building will be occupied by this tenant and that tenant will pay rent for the next 100 years. What does that mean? That means that your risk is effectively zero. And if your risk is zero, you’re willing to take a lower yield for that string of cash flows, or put another way, lower yield means you’re willing to pay more for that building.

(21:38):

Or if you had that knowledge and the market didn't have that knowledge, you could pay less than you otherwise would've paid, and therefore you get a higher risk-adjusted return. Your return is higher relative to the risk that you'd have because you had that ability to forecast. And I say with perfection, there's no such thing. But in this context, in this example, with perfection, you know something about that building that gives you that sort of confidence. Your risk is lower than what the market is applying to it, and therefore you are getting a better risk-adjusted return. And so yeah, some of it is timing based, Sam, some of it is, but again, as it relates to commercial real estate, the better we are at it, the better our investment returns, the better our investment results.
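(A quick, purely hypothetical illustration of "lower yield means you're willing to pay more": hold the forecast income stream fixed and vary the yield a buyer requires, and the price moves in the opposite direction. The dollar figures are invented for the example.)

```python
# Hypothetical illustration: the more confident you are in the cash flows,
# the lower the yield (cap rate) you will accept, and the more you will pay
# for the same income stream.

annual_noi = 1_000_000   # hypothetical stabilized net operating income, $/year

for cap_rate in (0.07, 0.06, 0.05):
    price = annual_noi / cap_rate
    print(f"required yield {cap_rate:.0%} -> price ~ ${price:,.0f}")

# required yield 7% -> price ~ $14,285,714
# required yield 6% -> price ~ $16,666,667
# required yield 5% -> price ~ $20,000,000
```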

Sam Carlson (22:30):

And that's what your job is. That's exactly it, the better you are at forecasting cash flows, the better overall you will be in commercial real estate.

Spencer Burton (22:42):

If you're investing in commercial real estate, whether it's as a developer, in an investment acquisitions sort of role, I mean, whatever it may be. Yes, absolutely.

Sam Carlson (22:50):

Okay.

Spencer Burton (22:51):

If you're a developer, you're forecasting what your costs are going to be. You're forecasting what your rents are going to be. You're forecasting what your operating expenses are going to be. You're forecasting what your exit cap rate's going to be. I mean, all of those are forecasts. And the more accurate your forecast is, the more likely you're going to make a return. And the more confidence you have in your forecast, theoretically, the lower the risk. How do you create confidence in a development context? A GMP, a guaranteed maximum price contract, so you have some confidence in what your cost is. A signed lease from an investment-grade tenant, so you have confidence in what your rent is, and ideally absolute triple net, so you have confidence in what your OpEx is. And so in that context, the only real input that you're forecasting now is your exit cap rate. Right. And so that's where your variability is.
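(To sketch that last point with made-up numbers: if cost is locked by a GMP and income is locked by a signed absolute triple-net lease, the exit cap rate is essentially the one forecast left, and the projected profit swings entirely with that single assumption. Everything below is illustrative, not a real deal.)

```python
# Hypothetical development back-of-the-envelope: cost and NOI are "locked,"
# so the only forecast input left is the exit cap rate, and the profit
# estimate moves one-for-one with that assumption.

total_cost = 20_000_000        # guaranteed maximum price plus other locked-in costs
stabilized_noi = 1_500_000     # fixed by the signed absolute triple-net lease

for exit_cap in (0.065, 0.060, 0.055):
    exit_value = stabilized_noi / exit_cap
    profit = exit_value - total_cost
    print(f"exit cap {exit_cap:.1%}: exit value ~ ${exit_value:,.0f}, profit ~ ${profit:,.0f}")

# exit cap 6.5%: exit value ~ $23,076,923, profit ~ $3,076,923
# exit cap 6.0%: exit value ~ $25,000,000, profit ~ $5,000,000
# exit cap 5.5%: exit value ~ $27,272,727, profit ~ $7,272,727
```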

Michael Belasco (23:40):

Your materials are all sourced. You have a high quality development team.

Spencer Burton (23:44):

Yeah. No, exactly. Yeah.

Michael Belasco (23:46):

And those are things that reduce your risk and help you get more confident.

Spencer Burton (23:50):

And if you think about reducing risk, what you’re really describing when you say reducing risk is you’re increasing the confidence in your forecast.

Sam Carlson (24:01):

That’s interesting. I mean how do you get better? So is this just all, how do we super forecast then? What is the actionable content here?

Spencer Burton (24:13):

I mean, there's a lot more, we don't have time to get into all of it, but there are certain characteristics. So some of it is, you asked this, Michael, like is this a framework that you just simply apply and you're done? And a framework is part of it. Right. So there are several frameworks. We talked about the Fermi method where you start big and you kind of compress down. Another one is called triage. So triage refers to a hospital, or in a war environment you triage, meaning you solve the problems that are easiest to solve. And those that are more difficult to solve, unfortunately, you let those pass by. And so in the context of war, that means people who are going to die, you treat last. People who are most likely to be saved, you treat first. It actually reminds me of the methodology when you're taking multiple choice tests. Right.

Michael Belasco (25:07):

A timed multiple choice test. Yeah.

Spencer Burton (25:08):

A timed multiple choice test. If you don't know the answer, you do your best to guess an answer that's closest. And then you move on and you focus, and this was like a GMAT strategy, right, you focus on the questions that you know you can answer. And so it is with super forecasting. Right. So if you have 10 inputs and there's one input you have no idea about, your time spent on that input is less valuable than the time spent on other inputs that you can get closer to. So that's the idea of triage. There's also, so some of it is, call it methodology. Another part is simply the characteristics of a super forecaster. And I think all of these characteristics can be learned. So one of them would be being pragmatic. Super forecasters are not wedded to one idea. They don't say, okay, this is a good market and that's a bad market. They're open to the possibility that a market was bad and now is good. It's so interesting how many people I've run into in my career who have these embedded biases around markets or property types or strategies or tenants.

Michael Belasco (26:21):

It's like, and I forget how the saying goes, it's like you have a belief on Tuesday, the world changes on Wednesday, and you keep the same belief on Thursday, regardless of what happened on Wednesday. Yeah, there are many people like that out there.

Spencer Burton (26:36):

Yeah.

Michael Belasco (26:37):

An unwillingness to take in new information and reevaluate a position, which is unfortunately a problem for a lot of people so.

Spencer Burton (26:47):

Yeah. And another characteristic is being analytical. Right. So rather than having a tip-of-the-nose perspective, in the book they talk about the cable news prognosticators. Right. So these are the people who make a living going on TV and making big predictions that never come true. And the irony is no one ever holds them to account. They make these big predictions. They don't ever come true. They're on CNBC and they're saying this or that, and no one ever holds them to it. But the reason why they're popular is because they make big predictions with such confidence. They're unwavering. Everything can change, and yet they're still stuck to their prediction. There's a particular person, Sam knows this person, for whom it's like the world is always falling and you always need to be buying gold, right?

Sam Carlson (27:44):

Yeah, exactly.

Spencer Burton (27:46):

And it’s like, okay, well, that sort of person’s right about once a decade.

Sam Carlson (27:50):

Yeah.

Spencer Burton (27:50):

And the rest of the time you kind of go, huh. But he was really right going into 2008, and therefore everyone,

Sam Carlson (27:56):

Oh, he was really right. And because he was so really right then as he continues to preach the same things, someday he’ll be really right again and he’ll look like a genius. If you look like a genius at least once, that has a long tailwind behind it, apparently.

Spencer Burton (28:14):

Yeah. And so part of it's being analytical, being willing to consider other views. Part of it's being humble, recognizing that your view may not always be right. Being cautious. So this was one that was really interesting. The idea of cautious isn't that you're afraid to take risks. Cautious is recognizing that everything you know is not certain. It can change. So just because the plane that I was supposed to take was scheduled to arrive in Denver, there's a possibility that it wouldn't. And being open to that possibility.

Sam Carlson (28:56):

I feel like we should tell people that you didn’t make it.

Spencer Burton (28:59):

I didn’t make it.

Sam Carlson (28:59):

He missed his flight.

Spencer Burton (29:02):

Well, what I ended up doing, and I don't mean to toot my own horn, I think a lot of it is luck too, right? It's like, well, it's not so much luck. You get your probability towards an end and then you take the chance. So I was thinking there's a 70-something percent chance that my flight's going to be delayed, and I don't want to stay up too late.

Michael Belasco (29:22):

And on top of that what’s the, and then you also get to what’s the benefit at the end of it.

Spencer Burton (29:25):

Cost benefit.

Michael Belasco (29:25):

Right.

Spencer Burton (29:25):

Yeah.

Michael Belasco (29:26):

It’s like, okay, the benefit now is I show up even later. And so, okay, even if the flight takes off and I switch, the benefit’s probably worth it now at this point. So let’s weigh that out and just,

Spencer Burton (29:38):

Well, and the cool thing that happened was, so I decided to change my flight about 15 minutes before that wave of cancellations. What that meant is I got a seat on the plane that I wanted and the seat that I wanted.

Michael Belasco (29:51):

Right.

Spencer Burton (29:53):

And I booked a hotel. And it's funny, when I got to the hotel, I got to the front desk and I hadn't actually realized kind of what had happened. And the woman was like, oh, you lucked out. We just sold out. I'm like, what happened? And she's like, oh yeah, we had 100 rooms and in 60 seconds, all 100 rooms were booked up.

Michael Belasco (30:12):

Wow.

Spencer Burton (30:13):

And that's because there was this wave of cancellations that instantly hit. I had just made it, just, again, a good part of it is luck, but some of it is this philosophy of forecasting that worked in my favor. And I checked in, went out to my car, parked my car, and when I came back in, this big bus from the airport had arrived and the lobby was full of people. I mean, it was quite fortunate so.

Michael Belasco (30:38):

Wow.

Sam Carlson (30:40):

Well, I can say Spencer is actually like this. He does do this. This is his real behavior, which we absolutely love. He uses super forecasting in his daily life, and I think it's phenomenal. And to be better, you're getting better not only at modeling and forecasting the future, but you're also using it to save yourself some flight logistics and travel time. So I think that's great.

Spencer Burton (31:07):

Yeah, highly recommend the book, Superforecasting. Go check it out. Philip Tetlock, Dan Gardner, phenomenal book. Brandon Talman, who's a data scientist at Stablewood, it's his favorite book. He recommended it to me and it's now my favorite book. It's fascinating. So go check it out. And I think if you use it in commercial real estate, it really makes you a better professional, a better investor.

Sam Carlson (31:37):

Awesome. All right guys, thanks for watching. We’ll see you on the next episode.