Navigating Uncertainty – Views from a Road Trip
Aug 14, 2023
The Bay Area's Rule and Bayes' Rule
Alt title: Life Advice from a Bayesian Lens
There are a lot of things to love about California – public transit isn't one of them. The first rule when trying to get around the Bay Area? Get a car. This meant that when my family visited here from overseas for a holiday this past spring, someone would have to drive. And since I was the only one with a US license, that someone would have to be me. There was just one speed bump – at that point in my twenties, I had driven a cumulative fifty miles on American roads over the three lessons I needed before my singular (successful, mind you) attempt at the DMV behind-the-wheel test. It would be one thing to drive within Palo Alto – it would be another entirely to traverse the distance from Napa to LA with a layover in Yosemite.
Naturally, no part of me really believed I was up to the task.
Beliefs we hold are a lot like hypotheses in data science. In fact, how much we believe something is nothing but our assessment of that something’s probability. And there is no subject on which we hold more beliefs than ourselves. I am a good person. I am possibly smart. I can’t do this.
So where do our beliefs come from? Many we are raised with. I’ve got to go to college.
A number we discover as we go along. I’m good at math.
At some stage, as we experience fewer novel things, it feels like we have crystallized our beliefs, especially about ourselves. I’m not a person who drives.
Beliefs are how we make decisions in an uncertain world.
When it comes to reasoning with uncertainty, Bayesian inference introduces two helpful concepts. There is the prior, what Wikipedia calls “an estimate of the probability of a given hypothesis before data is observed”. This is a lot like the innate beliefs we hold, such as my high-school conviction that going to college was good (despite not yet having gone). The more interesting concept is the posterior, which is an updated assessment of our conviction in a hypothesis based on data we observe. Having gone to college, how good was it after all? Quite good, made friends and learned some things, 9/10, would recommend.
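(For the statistically curious: here's roughly what that update looks like in miniature. This is a toy sketch of my own – the Beta prior and the "nine good semesters" data are numbers I've invented purely to show the mechanics.)

```python
# A toy Bayesian update on the belief "going to college is good".
# All numbers below are invented purely for illustration.

prior_a, prior_b = 2, 2        # Beta(2, 2): mild optimism, wide uncertainty
good, rough = 9, 1             # "observed data": 9 good semesters, 1 rough one

prior_mean = prior_a / (prior_a + prior_b)                              # 0.50
posterior_mean = (prior_a + good) / (prior_a + good + prior_b + rough)  # ~0.79

print(f"prior belief:     {prior_mean:.2f}")
print(f"posterior belief: {posterior_mean:.2f}")
```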
Being Bayesian: overcoming self-limiting beliefs with action
In the real world, a lot of "observed data" comes down to doing things: talking to people, trying new experiences, even having old experiences anew. And just like with statistics, the certainty afforded by this "observed data" scales with sample size. The more evidence we have of something, the greater its impact on the posterior, no matter what prior we began with.
In other words, no matter the level of belief in yourself as a surfer, the more often you successfully ride a wave when you surf, the more it shapes your eventual conviction in your surfing identity. You hear about this from the power-suit-clad executives at TED conferences talking about a growth mindset and the Patagonia-clad serial founders talking about why you should bias to action. To me, this always felt a little wishy-washy, especially having come from academic backgrounds that push you to critically reason – and thus accumulate information and conviction – before you act. But when expressed in a Bayesian manner, it's undeniable that the inverse works too: the actions we take can not just influence but in fact update our preconceived convictions, or priors. Especially about ourselves.
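To see the sample-size point in numbers, here's a quick sketch with two imaginary surfers who start from wildly different priors about themselves and then observe the same sessions – every figure below is invented:

```python
# Two invented surfers with very different priors on "I can ride a wave":
# a skeptic at roughly 10% (Beta(1, 9)) and an optimist at roughly 90% (Beta(9, 1)).
# Both update on the same observed sessions: waves ridden vs. wipeouts.

def posterior_mean(prior_a, prior_b, rides, wipeouts):
    return (prior_a + rides) / (prior_a + rides + prior_b + wipeouts)

for rides, wipeouts in [(0, 0), (8, 2), (80, 20), (800, 200)]:
    skeptic = posterior_mean(1, 9, rides, wipeouts)
    optimist = posterior_mean(9, 1, rides, wipeouts)
    print(f"after {rides + wipeouts:4d} waves: skeptic {skeptic:.2f}, optimist {optimist:.2f}")
```

A few hundred waves in, the skeptic and the optimist essentially agree. The data drowns out the prior.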
With three weeks to go, and seemingly no way out with their tickets already booked, it was clear no amount of DMV mastery would help me. I decided to do what I could: I signed up for lessons to accumulate some miles on the road. The lessons didn’t make me better overnight, but they did nudge my beliefs from no way to somewhere closer to maaaybe, but this is still insane.
We all come to life with different priors. That’s your baseline – it’s what you got. Now the actions you take over the rest of your life give you the opportunity to create a posterior that better reflects who you want to be. And just like with Bayesian statistics, the more actions you take, the more evidence you’re able to collect and the more influence you’re able to exert on shaping your posterior.
And so it was with the road trip. Each mile on the road shifted how good I thought the idea was. What seemed like a preposterous idea three weeks out slowly became more realistic as my conviction behind the wheel grew.
The grain of truth problem: giving yourself a chance
In statistics as in life, a Bayesian approach can fall apart. In the former, this happens when your prior assesses a specific outcome as impossible. If it does, famously, no amount of observed data can shift your posterior belief on that outcome away from the prior. If your prior doesn't contain a so-called grain of truth, no amount of data will help your posterior capture it.
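In toy form, Bayes' rule makes the problem plain: the posterior multiplies the prior by how well the hypothesis explains the data, so a prior of exactly zero has nowhere to go. (The likelihoods below are, as usual, made up.)

```python
# Bayes' rule for a single hypothesis H and some data D:
#   P(H | D) = P(D | H) * P(H) / P(D)
# A prior of exactly 0 yields a posterior of exactly 0, whatever the data says.

def posterior(prior_h, likelihood_if_h, likelihood_if_not_h):
    evidence = likelihood_if_h * prior_h + likelihood_if_not_h * (1 - prior_h)
    return likelihood_if_h * prior_h / evidence

print(posterior(0.00, 0.99, 0.01))   # 0.0 – no grain of truth, no update
print(posterior(0.01, 0.99, 0.01))   # 0.5 – a sliver of belief goes a long way
```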
In the real world, this sounds a lot like not allowing yourself to believe something, even a little bit. People from a different political persuasion might be good. Maybe I can finish a triathlon.
In life as in statistics, having just a small amount of belief in something is essential; without that ounce of belief, no amount of action can help. On the other hand, if you have some belief – no matter how small – that you can trust yourself, that you are competent, then if you find enough data, or take enough action, to back that belief, your posterior will shift to capture it.
This applies to other people too. If you’re never willing to give someone a chance (especially because of who they are or what their identity is), no matter what they do, it won’t be enough.
When it comes to ourselves, this tells me that we should choose to believe the good things about ourselves, even if we can’t find immense conviction in those beliefs. We should choose to believe that we are good. That we are smart. That we can drive on those California highways.
And then in statistics as in life, we should take action to eventually help us find enough conviction in those beliefs.
Misjudging expectations: focusing on what you love
An optional extension, if I haven't yet put you to sleep: when making choices, we're often influenced by the outcomes we expect. This leads us to expected values, another bedrock of probability. Expected values give a sense of what to expect by combining how likely a given event is with the value of the event itself. I love going out with good company, and because it's likely I'll enjoy the company of my friends, I expect to have a great night out. This would be the expected utility of a night out with friends.
But in a world of shifting probabilities, using priors to estimate our expected values can lead us astray. Suppose I think founding a company would be very fulfilling but, having never done it before, I believe I'm unlikely to succeed. That combination might lead me to conclude it's a terrible idea to found a company, all things considered. But in the real world, every new day for a first-time founder is an opportunity to take actions ("collect observations") that influence their own perceptions (posterior belief). Somewhere along the way, a posterior might shift enough to lead you to a wildly different expected (or realized) value – one that turns an endeavor that looked dubious to begin with into one extremely likely to be fulfilling.
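As a back-of-the-envelope sketch – with payoffs and probabilities I've invented purely for illustration – expected utility is just probability times payoff, and it swings wildly once action starts moving the probability:

```python
# Expected utility = P(success) * payoff(success) + P(failure) * payoff(failure)
# All numbers below are invented for illustration.

def expected_utility(p_success, payoff_success=10.0, payoff_failure=-2.0):
    return p_success * payoff_success + (1 - p_success) * payoff_failure

print(expected_utility(0.10))   # -0.8 – "terrible idea", says the pessimistic prior
print(expected_utility(0.50))   #  4.0 – a year of evidence later, maybe not
```

Notice, too, that as the probability of success creeps toward 1, the expected utility approaches the payoff itself.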
Tantalizingly, if you back yourself to get better at something, the expected joy from the endeavor eventually shifts closer to how much you purely cherish it, regardless of how able you are today. This sounds a lot like the advice belabored at too many college graduations – do what you love.
So am I sold on the hokey advice from Linkedinfluencers about biasing to action, believing in yourself, and doing what you love? I’m a data scientist, so I won’t say for sure. But I can tell you that the data from my California road trip suggests we get more mileage from our actions than our beliefs.