Bayes' Rule is a formalization of how to change your mind when you learn new information about the world or have new experiences.
Transcript -- I'd like to introduce you to a particularly powerful paradigm for thinking called Bayes' Rule. Back in the Second World War the then-governor of California, Earl Warren, believed that Japanese Americans constituted a grave threat to our national security. And as he was testifying as much to Congress, someone brought up the fact that, you know, we haven't seen any signs of subterfuge from the Japanese American community. And Warren responded that, "Ah, this makes me even more suspicious. This is an even more ominous sign, because it indicates that they're probably planning some major secret attack, timed à la Pearl Harbor. And this convinces me even more that the Japanese Americans are a threat."
So this pattern of reasoning is what sustains most conspiracy theories. You see signs of a cover-up -- well, that just proves that I was right all along about the cover-up. You don't see signs of a cover-up -- well, that just proves that the cover-up runs even deeper than we previously suspected.
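One way to make the problem with this "heads I win, tails I win" pattern precise (this step isn't spelled out in the talk, but it follows directly from the probability axioms) is to note that your current confidence in a hypothesis H is an average of what it would become after each possible observation E:

\[
P(H) = P(H \mid E)\,P(E) + P(H \mid \neg E)\,P(\neg E)
\]

So seeing signs of a cover-up and seeing no signs of a cover-up cannot both push your confidence above where it already was; if one observation would raise it, the other has to lower it.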
Bayes' Rule is probably the best way to think about evidence. In other words, Bayes' Rule is a formalization of how to change your mind when you learn new information about the world or have new experiences. And I don't think the math of Bayes' Rule is crucial to getting benefit out of it in your own reasoning or decision making. In fact, there are plenty of people who use Bayes' Rule on a daily basis in their jobs -- statisticians and scientists, for example. But then when they leave the lab and go home, they think like non-Bayesians just like the rest of us.
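For reference, the math being set aside here is a single equation. Bayes' Rule says that your confidence in a hypothesis H after seeing evidence E is your prior confidence, scaled by how strongly the hypothesis predicts that evidence:

\[
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}
\]

Here P(H) is your confidence before the evidence, P(E | H) is how likely the evidence would be if the hypothesis were true, and P(E) is how likely the evidence is overall.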
So what's really important is internalizing the intuitions behind Bayes' Rule and some of the general reasoning principles that fall out of the math, and being able to use those principles in your own reasoning.
After you've been steeped in Bayes' Rule for a little while, it starts to produce some fundamental changes in your thinking. For example, you become much more aware that your beliefs are grayscale, not black and white: you hold your beliefs about how the world works with levels of confidence that are less than one hundred percent but greater than zero percent. And even more importantly, as you go through the world and encounter new ideas and new evidence, that level of confidence fluctuates, rising and falling with the evidence for and against your beliefs.
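As a concrete illustration of that fluctuation (a minimal sketch, not part of the talk; the probabilities are invented for the example), here is how a "grayscale" confidence level moves as evidence arrives, in Python:

```python
def update(prior, p_obs_if_true, p_obs_if_false):
    """Bayes' Rule: return the new confidence in a hypothesis after one observation."""
    numerator = p_obs_if_true * prior
    return numerator / (numerator + p_obs_if_false * (1 - prior))

confidence = 0.5  # start unsure: neither zero nor one hundred percent
# Each pair: (P(observation | hypothesis true), P(observation | hypothesis false))
observations = [(0.8, 0.3), (0.6, 0.5), (0.2, 0.7)]
for p_true, p_false in observations:
    confidence = update(confidence, p_true, p_false)
    print(f"confidence is now {confidence:.2f}")
# Prints roughly 0.73, 0.76, 0.48: confidence rises with supporting evidence
# and falls with contrary evidence, but never snaps to zero or one.
```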
Also I think that many people, certainly including myself, have a default way of approaching the world: we have our preexisting beliefs, and as we go through the world we pretty much stick to them unless we encounter evidence so overwhelmingly inconsistent with those beliefs that it forces us to change our minds and adopt a new theory of how the world works. And sometimes even then we don't do it.
So the implicit question that I'm asking myself, that people ask themselves as they go through the world, is: when I see new evidence, can it be explained by my theory? And if yes, then we stop there. But after you've gained some familiarity with Bayes' Rule, what you start doing is, instead of stopping after asking yourself whether this evidence can be explained by your own pet theory, you also ask: well, would it be explained better by some other theory, or maybe just as well by some other theory? Is this actually evidence for my theory?
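That second question has a direct formal counterpart: evidence only counts in favor of your theory to the extent that your theory predicts it more strongly than the alternatives do. A small sketch of the comparison (again illustrative, not from the talk), using the odds form of Bayes' Rule in Python:

```python
def posterior_odds(prior_odds, p_evidence_given_mine, p_evidence_given_rival):
    """Odds form of Bayes' Rule: posterior odds = prior odds times the likelihood ratio."""
    return prior_odds * (p_evidence_given_mine / p_evidence_given_rival)

# If my pet theory and the rival theory explain the evidence equally well,
# the likelihood ratio is 1 and my confidence should not move at all.
print(posterior_odds(2.0, p_evidence_given_mine=0.7, p_evidence_given_rival=0.7))  # 2.0
# The evidence favors my theory only when it is more likely under mine than under the rival's.
print(posterior_odds(2.0, p_evidence_given_mine=0.7, p_evidence_given_rival=0.1))  # approx. 14.0
```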
Produced/Directed by Jonathan Fowler and Dillon Fitton