Nov 26 JDN 2460275
The basic precepts of reason seem obvious and irrefutable:
Believe what’s most likely to be true.
Do what’s most likely to work.
How are you going to argue with that? In fact, it seems like by the time you try to argue at all, you’ve already agreed to it. These principles may be undeniable—literally impossible to coherently deny.
Even when expressed a little more precisely, the principles of reason still seem pretty obvious:
Beliefs should be consistent with each other and with observations.
The best action is the one with the best expected outcome.
And you really can get surprisingly far with this. A few more steps of mathematical precision, and you basically get the scientific method and utilitarianism:
Beliefs should be assigned consistent Bayesian probabilities according to the observed evidence.
The best action is the one that maximizes expected utility.
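Both precepts fit in a few lines of code. Here is a toy sketch in Python, with entirely made-up numbers: a Bayesian update of one belief given one piece of evidence, followed by choosing the action with the highest expected utility.

```python
# Toy illustration (hypothetical numbers): Bayes' rule, then
# expected-utility maximization.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Belief: "it rained last night" (prior 30%); evidence: the grass is wet.
posterior = bayes_update(prior=0.30,
                         p_evidence_if_true=0.9,
                         p_evidence_if_false=0.2)

# Each action maps to possible outcomes as (probability, utility) pairs.
actions = {
    "carry umbrella": [(1.0, 8)],  # mildly inconvenient either way
    "no umbrella": [(posterior, -20), (1 - posterior, 10)],  # soaked vs. free hands
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

# The "best action" precept, literally: pick the max of expected utility.
best = max(actions, key=lambda a: expected_utility(actions[a]))
```

The numbers are arbitrary, but the structure is the whole point: beliefs are probabilities updated by evidence, and decisions are comparisons of probability-weighted outcomes.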
Why, then, did it take humanity 99.9% of its existence to figure this out? Why did a species that has lived for 300,000 years only really start getting this right in about the past 300?
In fact, even today, while most people would at least assent to the basic notion of rationality, a large number don’t follow it well, and only a small fraction understand it at the deepest level.
Reason just seems obvious if you think about it. How do so many people miss it?
Because most people really don’t think about it that much.
In fact, I’m going to make a stronger claim:
Most people don’t think about anything that much.
Remember: To a first approximation, all human behavior is social norms.
Most human beings go through most of their lives behaving according to habits and social norms that they may not even be consciously aware of. They do things the way they were always done; they believe what those around them believe. They adopt the religion of their parents, cheer for the sports team of their hometown, vote for the political party that is popular in their community. They may not even register these things as decisions at all—they simply did not consider the alternatives.
It’s not that they are incapable of thinking. When they really need to think hard about something, they can do it. But hard thinking is, well, hard. It’s difficult; it’s uncomfortable; for most people, it’s unfamiliar. So, they avoid it when they can. (There is even a kind of meta-rationality in that: Behavioral economists call it rational inattention.)
Few would willingly assent to the claim “I believe a lot of things that aren’t true.” People generally believe that their beliefs are true.
I doubt even most people in ancient history would agree with a statement like that. People who wholeheartedly believed in witches, werewolves, ghosts, and sympathetic magic still believed that their beliefs were true. People who thought that a giant beetle rolled the sun across the sky still thought they had a good handle on how the world works.
In fact, the few people I know who would agree with a statement like that are very honest, introspective Bayesians who recognize that the joint probability of all their beliefs being true must be quite small. Agreeing that some of your beliefs are false is a sign not that you are irrational, but that you are extremely rational. (In fact, I would agree with a statement like that: If I knew what I’m wrong about, I’d change my belief; but odds are, I’m wrong about something.)
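The arithmetic behind that joint-probability point is worth seeing. Suppose, hypothetically, that you hold a thousand roughly independent beliefs and are 99% confident in each one:

```python
# If you hold 1,000 roughly independent beliefs, each 99% likely to be
# true, the chance that *all* of them are true is vanishingly small.
n_beliefs = 1000
confidence = 0.99

p_all_true = confidence ** n_beliefs          # about 0.00004, i.e. 0.004%
p_at_least_one_false = 1 - p_all_true
```

So a well-calibrated reasoner should be nearly certain that some of their beliefs are false, even while holding each individual belief with high confidence and not knowing which one is wrong.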
But most people simply don’t even bother to evaluate the truth of many of their beliefs. If something is easy to check and directly affects their lives, they’ll probably try to gather evidence for it. But if it’s at all abstract or difficult to evaluate, they’ll more or less give up and believe whatever seems to be popular. (This explains Carlin’s dictum: “Tell people there’s an invisible man in the sky who created the universe, and the vast majority will believe you. Tell them the paint is wet, and they have to touch it to be sure.”)
This can also help to explain why so many people—mostly, but not exclusively right-wing people—complain that scientists are “elitist” while worshipping at the feet of clergy and business executives (the latter only—so far—figuratively, but the former all too literally).
What could be more elitist than clergy? They are basically claiming a special, unique connection to the ultimate truths of the universe that is only accessible to them. They claim to be ordained by the all-powerful ruler of the universe with the absolute right to adjudicate all truth and morality.
For goodness’ sake, one of the most popular and powerful ones literally claims to be infallible.
Meanwhile, basically all scientists agree that anyone who is reasonably smart and willing to work hard, either making their own observations, running their own experiments, or just reading the work of a lot of other people’s observations and experiments, can become a scientist. Some scientists are arrogant or condescending, but as an institution and culture, science is fundamentally egalitarian.
No, what people are objecting to among scientists is not elitism. Part of it may be the condescension of telling people: “This is obvious. If you thought about it, you would see that it has to be right.”
Yet the reason we keep saying that is… it is basically true. The precepts of rationality are obvious if you think about them, and they do lead quite directly to rejecting a lot of mainstream beliefs, particularly about religion. I’m sure it feels insulting to be told that you just aren’t thinking hard enough about important things… but maybe you aren’t?
We may need to find a gentler way to convey this message. There’s no point in saying it if nobody is going to listen. Yet that doesn’t make it any less true.
It’s not that quantum mechanics is intuitively obvious (to say “quite the opposite” would still be a terrible understatement), nor even that Darwinian natural selection or comparative advantage are obvious (though surely they’re less counter-intuitive than quantum mechanics). The conclusions of science are not obvious. They took centuries to figure out for good reason.
But the principles of science really are: Want to know if something is true? Look! Find out!
Yet historically this has not in fact been how human beings formed most of their beliefs. Indeed, I am often awed by just how bad most people throughout history have been at thinking empirically.
It’s not just that people throughout history believed in witches without ever having seen one, or knowing anyone who had seen one. (I’ve never seen a platypus or a quasar, and I still believe in them.) It’s that they were willing to execute people for being witches—killing people as punishment for deeds that not only they did not do, but could not possibly have done. Entire civilizations for millennia failed to realize that this was wrong.
Aristotle believed that men’s body temperature was hotter than women’s, and that this temperature difference determined the sex of children. That’s Aristotle, a certifiable genius living in the culture that pioneered rationalist philosophy. (Ironically—and by pure Stopped Clock Principle—he’d almost be right about certain species of reptiles.) It never occurred to him to even try to measure the body temperatures of lots of people and see if this was true. (Admittedly, they didn’t have thermometers back then.)
Aristotle did get a lot of things right: In particular, his trichotomy of souls is basically accurate, with “vegetative soul” renamed “homeostatic metabolism and reproduction”, “sensitive soul” renamed “limbic system”, and “rational soul” renamed “prefrontal cortex”. The vegetative soul is what makes you alive, the sensitive soul is what makes you sentient, and the rational soul is what makes you a person. He even recognized a deep truth that the majority of human beings today do not: The soul is a function of the body, and dies when the body dies. For his time, he was absolutely off the charts in rationality. But even he didn’t really integrate rationality and empiricism fully into his way of thinking.
Even today there are a shocking number of common misconceptions that could be easily refuted by anyone who thought to check (or look it up!):
Wolves howl at the full moon? Nope, wolves don’t care about the phase of the moon, and if you live near any, you’ll hear them howl all year round. Actually, wolf howling is more like the “Twilight Bark” from 101 Dalmatians; it’s a long-distance communication and coordination signal.
Eggs can only balance on the equinox? Nope, it’s tricky, but you can balance an egg just as well any day of the year.
You don’t lose most of your heat through your head: Try going outside in the cold wearing a t-shirt and shorts with a hat, and then again with snow pants and a heavy coat and no hat; you’ll see which feels colder.
“Beer before liquor, never sicker” is nonsense: It matters how much alcohol you drink (and how much you eat), not what order you do it in, and you’d know that if you just tried it both ways a few times.
Taste on your tongue is localized to particular areas? No, it’s not, and you can tell by putting foods with strong flavors on different parts of your tongue. (Indeed, I did when they did that demonstration in elementary school; I wondered if that meant my tongue was somehow weird.)
I can understand not wanting to take the risk with fan death yourself, but maybe listen to all the other people—including medical experts—who tell you it’s not real? I keep a fan in my bedroom every night and it hasn’t killed me yet.
Even the gambler’s fallacy is something you could easily disabuse yourself of by rolling some dice for a while and taking careful notes. Am I more likely to roll snake eyes if I haven’t in a while? Nope; the odds on any given roll are always exactly the same.
But most people simply don’t think to check.
Indeed, most people get a lot of their beliefs—particularly those about complex, abstract, or distant things—from authority figures. While empiricism doesn’t come very naturally to humans, hierarchy absolutely does. (I think it’s a primate thing.) Another reason scientists may seem “elitist” is that people think we are trying to usurp that authority. We’re telling you that what your religious leaders taught you is false; that must mean that we are trying to become religious leaders ourselves.
But in fact we’re telling you something far more radical than that: You don’t need religious leaders. You don’t need to take things on faith. If you want to know whether something is true, you can look.
We are not trying to usurp control over your hierarchy. We are trying to utterly dismantle it. We dethrone the king, not so that we can become kings ourselves—but so that the world can have kings no longer.
Granted, most people aren’t going to be able to run particle accelerator experiments in their garages. But if you want to know how particle physics works, and how we know what we know about it, go to your nearest university, find a particle physicist, and ask: I guarantee they’ll be more than happy to tell you whatever you want to know. You can even do this via email from anywhere in the world.
That is, we do need expertise: People who specialize in a particular field of knowledge can learn it much better than others. But we do not need authority: You don’t just have to take their word for it. There’s a difference between expertise and authority.
And sometimes, really all you need to do is stop and think. People should try that more often.