Argumentum ab scientia is not argumentum baculo: The difference between authority and expertise

May 7, JDN 2457881

Americans are, on the whole, suspicious of authority. This is a very good thing; it shields us against authoritarianism. But it comes with a major downside, which is a tendency to forget the distinction between authority and expertise.

Argument from authority is an informal fallacy, argumentum baculo. The fact that something was said by the Pope, or the President, or the General Secretary of the UN, doesn’t make it true. (Aside: You’re probably more familiar with the phrase argumentum ad baculum, which is terrible Latin. That would mean “argument toward a stick”, when clearly the intended meaning was “argument by means of a stick”, which is argumentum baculo.)

But argument from expertise, argumentum ab scientia, is something quite different. The world is much too complicated for any one person to know everything about everything, so we have no choice but to specialize our knowledge, each of us becoming an expert in only a few things. So if you are not an expert in a subject, when someone who is an expert in that subject tells you something about that subject, you should probably believe them.

You should especially be prepared to believe them when the entire community of experts is in consensus or near-consensus on a topic. The scientific consensus on climate change is absolutely overwhelming. Is this a reason to believe in climate change? You’re damn right it is. Unless you have years of education and experience in understanding climate models and atmospheric data, you have no basis for challenging the expert consensus on this issue.

This confusion has created a deep current of anti-intellectualism in our culture, as Isaac Asimov famously recognized:

There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that “my ignorance is just as good as your knowledge.”

This is also important to understand if you have heterodox views on any scientific topic. The fact that the whole field disagrees with you does not prove that you are wrong—but it does make it quite likely that you are wrong. Cranks often want to compare themselves to Galileo or Einstein, but here’s the thing: Galileo and Einstein didn’t act like cranks. They didn’t expect the scientific community to respect their ideas before they had gathered compelling evidence in their favor.

When behavioral economists found that neoclassical models of human behavior didn’t stand up to scrutiny, did they shout from the rooftops that economics is all a lie? No, they published their research in peer-reviewed journals, and talked with economists about the implications of their results. There may have been times when they felt ignored or disrespected by the mainstream, but they pressed on, because the data was on their side. And ultimately, the mainstream gave in: Daniel Kahneman won the Nobel Prize in Economics.

Experts are not always right, that is true. But they are usually right, and if you think they are wrong you’d better have a good reason to think so. The best reasons are the sort that come about when you yourself have spent the time and effort to become an expert, able to challenge the consensus on its own terms.

Admittedly, that is a very difficult thing to do—and more difficult than it should be. I have seen firsthand how difficult and painful the slow grind toward a PhD can be, and how many obstacles will get thrown in your way, ranging from nepotism and interdepartmental politics, to discrimination against women and minorities, to mismatches of interest between students and faculty, all the way to illness, mental health problems, and the slings and arrows of outrageous fortune in general. If you have particularly heterodox ideas, you may face particularly harsh barriers, and sometimes it behooves you to hold your tongue and toe the line awhile.

But this is no excuse not to gain expertise. Even if academia itself is not available to you, we live in an age of unprecedented availability of information—it’s not called the Information Age for nothing. A sufficiently talented and dedicated autodidact can challenge the mainstream, if their ideas are truly good enough. (Perhaps the best example of this is the mathematician savant Srinivasa Ramanujan. But he’s… something else. I think he is about as far from the average genius as the average genius is from the average person.) No, that won’t be easy either. But if you are really serious about advancing human understanding rather than just rooting for your political team (read: tribe), you should be prepared either to take up the academic route or to challenge the mainstream from the outside as an autodidact.

In fact, most scientific fields are actually quite good about admitting what they don’t know. A total consensus that turns out to be wrong is actually a very rare phenomenon; much more common is a clash of multiple competing paradigms where one ultimately wins out, or all of them end up replaced by a totally new paradigm or some sort of synthesis. In almost all cases, the new paradigm wins not because it becomes fashionable or the ancien régime dies out (as Planck cynically claimed) but because overwhelming evidence is observed in its favor, often in the form of explaining some phenomenon that was previously impossible to understand. If your heterodox theory doesn’t do that, then it probably won’t win, because it doesn’t deserve to.

(Right now you might think of challenging me: Does my heterodox theory do that? Does the tribal paradigm explain things that either total selfishness or total altruism cannot? I think it’s pretty obvious that it does. I mean, you are familiar with a little thing called “racism”, aren’t you? There is no explanation for racism in neoclassical economics; to understand it at all you have to just impose it as an arbitrary term on the utility function. But at that point, why not throw in whatever you please? Maybe some people enjoy bashing their heads against walls, and other people take great pleasure in the taste of arsenic. Why would this particular self- (not to mention other-) destroying behavior be universal to all human societies?)
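To make the “arbitrary term” point concrete, here is a minimal sketch of what such a term looks like, in the spirit of Becker’s taste-based discrimination model; the exact functional form and the “distaste” parameter d here are illustrative assumptions, not taken from any particular paper:

U_i = u(c_i) - d_i x_i

where c_i is person i’s consumption, x_i measures their interaction with the out-group, and d_i > 0 is a free “distaste” parameter. Since d_i can be set to whatever makes the model fit, it can rationalize any behavior after the fact, which is exactly the problem: a term that can absorb any observation explains none of them.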

In practice, I think most people who challenge the mainstream consensus aren’t genuinely interested in finding out the truth—certainly not enough to actually go through the work of doing it. It’s a pattern you can see in a wide range of fringe views: anti-vaxxers, 9/11 truthers, climate denialists. They all think the same way: “The mainstream disagrees with my preconceived ideology; therefore the mainstream is some kind of global conspiracy to deceive us.” The overwhelming evidence that vaccination is safe and (wildly) cost-effective, that 9/11 was indeed perpetrated by Al Qaeda and neither planned nor anticipated by anyone in the US government, and that the global climate is being changed by human greenhouse gas emissions—these things simply don’t matter to them, because it was never really about the truth. They knew the answer before they asked the question. Because their identity is wrapped up in that political ideology, they know it couldn’t possibly be otherwise, and no amount of evidence will change their mind.

How do we reach such people? That, I don’t know. I wish I did. But I can say this much: We can stop taking them seriously when they say that the overwhelming scientific consensus against them is just another “appeal to authority”. It’s not. It never was. It’s an argument from expertise—there are people who know this a lot better than you, and they think you’re wrong, so you’re probably wrong.

Why being a scientist means confronting your own ignorance

I read an essay today arguing that scientists should be stupid. Or more precisely, ignorant. Or even more precisely, they should recognize their ignorance when everyone else ignores it and turns away.

What does it feel like to be wrong?

It doesn’t feel like anything. Most people are wrong most of the time without realizing it. (Explained brilliantly in this TED talk.)

What does it feel like to be proven wrong, to find out you were confused or ignorant?

It hurts, a great deal. And most people flinch away from this. They would rather continue being wrong than experience the feeling of being proven wrong.

But being proven wrong is the only way to become less wrong. Being proven ignorant is the only way to truly attain knowledge.

I once heard someone characterize the scientific temperament as “being comfortable not knowing”. No, no, no! Just the opposite, in fact. The unscientific temperament is being comfortable not knowing, being fine with your infinite ignorance as long as you can go about your day. The scientific temperament is being so deeply uncomfortable not knowing that it overrides the discomfort everyone feels when their beliefs are proven wrong. It is to have a drive to actually know—not to think you know, not to feel as if you know, not to assume you know and never think about it, but to actually know—that is so strong it pushes you through all the pain and doubt and confusion of actually trying to find out.

An analogy I like to use is The Armor of Truth. Suppose you were presented with a piece of armor, The Armor of Truth, which is claimed to be indestructible. You will have the chance to wear this armor into battle; if it is indeed indestructible, you will be invincible and will surely prevail. But what if it isn’t? What if it has some weakness you aren’t aware of? Then it could fail and you could die.

How would you go about determining whether The Armor of Truth is really what it is claimed to be? Would you test it with things you expect it to survive? Would you brush it with feathers, pour glasses of water on it, poke it with your finger? Would you seek to confirm your belief in its indestructibility? (As confirmation bias would have you do?) No, you would test it with things you expect to destroy it; you’d hit it with everything you have. You’d fire machine guns at it, drop bombs on it, pour acid on it, place it in a nuclear testing site. You’d do everything you possibly could to falsify your belief in the armor’s indestructibility. And only when you failed, only after you had tried everything you could think of to destroy the armor and it remained undented and unscratched, would you begin to believe that it is truly indestructible. (Popper was exaggerating when he said all science is based on falsification, but he was not exaggerating very much.)

Science is The Armor of Truth, and we wear it into battle—but now the analogy begins to break down, for our beliefs are within us; they are part of us. We’d like to be able to point the machine guns at armor far away from us, but instead it is as if we are forced to wear the armor as the guns are fired. When a break in the armor is found and a bullet passes through—a belief we dearly held is proven false—it hurts us, and we wish we could find another way to test it. But we can’t; and if we fail to test it now, it will only endanger us later—confront a false belief with reality often enough, and it will eventually fail. A scientist is someone who accepts this and wears the armor bravely as the test guns blaze.

Being a scientist means confronting your own ignorance: Not accepting it, but also not ignoring it; confronting it. Facing it down. Conquering it. Destroying it.