Patrick Carroll
Patrick Carroll has a degree in chemical engineering from the University of Waterloo and is an editorial fellow at the Foundation for Economic Education.
If someone were to ask you how you know Australia exists, what would you say? If you haven’t been there yourself, it can be a surprisingly difficult question to answer. You might point to your grade school geography teacher who first told you about the country. “I know it exists because my teacher told me it exists,” you might say. Alternatively, you might mention a friend of yours who has visited the country and can testify to its existence. Finally, you might point out that you’ve consulted an atlas and confirmed that the country does, in fact, appear on the map.
Though each of these justifications may sound compelling, they all lean on a critical linchpin: trust. “Trust me,” your teacher says, “I’ve looked into this.” “Trust me,” your friend says, “I saw it with my own eyes.” “Trust us,” the atlas publishers say, “we’ve consulted the experts.”
To be sure, these are often trustworthy sources, but it's important to acknowledge that in some fundamental sense you are choosing to believe what others have told you. If you've never verified it yourself, you don't really know Australia exists; you just trust that it exists.
Trust and Authority
The reason this question is important is that it reveals just how much we defer to authority in our thinking. It's easy to regard ourselves as incredibly knowledgeable, but if we're honest, it would be more accurate to say we're incredibly trusting. We have accepted what "authority" has told us in just about every domain, with very little pushback.
CS Lewis drew attention to this phenomenon in his book Mere Christianity. Indeed, the following passage is what inspired the present article.
Do not be scared by the word authority. Believing things on authority only means believing them because you have been told them by someone you think is trustworthy. Ninety-nine per cent of the things you believe are believed on authority. I believe there is such a place as New York. I have not seen it myself. I could not prove by abstract reasoning that there must be such a place. I believe it because reliable people have told me so. The ordinary man believes in the Solar System, atoms, evolution, and the circulation of the blood on authority – because the scientists say so. Every historical statement in the world is believed on authority. None of us has seen the Norman Conquest or the defeat of the Armada. None of us could prove them by pure logic as you prove a thing in mathematics. We believe them simply because people who did see them have left writings that tell us about them: in fact, on authority. A man who jibbed at authority in other things as some people do in religion would have to be content to know nothing all his life.
As Lewis points out, there’s nothing wrong with believing things on authority. We do it all the time, and it helps us make our way in the world.
But while there’s nothing inherently wrong with trusting various sources, I’d argue we tend to be a bit too trusting as a culture. We take authorities at their word, even when we probably shouldn’t.
The whole Covid fiasco is surely a great example of this. How much evidence did it take to convince the average person to get the vaccine? Embarrassingly little. People also bought into lockdowns and mask mandates simply because some “experts” said these policies were a good idea.
The climate change issue is another great example of how much we put blind trust in intellectual authorities. Since most of us have no expertise in the matter, we resign ourselves to taking the experts at their word. But it’s OK, we are assured, because “97 per cent of climate scientists agree”. Since we know there’s a “consensus” we can trust them, right?
Not so fast. Ask yourself: do you really know there's a 97 per cent consensus? Did you look at the raw data regarding experts' opinions yourself? If you haven't, then on this too you are deferring to authority. You are trusting the source of that 97 per cent figure. Specifically, you are trusting that the people who came up with that figure aren't misleading you and that their collection and representation of the data on experts' views is reasonable, unbiased, accurate and complete.
Remember, you don't actually know that 97 per cent of climate scientists agree; you trust that 97 per cent of climate scientists agree. (As it happens, this figure is more dubious than most people realize.)
Again, there’s nothing wrong with trust. But we need to be careful of trusting too easily, because things are not always what they are reported to be.
‘Citation Needed’ Culture
So how can we avoid trusting too easily? My proposal is that we adopt what I call “citation needed” culture.
As the name implies, the idea here is to create a culture where we habitually demand evidence, especially for contentious ideas. Any time someone makes a claim, your instinctive response should be “citation needed”.
Growing up, we learned to take things at face value, to take the teacher at their word. But this is a bad habit, one we would do well to abandon. Especially as adults, we need to adopt a healthy scepticism and question everything, even the things everyone seems to agree on.
“Citation needed” culture is also about getting as close to the primary source as possible so as to minimize how many people you need to trust. When you get your information from politicians, the chain of trust is likely politician-journalist-scientist-data. That’s a lot of opportunity for distortion (intentional or otherwise). If you can, it’s better to go straight to the scientist, or better yet, the raw data itself (assuming you can interpret it).
Another part of "citation needed" culture is intellectual humility. No matter how 'obvious' or 'self-evident' something seems, if your claim ultimately comes down to 'I'm trusting an authority', then you probably shouldn't be too dogmatic about it. This is especially pertinent for heterodox ideas like conspiracy theories. Did the Holodomor happen? I think so, but I haven't looked into it for myself. I'm trusting the people who have, just as much as I'm trusting the geographers who tell me Australia exists.
The problem is that people often argue dogmatically for claims on the basis that ‘everyone knows’ this is true, or that ‘experts agree’ this is true. But appeals to majority or authority don’t fly in “citation needed” culture. Show me the receipts, and then I’ll believe you.
In addition to the phrase “citation needed,” then, the other phrase that should be a common refrain is ‘I don’t have enough knowledge to have an informed opinion on this.’ It’s far better to admit ignorance than to pretend you know something when you really just heard it on TV.
Murray Rothbard put it well, commenting on the field of economics. “It is no crime to be ignorant of economics,” he said, “which is, after all, a specialized discipline and one that most people consider to be a ‘dismal science’. But it is totally irresponsible to have a loud and vociferous opinion on economic subjects while remaining in this state of ignorance.”
The same goes for every other field, whether it be history, climate science, infectious diseases or geography. Trust authorities to your heart’s content if you like, but beware of confusing trust with knowledge.
This article was adapted from an issue of the FEE Daily email newsletter and was originally published on FEE.org.