Nobody Knows What’s Going On
A major online publication once reported in a profile on me that I had retired at 33. A few old friends and acquaintances reached out to congratulate me on my financial independence.
I think it was an honest mistake on the part of the reporter. I told her I had quit my job to write full time, and I guess she thought that meant I must have millions of dollars.
To be clear, I was not then, and am not now, financially independent. The 100 or so people who actually know me could discern that just by seeing my kitchen. Yet perhaps 20,000 people read somewhere that I am. That means potentially 200 times more people are wrong than right on this question, because of an inference made by a reporter.
This scenario, in which there’s much more wrongness going around than rightness, is probably the norm. People make bad inferences like that all day long. These wrong ideas replicate themselves whenever the person tells someone else what they know, which the internet makes easier than ever.
Consider the possibility that most of the information being passed around, on whatever topic, is bad information, even where there’s no intentional deception. As George Orwell said, “The most fundamental mistake of man is that he thinks he knows what’s going on. Nobody knows what’s going on.”
Technology may have made this state of affairs inevitable. Today, the vast majority of a person’s worldview is assembled from second-hand sources, not from their own experience. Second-hand knowledge, from “reliable” sources or not, usually functions as hearsay – if it seems true, it is immediately incorporated into one’s worldview, usually without any attempt to substantiate it. Most of what you “know” is just something you heard somewhere.
Standard editorial approach
When people go on to share what they “know”, there’s usually no penalty for being wrong, but there are rewards for convincing people you’re right: attention, money, adoration, public rhetorical victories over others, and many other things humans enjoy.
The Two Kinds of Knowing
First-hand knowledge is a whole different thing from the second-hand kind. When you experience an event with your senses, you’re not just accepting a verbal claim, such as “There’s fighting in the streets of Kabul” — the truth is actually happening to you. The experienced sailor knows that looming cloud formation means trouble. The soldier knows the attack on his unit’s position was repelled. The dog owner knows exactly how long she can leave Rocco alone at home before he relieves himself on the floor. The friend sitting on my fifteen-year-old couch knows I’m not independently wealthy. Experience imprints reality right into your neurons; it doesn’t just add another thought to the abstract space in your brain where you keep your axioms and factoids.
Good for about six hours
Only a tiny percentage of what a given person “knows” is in this first-hand, embodied form. The rest is made of impressions gathered from anecdotes, newspapers, books, schoolteachers, blogs, and things our older siblings told us when we were little.
If you ever read an article on a subject with which you have a lot of first-hand experience, you’ll notice that the writers always get major things wrong – basic facts, dates, names of people and organizations, the stated intentions of involved parties, the reasons a thing is happening – things even a novice in the space would know better.
It makes perfect sense, if you think about it, that reporting is so reliably unreliable. Why do we expect reporters to learn about a suddenly newsworthy situation, gather information about it under deadline, then confidently explain the subject to the rest of the nation after having known about it for all of a week? People form their entire worldviews out of this stuff.
Me forming my worldview
What doesn’t make sense is that we immediately become credulous again as soon as the subject matter changes back to a topic on which we don’t have first-hand experience. You know they don’t know what the hell they’re talking about on Subject A, but hey, what’s this about Subject B? In 2002, author Michael Crichton named this the “Gell-Mann Amnesia effect”:
Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In [Murray Gell-Mann’s] case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them. In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.
Crichton clarifies in the full speech that by “media” he’s not only talking about newspapers, but books, television, and the internet too – and of course anybody’s recounting of what these sources say. Well, that accounts for just about 100% of our second-hand knowledge.
“The situation is developing”
People do know things though. We have airplanes and phones and spaceships. Clearly somebody knows something. Human beings can be reliable sources of knowledge, but only about small slivers of the whole of what’s going on. They know things because they deal with their sliver every day, and they’re personally invested in how well they know their sliver, which gives them constant feedback on the quality of their beliefs.
Plumbing knowledge, for example, is constantly tested by whether the place floods after you’ve advanced your theory about what pipe connects to what. You need to get it right because it costs you something when you get it wrong.
The mechanic has seen a thousand check-engine lights and knows how each of them was resolved. The English professor has seen a thousand essays and can tell you what’s wrong with yours. The night club bouncer has dealt with a thousand drunk patrons and knows which guests will be trouble even before they do.
Somebody’s sliver, thankfully
There are ways to carefully gather, scrutinize, and compare high-quality second-hand sources, and maybe learn something reliable, but this is extremely difficult for the groupish, emotional creatures we are. It is viscerally unpleasant (not to mention time-consuming) to honestly question beliefs you feel positively towards, or honestly entertain ones you don’t, and ultimately you’re just determining what “feels right” anyway.
Aside from our own respective slivers of reliable knowledge, we mostly carry a lot of untested beliefs — teetering piles of them, accumulated over years, from random people assuring us “this is how it is.” Most of these beliefs are bunk, but we don’t know which ones.
Beliefs are Mostly Mind-Candy
Humans love beliefs, not because they’re reliable pointers to what’s true, but because they often feel good in some way, or carry social rewards. Expressing and sharing beliefs can get us attention and social status, make us feel competent, sell our goods and services, and motivate people to do things for us – and they can just feel satisfying to say aloud. A convincing belief is simply one that feels good to the ears, or the mind.
Beliefs of mine, awaiting testing
Theory feels good. Pithiness and analogy feel good. A tight sentence feels good. Neat and snappy stories about what’s “true” are like candy to the sense-craving part of the human brain.
Notice how many smart people believe things like, “You can’t reason yourself out of a belief you didn’t reason yourself into.” This is a belief nobody would arrive at through reason. It doesn’t stand up to even a minute’s logical scrutiny. You certainly didn’t reason yourself into a belief that North-Pole-dwelling elves made your childhood toys, but you probably reasoned yourself out of it. Both beliefs are just mind-candy, only for different audiences.
In short, human beings are bad at gathering information, inferring the right things from it, and responsibly passing it on to others. It is incredible what we’ve achieved in spite of this — almost entirely by carefully combining and testing our respective reliable slivers — but as a species we remain supremely untalented at knowing what’s true outside the range of our senses.
A relevant documentary (1996)
Much of the problem is that we want so badly to be believed, to be seen as someone who knows stuff. In the rest of Crichton’s speech, he explains why he named the Gell-Mann Amnesia effect after Murray Gell-Mann: because he’s famous, and he’s a physicist. People believe things named after physicists because we know they’re smart. And Crichton is a medical doctor, so you should listen to the guy.
Also, none of you will be able to confirm this, but George Orwell did not say the line in the intro of this article. I just said that he did so you would take my own assertions more seriously. And it’s too late — you’ve already tasted the candy. I mean, Orwell could have said something like that. He might as well have said it. He probably did, basically! I will sleep soundly anyway. There are few penalties for bullshit, and many rewards. Because nobody knows what’s going on.
***
Images by Artem Beliaikin, The Onion, Jamie Street, Rivage, Ryan Brooklyn, Gaurav Bagdi