Issue No. 16
An epiphany is a moment of sudden revelation. An apophany is its deluded counterpart: a moment of sudden revelation caused by creating patterns out of unrelated things and then believing those patterns to be true. The tendency itself is called apophenia, and this issue is about how it can impact our thinking in everyday life.
The other day I drove to a cafe on the other side of town to meet up with a friend. I parked the car, went in, we ordered. Then right before going for a walk, I spotted a man sitting on a bench eating his sandwich, wearing a high-vis vest with MTA written on it. MTA being the city’s transportation agency, which includes parking enforcement.
I turned to my friend and said, “Man, these parking attendants are out to get me. It doesn’t matter where in the city I am, how hard I try to read posted signs, they always find me and they always find a reason to ticket me.”
And as I was saying that, half-flippantly, I realized I’d fallen for a tendency I’d once read about, where we take two unrelated events, like me getting a parking ticket a month ago and me now seeing an MTA person eating a sandwich on a park bench, and connect them into some statement about reality. Conjecture, allowed to fester, turns into delusion: someone innocently eating a sandwich is taken for a heartless sadist.
What causes that sort of thinking?
Biology. Our brains seek patterns. A “natural and necessary consequence of our evolutionary history”.1 As a result of that inclination, and the constant scanning for patterns it entails, our brains are prone to accepting untrue patterns alongside true ones. One study on the topic concludes, “natural selection will favour strategies that make many incorrect causal associations in order to establish those that are essential for survival and reproduction.”2
An extreme presentation of that feature is found in a mental disorder like schizophrenia. The word apophenia, in fact, comes from the writings of a 20th-century neurologist who’d spent time investigating the early stages of the disorder,3 and in doing so defined an initial “stage fright” stage, where a patient feels that something in the air isn’t quite right, followed by an “aha” revelatory stage, where a patient rearranges the world around them to fit their belief that every small thing is a covert reference to them.
Anxiety, pressure. When we’re anxious, stressed, or under pressure, we might switch from rational thinking to emotional thinking. That in turn makes us vulnerable to cognitive distortions like mind-reading (assuming we know people’s intents and motivations) and confirmation bias (looking for data that confirms our pre-existing beliefs). An anxious person is constantly on the lookout for threats, and so every event in the world becomes salient, and a candidate for attention and analysis.
This heightened sensitivity affects our ability to create meaningful, evidence-based links between things. It can impact a scientist under pressure to publish findings, who reports a result the moment a pattern emerges that confirms a hunch, rather than first repeating the experiment on other datasets.4 It can impact an employee whose opportunity for advancement hangs on being able to show that work done earlier in the quarter led to some favorable outcome.
The need to simplify information. We all have subconscious tendencies that affect our decision-making. We’ve been calling them cognitive biases. One of those is the need to simplify information, for instance by connecting independent events on account of them happening together or one after the other. Two such tendencies we haven’t covered so far are the gambler’s fallacy (believing that if you just rolled a six, you’re less likely to roll a six again) and the hot hand (believing that if you made the right call last time, you’re likely to make the right call this time).
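As an aside, and purely as an illustration of my own rather than anything from this issue, here’s a minimal Python sketch that simulates a fair die and checks whether a six really is any less likely right after a six. The million-roll sample size is an arbitrary choice.

```python
import random

random.seed(1)

# Simulate a million rolls of a fair six-sided die.
rolls = [random.randint(1, 6) for _ in range(1_000_000)]

# Overall probability of rolling a six.
p_six = rolls.count(6) / len(rolls)

# Probability of a six given that the previous roll was also a six.
after_six = [nxt for prev, nxt in zip(rolls, rolls[1:]) if prev == 6]
p_six_after_six = after_six.count(6) / len(after_six)

print(f"P(six)             ~ {p_six:.3f}")
print(f"P(six | prior six) ~ {p_six_after_six:.3f}")
# Both come out around 1/6 ~ 0.167: the die has no memory, so a six
# is no less likely immediately after a six.
```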
Culture. A cultural norm that says everything in the universe happens for a reason will produce the type of person who has greater cause to look for explanations behind things. Coincidences don’t exist in this worldview; things are either predestined or have a higher purpose that will become apparent in good time. It’s an effective tool for coping with feelings of fear and loss, though on the flip side it opens a person up to a wider gamut of explanations they’d need to vet.
Where might we see apophenia in everyday life?
Basing decisions on insignificant events. You’re stuck on a question during an exam, you look out the window and see a cloud up above that kind of looks like a C. The universe has given you a sign, you conclude. You write down C. Or you’re deciding whether to accept a job offer or make some other life-altering decision during your morning walk, and then you spot a crow. You conclude it’s a bad omen.
Assuming everything has an ulterior motive. Thinking, prima facie, that every new piece of legislation that’s passed, every new policy that’s rolled out, every new technology that’s developed, has some hidden agenda behind it that is being kept from the public.
Trusting opaque or non-repeatable studies. A study makes some claim, but doesn’t stand up to scrutiny when we try to reproduce it or when we look at its source data. Or some post or video online connects various things into some theory about how to be productive, or live a longer life, or cure an incurable disease, but doesn’t offer any objective evidence when you look deeper.
Seeing cryptic meanings in books, movies. A writer publishes a story, and readers read all kinds of cryptic themes into it. When Samuel Beckett was asked about the hidden meanings in his play Waiting for Godot, he responded, “Why people have to complicate a thing so simple I can’t make out.”
Determining that short-term stock market fluctuations are predictable. A fund manager promises some return on the basis of some unique knowledge of the stock market, when in reality, stock markets in the short term are irrational. Ups and downs might reveal some semblance of a pattern to us, on the basis of which we might decide what and when to buy and sell, but trends tend to emerge only over longer time horizons.
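To illustrate that last point with a toy model of my own (not anything from this issue): a “price” driven by nothing but coin flips still produces streaks and apparent trends that invite pattern-reading.

```python
import random

random.seed(7)

# A toy "price" that moves up or down by one unit with equal probability,
# i.e. pure noise with no trend or signal behind it.
price = 100.0
prices = [price]
for _ in range(250):  # roughly one year of trading days
    price += random.choice([-1.0, 1.0])
    prices.append(price)

# Longest streak of consecutive up-days in this noise.
longest = current = 0
for prev, cur in zip(prices, prices[1:]):
    current = current + 1 if cur > prev else 0
    longest = max(longest, current)

print(f"Final 'price': {prices[-1]:.0f}, longest up-streak: {longest} days")
# Runs of several up-days in a row are routine here, even though every
# step is an independent coin flip with nothing behind it.
```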
What can apophenia lead to?
Bad conclusions. We create models of the world and express them through language, through images, through equations. A bad pattern is a bad model. And bad models can lead us to bad conclusions. Or they can really veer us off the path and lead to conspiratorial thinking. For instance, we really want that silhouette under those trees to be Bigfoot, or the silhouette emerging from that lake to be the Loch Ness monster, or that disc-shaped flying thing to be aliens, so we start connecting bits of circumstantial evidence from here and there to make them so.
Harm. Bigfoot and Nessie are harmless beliefs. As are superstitions like knocking on wood or wearing a certain bracelet to bring us good luck. Other beliefs have more damaging consequences. If I believe an emergency vaccine that has gone through clinical trials is a conspiracy to experiment on or control the masses, that can lead to actual deaths. If I believe there’s a group intent on taking over a continent, that can lead to innocent people being scapegoated.
An anti-pattern that persists. Revelatory explanations of the world can be appealing, addictive even. Once a person becomes amenable to one contrarian explanation of something without evidence, they become amenable to others just like it. And that’s extraordinarily damaging, because bad instincts last a lifetime. Worse still, we pass them on to our children, who don’t know any better. Just like with genes, except we have control over our patterns of thinking.
How might it be weaponized?
To manufacture mistruths. Hand-in-hand with knowing how something works is knowing how it can be exploited. Knowledge about our susceptibility to believing untrue patterns means that anyone who’s in the business of influencing public opinion, for personal or political gain, can do so with some ill will and some tenacity.
The 24-hour news cycle, the unending deluge of content shared on the web, the consolidation of media outlets into the hands of a few corporations, and the rise of individuals with outsized followings mean that all it takes for a delusion to spread, to trend, is for enough people to repeat it. Even with the best will, we’re constantly susceptible to manufactured, untrue patterns about the world. As the study referenced earlier noted, our brains process patterns in real time, and they don’t always do a good job of sifting out the untrue ones.
To shut down the truth. When a lone voice speaks up and says things aren’t as they seem, that can rattle anyone benefiting from the status quo. And so the lone voice is dismissed as radical, as mad, as a proxy for some bogeyman. A whistleblower acting for the public good might be dismissed as a traitor, an employee pointing out harassment at work might be dismissed as bitter, a scientist saying the planets actually circle the sun might be dismissed as a heretic, or one saying that stomach ulcers are caused by a certain bacterium might be dismissed as mad.5
What’s daunting in these examples is the realization that a person who goes against the grain, whose hunch about a clandestine pattern of connected things has, in fact, a basis in reality (an epiphany in waiting), might encounter serious resistance. They might be gaslit: emotionally manipulated into believing they got it wrong, or remembered wrong, or concluded wrong. That they’re deluding themselves. A person is made to believe they’re psychotic in an effort to subdue and quell them.
I was reminded of how painful that experience can be by an episode of a TV show from the ‘00s. A mom of four gets a ticket for careless driving. She insists she’s being framed, that the police officer is out to get her. Eventually, a tape from a security camera shows she was in the wrong. The case appears to be settled. We have a pattern of corroborating events: eyewitness testimony, footage, and the consensus view of those closest to her. Her husband convinces her to admit fault. What more can be said? The tape is conclusive. So she swallows her pride. Her soul is crushed.
Then near the end of the episode, another security tape emerges, showing a different angle. Mom wasn’t wrong after all.
How do you counter apophenia?
Use rigorous methods for determining causality and correlation. Look into studies to ensure they’re well designed. Look at source data when presented with a summary report of findings, especially one that has significant societal or policy implications.
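As a small illustration of my own of why design and rigor matter (the 200 fake predictors and the 0.36 cutoff are arbitrary assumptions, not something from this issue or its references): the Python sketch below generates pure noise and counts how many “significant” correlations turn up anyway.

```python
import random
import statistics


def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sx, sy = statistics.stdev(xs), statistics.stdev(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / ((n - 1) * sx * sy)


random.seed(3)

# One "outcome" and 200 completely unrelated "predictors", all pure noise.
n = 30
outcome = [random.gauss(0, 1) for _ in range(n)]
predictors = [[random.gauss(0, 1) for _ in range(n)] for _ in range(200)]

# Count how many unrelated predictors clear |r| > 0.36, roughly the
# conventional 5% significance bar for a sample of 30.
hits = sum(1 for p in predictors if abs(correlation(outcome, p)) > 0.36)
print(f"'Significant' correlations found in pure noise: {hits} of 200")
# Typically about 10: test enough unrelated things and patterns appear,
# which is why replication and designs that account for multiple
# comparisons matter.
```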
Assume disorder until proven otherwise. All things trend toward disorder. The onus is on the person making a claim about some meaningful pattern to show you evidence for that claim. Adopt a mindset that questions first instead of taking things as read.
Don’t jump to conclusions, tempting though it may be. Stay in the realm of doubt for as long as necessary, until you’ve gathered enough data to move on. It keeps us humble and ensures we’ve considered all factors before taking a position.
And realize that apophenia can be weaponized by a partial actor, to manipulate us or to shut down truth-telling. If you suspect that’s the case, be extra vigilant about focusing on the facts, not on how sweet the rhetoric is or how authoritative the person or institution bearing the narrative is. Emotional manipulation relies on our vulnerability. And impartiality, conviction, and sometimes the passage of enough time disarm that sort of manipulation.
(Office Space, 1999, R.)
Final thoughts
I read a story many years ago called Symbols & Signs by Nabokov.6 Melancholic, and so gentle in its portrayal of two elderly parents. Trying and failing to visit their son at a sanatorium on his birthday. He’s there because of referential mania, they’ve been told. Convinced that everything that happens around him is some secret reference to him.
En route, the parents experience a number of events that seem banal and could befall any city dweller: a train breaks down, a bus is delayed, an unexpected downpour hits. Then, closer to the sanatorium, they spot a bird helplessly twitching in a puddle. In the event, they’re not able to see their son that night, and the story ends with a phone call, possibly from the sanatorium, bearing unpleasant news. We don’t know.
One of my realizations from that story at the time, and still today, is that it’s always easier to ascribe meanings to events in hindsight. If it turns out the boy died, then the dying bird in the puddle will be seen as a premonition. If it turns out he’s badly injured, then the twitching was the sign. If it turns out he’s perfectly fine, then we might forget the bird altogether. It’s intriguing how we create narratives of our past, and pull in signs and symbols to fit some arc.
What’s crucial, I feel, is how we use the symbols and signs (the patterns) that are actually significant to inform our decisions in the present. And that’s a fitting idea to end this issue on: to stay focused on the present, and to forge through it with the right guardrails for thinking in place.
Birth is not one act; it is a process. The aim of life is to be fully born, though its tragedy is that most of us die before we are thus born. To live is to be born every minute. Death occurs when birth stops.
Until next time,
—Ali
P.S. Interested in sponsoring an issue?
P.P.S. A very warm welcome to the 147 new subscribers who’ve joined us since last time.
1. https://psyche.co/ideas/when-the-human-tendency-to-detect-patterns-goes-too-far
2. https://royalsocietypublishing.org/doi/10.1098/rspb.2008.0981#d3e315
3. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2800156
4. A study that looked at 300 papers in notable outlets found that over a third of their published findings wouldn’t hold if their tests were repeated. https://www.researchgate.net/publication/282844919_Scientific_apophenia_in_strategic_management_research_Significance_tests_mistaken_inference
5. https://journalofethics.ama-assn.org/article/barry-marshall-md-and-robin-warren-md/2000-04
6. https://www.newyorker.com/magazine/1948/05/15/symbols-and-signs
And right on cue, a letter in the mail from SFMTA this morning. (They’re out to get me, I tell you.)