How to Tackle Truth Decay
When then-President Donald Trump was briefed on the California wildfires in 2020, the scientific opinion he heard was that climate change was real and had contributed to the conflagrations that ended up consuming more than 4 million acres and killing 31 people. His response? “Science doesn’t know.”
Millions of Americans trusted Trump, a fact he leveraged to attack the trustworthiness of science itself. Trump’s actions are part of a larger pattern of assault on expertise. People need to trust that the experts will tell the truth, and they need to trust the connections between themselves and the experts. The division of labor made necessary by our complex social and technological world also created a vulnerability: a possible cleavage between expert elites and a distrustful populace.
Our belief in things we cannot ourselves verify relies on trust networks. If the connections to the experts are broken, our understanding of reality becomes untethered. Society then begins a slide into doubt and denialism, and “truth decay,” as a RAND initiative has called it, sets in. If we want to reverse that process, we need to rebuild the networks of trust.
Half a century ago, the pioneering philosopher and mathematician Hilary Putnam observed that philosophers had long assumed that to know a word’s meaning is to know how to use it. To know red, for example, is to describe something as red when you see red; the same goes for the word denoting an object, such as pencil, or an action, such as run. But with the emergence of science in the West from the Renaissance onward, a new class of empirically grounded concepts entered our everyday vocabularies, concepts whose meanings we rely on experts to discern. Putnam asked us to consider gold:
Gold is important for many reasons: it is a precious metal; it is a monetary metal; it has symbolic value (it is important to most people that the “gold” wedding ring they wear really consist of gold and not just look gold).
So how is it that we are able to use the word gold, Putnam asked, when most people actually can’t tell the difference between real and fake gold? He pointed out that society has organized a linguistic division of labor:
Consider our community as a “factory”: in this “factory” some people have the “job” of wearing gold wedding rings; other people have the “job” of selling gold wedding rings; still other people have the job of telling whether or not something is really gold.
Competent metallurgists can tell the difference between real and fake gold, so we rely on their expertise. The rest of us need to trust that the metallurgists know what they’re doing, and that we can take them at their word.
In my studies, I routinely use visual diagrams. So I would translate Putnam’s observation into a diagram showing the flow of information among people in a social network:
The green nodes depict the scientific experts who can reliably tell whether a yellowish metallic substance is gold or not. The gray nodes represent the rest of us. Information moves between people through the links. The meaning of gold for all of the nonexperts is grounded in the knowledge held and applied by the experts.
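For readers who like to see the machinery, here is a minimal sketch of that diagram as a graph, written in Python with the networkx library. Every node name and link below is a hypothetical illustration of mine, not data from any study; the point is only to make one idea concrete: a nonexpert’s use of the word gold stays grounded so long as some chain of trusted links connects that person to an expert.

```python
# A minimal, hypothetical sketch of the trust network described above.
# The people and the links among them are invented for illustration only.
import networkx as nx

experts = {"metallurgist_1", "metallurgist_2"}        # the green nodes
nonexperts = {"jeweler", "ring_buyer", "pawnbroker",
              "neighbor", "coworker"}                  # the gray nodes

trust = nx.Graph()
trust.add_edges_from([
    ("metallurgist_1", "jeweler"),
    ("metallurgist_2", "pawnbroker"),
    ("jeweler", "ring_buyer"),
    ("pawnbroker", "neighbor"),
    ("neighbor", "coworker"),
])

def grounded(graph, person):
    """A nonexpert's use of 'gold' is grounded if some chain of trusted
    links connects that person to at least one expert."""
    return any(nx.has_path(graph, person, expert)
               for expert in experts if graph.has_node(expert))

for person in sorted(nonexperts):
    print(person, "grounded" if grounded(trust, person) else "untethered")
```

Run as written, every gray node comes out grounded, because each one can reach a metallurgist through people it trusts.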
Underlying the linguistic division of labor is one of expertise, and it applies to all sorts of empirical knowledge—concerning, say, the unemployment rate, the counting of electoral votes, and the number of missiles the U.S. has provided to Ukraine. Because of the scale and complexity of our world, fact-based experts such as statisticians, auditors, and inspectors play roles analogous to scientists in these situations. For Putnam’s factory of meaning and knowledge to work, the social network needs to be a trust network.
When Trump voiced skepticism about climate science, he was raising doubt about scientists’ expertise. Another way to erode trust in experts is to attack their credentials, or even the entire system of credentialing institutions such as universities. Yet another tactic is to question whether something is knowable at all. This is a method perfected by Russian propagandists and amplified by state media to sow doubt and place an event in a cloud of confusion.
When this occurs, what Putnam called the “linguistic community” fractures, leaving a swath of society cut off from the experts who inform it.
This is the first stage in the decay of truth. The segment of society depicted on the left side of the network diagram is divided from all of the experts (green nodes). Let’s consider the case in which the experts are metallurgists who can tell the difference between real gold and fake gold. Imagine that you are one of the people on the left side of the divide. You no longer have trusted connections to metallurgists. Perhaps you have heard of them, but everyone you know feels the same about them: Don’t trust them. They don’t know what they are talking about. They act as if they can tell the difference between real and fake gold, but who knows? Even if they do know, they are probably lying to us to enrich themselves.
So now what? Is that wedding ring on your finger real gold or not? When you use the word gold, what does that word really even mean anymore? Maybe it’s real; maybe it’s not. Over time, if no one you trust helps resolve these questions, you will eventually conclude that the truth is not knowable. Over time, you and your social connections might even start to question whether a distinction between real and fake gold even exists.
Maybe there was just one kind of yellow metal all along? Who knows? Once you and your network arrive at that conclusion, the cultural significance and monetary value of gold—which is rooted in the scarcity of the real stuff—will inevitably deteriorate, assuming that fake gold is easy to obtain.
This is not just a story about gold. Any belief grounded in empirical expertise—even something as apparently simple and indisputable as the number of votes cast for each candidate in an election—becomes imperiled if those in a position to know the truth are isolated from one’s trust network.
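To continue that earlier sketch (with the same caveat that its names and links are invented illustrations, not data), the fracture just described amounts to deleting the tie that bridges the divide. The experts have not changed at all; a whole region of the network has simply lost its path to them.

```python
# The same hypothetical network, rebuilt after the bridging tie is gone.
import networkx as nx

experts = {"metallurgist_1", "metallurgist_2"}         # the green nodes
fractured = nx.Graph([
    ("metallurgist_1", "jeweler"),
    ("metallurgist_2", "pawnbroker"),
    ("jeweler", "ring_buyer"),
    ("neighbor", "coworker"),   # the pawnbroker-neighbor link has been severed
])

def grounded(graph, person):
    """True if some chain of trusted links still reaches an expert."""
    return any(nx.has_path(graph, person, expert)
               for expert in experts if graph.has_node(expert))

for person in ["jeweler", "ring_buyer", "pawnbroker", "neighbor", "coworker"]:
    print(person, "grounded" if grounded(fractured, person) else "untethered")
```

The neighbor and the coworker now have no route to anyone who can actually tell real gold from fake, which is exactly the predicament just described.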
In the next stage of truth decay, those who no longer trust the scientists and technocrats search for alternative sources of information: “truth” from outside the network of elite expertise.
The implications are alarming. Acting on beliefs disconnected from reality can lead to catastrophic failures, such as the mishandling of health crises (for example, by encouraging people to ingest or inject bleach) and the acceleration of environmental collapse. The erosion of a shared social reality breeds deep distrust. Conflict entrepreneurs gain power and wealth by deepening divides through attacks on expertise.
This fragmentation is not just a domestic issue; it’s a national-security vulnerability. Our geopolitical adversaries, notably Russia and China, learn that American society is easily manipulated by misinformation, and even our allies lose trust in the U.S. as a predictable and reliable partner.
A constellation of factors has brought us to this situation. Most scientists, economists, engineers, policy makers, election officials, and other experts are on the winning side of growing economic inequality. Resentment among those who do not see themselves on the winning side tends to coincide with suspicion of higher education as a bastion of progressive politics. As president, Barack Obama repeatedly cast the policy positions he favored as “smart,” connecting the authority of expertise to positions that also hinged on value judgments. During the coronavirus pandemic, public-health officials admonished people, on science-based grounds, for joining anti-lockdown protests, yet just weeks later, some experts counseled others to join social-justice protests after the murder of George Floyd.
Finally, social-media platforms have become the major distribution channel for mainstream-media content, effectively giving the upper hand to algorithmic selection of emotionally provocative content over editorial control based on quality of information. I see no clear path to reining in social media. Government intervention is fraught in our politically polarized era, and severely limited by First Amendment protections. We can’t fact-check our way out of the problems either. If a media source is distrusted, its fact-checkers will be contaminated by that same distrust. And on social-media platforms, where the cost of generating false and misleading information is approaching zero with generative AI, no feasible way exists to fact-check at the required scale.
Anecdotally, I have noticed many young people choosing to cut back on social media and many older adults restricting their news intake. I suspect that this is symptomatic of a growing awareness of something broken in our system, which signals a demand for change.
How can we slow down and begin to reverse truth decay?
Those who disseminate information should find ways to incorporate expert-vetted knowledge into their content. One promising effort in this sphere is SciLine, which has established a switchboard that connects newsrooms with scientific experts to enhance the “amount and quality of scientific evidence in news stories.” A similar approach could connect experts with social-media influencers who may otherwise unwittingly propagate scientifically unjustified claims.
Establishing connections between experts and the trusted influencers in local communities—barbers, teachers, bar owners, factory-floor managers—is a more challenging project to carry out at scale. But for important issues such as public health, it may be worth the effort. A case in point in recent years is the CDC’s Cut for Life program, which supports HIV awareness and AIDS prevention by building connections with such local opinion formers and providing science-grounded guidance to hair stylists and barbers. With up-to-date digital tools, programs akin to this could be scaled efficiently.
Professional associations that represent experts in the fields of science, technology, economics, and public policy should invest in outreach that involves listening to criticism, to improve their understanding of what causes mistrust and how to better account for it. This will help professionals better translate their expert knowledge into practical insights that people can use, which will in turn place greater agency in the hands of the public. (Several promising new and engaging ways to facilitate such listening have emerged: Hearken, Fora, Polis.)
News outlets that believe in transparent and rigorous journalism could extend that ethos to their own processes for story selection. This would require an organization to state the values that guide its coverage, and to acknowledge that members of its audience who espouse different values might want more attention paid to other stories.
Governments should invest far more in citizen deliberation. One tried-and-tested approach is the citizen assembly, which brings together a representative group of people, chosen by lottery, to address important and vexing social issues. The group first learns about the issue from experts, then deliberates with the aim of producing policy recommendations that are approved by 70 to 80 percent of the assembly.
The beauty of this approach is that it creates a division of labor between experts who provide guidance and nonexperts who wrestle with values-based trade-offs among various policy options—a perfect way of restoring Putnam’s factory for producing trusted knowledge. In Ireland, citizen assemblies have led to a series of constitutional amendments on a range of complex and divisive issues, including the legalization of abortion. The United States lags behind other countries in adopting this model of consultation; doing so will take serious investment at the local, state, and federal levels, coupled with a major media campaign to build awareness of such programs. Citizen deliberation can also benefit from technological tools that make its initiatives widely accessible.
The more our social connections keep stratifying and fragmenting—separating experts from nonexperts—the more fragile our networks of trust will become. Much of the disconnect and resentment comes from the feeling among a large segment of the population that the experts are condescending toward them, issuing policies and opinions that show no respect for, or understanding of, their day-to-day lives. Experts should listen to the stories of everyday people. Everyday people should see their experiences and perspectives accounted for in the way that expertise affects their lives. No single measure can reverse truth decay, but doing nothing about it guarantees that it will get worse.