OPINION: Too much misinformation? The issue is demand, not supply.

With the US presidential election a little more than a year away, candidates and voters are bracing themselves for an “explosion” of AI-generated misinformation. Adding to the fear is that many research programs intended to study and counter misinformation, facing accusations of bias, are shutting down.

Given all this, I have a prediction: AI-generated misinformation will not be a major problem in the 2024 campaign. But that’s only because so many other forms of misinformation are already so rife.

Speaking in economic terms, the problem with misinformation is demand, not supply. Consider, for example, the view that the 2020 election was stolen from former President Donald Trump. To explain what happened in simple terms, there was a demand for this misinformation, namely from some aggrieved Trump supporters, and there was also a supply, most prominently from Trump himself. Supply met demand, the issue was focal and visceral, and the misinformation has continued to this day.

No one needed an AI-generated fake video of state officials fabricating ballots (and indeed, quality videos of that kind were not then possible). Even simpler technologies, such as photo manipulation, were not driving the fake news. Rather, the critical element was that many Trump supporters wanted to believe that their candidate had been wronged, and so Trump provided a narrative of victimization. Unfortunately, no proof or even pseudo-proof was required — and objective evidence against Trump has not broken his support.

In other words: Misinformation is, in many cases, a fundamentally low-tech product.

Or consider the story that former President Barack Obama was not born in the US. It did not take off because someone forged a copy of an Indonesian birth certificate. Instead, many people approached the issue wanting to believe that Obama was not “a real American,” some dangerous tidbits were thrown their way, and off they went. The release of Obama’s US birth certificate did not convince them they were wrong.

Lies, misunderstandings, instances of self-deception: They have long been in excess supply. Blame China, Russia, social media, regular media, whomever. A potentially gullible person is already flooded with more lies in a single day than he or she can possibly evaluate.

A greater number of falsehoods just won’t matter that much — because the scarce resources are attention and focality on the demand side. How much is someone looking to believe they have been wronged? How much do they resent “the establishment”? What kinds of grudges do they hold, and against whom or what? And how well can they coordinate with others of like mind, thereby forming a kind of misinformation affinity group?

AI should not be expected to worsen those problems, at least not through any obvious, first-order effects (of course, any major social change will have diverse ramifications through a wide variety of channels). If anything, large language models might give people the chance to ask for relatively objective answers.

It is also instructive to look at episodes of “misinformation” that may not have been misinformation at all. The COVID-19 lab-leak hypothesis initially was kept off mainstream social media, but it is now seriously debated and might even be true. It stayed alive in part because the supply side of misinformation was so plentiful. Many advocates of the hypothesis were honest truth-seekers, but there were also many scurrilous troublemakers. They served a useful function in this case, much as short sellers do in the market, even when their motives are not pure.

So what do we have for potential solutions? Fact-checking is neither financially sustainable nor journalistically nimble enough. “Education” is frequently proposed as a remedy, but often it is the more educated who articulate, spread and track conspiracy theories. The uneducated tend to be baffled by propaganda rather than persuaded by it.

The only long-term solution is transparent governance that solves some critical problems of the day, thereby boosting social trust. After winning World War II, for instance, the US government became more popular and more trusted, at least for a couple of decades. Good governance today might be more controversial, and might not yield results for a while, but it is probably the best option. A more functional world — whether that’s meant in economic or political terms — is probably a more trusting world.

Unfortunately, there is no simple way to combat misinformation. AI will add to the problem, but it is unlikely to make it significantly worse. The demand side is what matters. Trust is hard to build, but societies that have it will enjoy a significant comparative advantage.

Tyler Cowen is a Bloomberg Opinion columnist, a professor of economics at George Mason University and host of the Marginal Revolution blog.

The views expressed here are the writer’s and are not necessarily endorsed by the Anchorage Daily News, which welcomes a broad range of viewpoints.