We live in an era of distrust. People of all backgrounds and political affiliations are increasingly distrusting the individuals, the platforms, and even the news media companies that present them with information. And on top of that, we’re learning about malicious actors intentionally trying to manipulate the public with bad information.
We entered the 2000s believing we were building the information age, but it has turned into something closer to the “misinformation age.” Is the truth dead? What’s responsible for this problem, and how can we move forward with a reliable sense of what’s true?
Information accessibility seemed like it would immediately democratize news (and information in general). And to be sure, this has inspired a wave of transparency across some industries and organizations. However, this isn’t exclusively a “good” thing for truth.
For starters, accessibility works both ways. Thanks to the internet, anyone can search for anything they want—meaning they can conduct their own research. But it also means that anyone can create any kind of content they want—which means they can write a blog post about how all U.S. presidents have secretly been cyborgs and nobody can stop them.
The combination of these effects means it’s possible for anyone to find almost any information. With confirmation bias and information accessibility in play, you can find a source that backs up your beliefs — no matter how wildly untrue they are.
This plethora of information, filtered through bias and delivered directly to your favorite sites, leads to echo chambers. An echo chamber is any community or platform where people tend to share the same opinions, limiting the influence of novel or competing ideas.
For example, if you frequently post on a forum about how great the Star Wars prequels were and you ban anyone who says they were awful, you’ll find yourself in a self-sustained community that only believes the Star Wars prequels were good.
Echo chambers have devastating effects, especially on a large scale. For starters, people in an echo chamber don’t know they’re in an echo chamber, and they’re inclined to double down on their beliefs — regardless of how much they do or do not reflect reality. On top of that, echo chambers have a tendency to ridicule and exaggerate the beliefs of others, resulting in distorted thinking about whatever the “other side” is.
If your echo chamber is about the quality of the Star Wars prequels, this is no big deal. The danger comes when the beliefs an echo chamber reinforces as “true” carry real-world consequences. What is the consequence of doubting the validity of a pandemic or questioning the legitimacy of a political candidate? People can be inspired to take drastic, and sometimes violent, action.
Echo chambers don’t change the truth itself; they act as a filter that eventually leads to distorted thinking. They make it harder to find the truth, or to be sure that what you’re reading is true.
We also need to look at the complicated interactions between human attention spans, the news cycle, and the current state of news media overall. Over time, the news model has changed. The rollout of 24-hour news broadcasts, early in the internet’s development, incentivized media companies to release new information as quickly as possible; in other words, they began to favor speed over accuracy.
These days, the effects are even more severe. The majority of people who get their news via social media don’t read full articles; instead, they read headlines and assume they know the rest of the context. This is part of what makes “fake news” so powerful; despite being riddled with obvious inaccuracies in the body of an article, a sufficiently emotion-provoking headline can rouse the masses.
Speed isn’t the only problem with modern news media. We also need to look at the incentive structures currently in place for both journalists and newspapers. The old news model relied on paying subscribers; if your newspaper had a good reputation, people would subscribe to it, and you’d be able to make money.
These days, if you want to succeed, you need to provide your content for free and make money with advertising. The only way to get advertising dollars is to get clicks to your website. And the only real way to get clicks to your website is to be sensational; you have to release new content faster than your competitors, evoke strong emotions, and inspire curiosity. And none of these goals has anything to do with providing the truth.
That’s not to say there aren’t good journalists or news media companies out there. But the ones best at finding the truth end up getting far less attention and far less money under this model.
We tend to rely on trustworthy websites as our barometer for truth. If we read something wild on the internet, we turn to a source we know and trust to see if what we read is accurate. But who’s checking the fact-checkers?
Fact-checking websites and “authoritative” sources are often an extension of the echo chamber effect. Even some of our most trusted institutions and organizations have subtly become biased over the years or have resorted to providing mere snippets of information rather than the full context of a situation. People use these sources not to research and discover more, but to quickly prove themselves right in the middle of an argument with someone else.
Everything is made even more complicated by advancements in technology that make it easier for people to distort the truth—or fabricate a new kind of truth entirely. New kinds of content are very exciting from an entertainment perspective, but terrifying from a truth management perspective.
Take deepfakes as an example. With deepfake technology, a person can replicate the face, expressions, and voice of a famous person, inserting them into previously existing footage. At its most innocent, this is a fun way to see how an actor like Tom Holland might have looked in an older movie like Back to the Future. But it could also be used to provide video “evidence” that a politician said something they never actually said.
There are more incentives to lie than there are incentives to tell the truth. Lying is more flexible, because there are more falsehoods than accurate ways to describe the recent past or current events. Lying allows you to persuade people in a way that moves closer to your agenda, whatever it happens to be. Lying is also easier, because you never have to do any research to back up your claims.
Countless actors, from media companies to foreign states, are willing to lie to achieve their goals—even if those goals are altruistic. User manipulation is just too powerful and too accessible to ignore.
Leadership can be highly influential. When a small handful of individuals and companies benefit from lying, manipulating the public, and segregating us into echo chambers, others are inclined to follow. This has created a kind of snowball effect in which more and more people are willing to bend or break the truth to suit their needs.
At a certain point, we need to ask ourselves how important the truth really is. On an individual level, the importance of truth is obvious; it leads to trust, helps us build bonds, and helps us access the things we need. But on a social level, does it really matter if one isolated group on the internet believes in a crazy conspiracy theory?
There are philosophical arguments to be made that the truth is subjective, that it can never be known, or that believing an untruth isn’t inherently bad. However, it does seem like for our society to keep moving forward, we need some reliable way to evaluate the truth—if we can’t discover and distribute it ourselves.
So what can we do about this? The power and impact of deepfakes, fake news, and other tools of manipulation are so complex, so multifaceted, and so widespread that no single law could resolve them. Instead, we’ll have to rely on changing human perspectives and behaviors, and that’s something that tends to happen gradually, over time.
The age of misinformation is in full swing, and it’s likely to get worse before it gets better. But in time, we may rediscover the values most necessary for us to trust each other and collaborate.
Image Credit: aleksandar pasaric; pexels