
Rich Uncle Pennybags didn’t wear a monocle — and other false memories


During a recent meeting with a colleague, I noticed a black top hat sitting on his bookshelf and jokingly suggested he start wearing it to greet people entering the office. When my colleague compared himself to the Monopoly man, I thought it would be hilarious to get him a monocle. I even searched for a cheap one online. But after checking images of Rich Uncle Pennybags, I learned that the character in fact does not wear one. It turns out I’m not alone in my erroneous belief. The common misperception that the Monopoly man wears a monocle is but one example of the Mandela effect.

The Mandela effect represents a pop culture take on collective false memory, which occurs when large groups of people misremember the same thing or remember something that never actually happened. This phenomenon has implications stretching far beyond my misguided attempt to get a gag gift for my colleague. Granted, many popular examples of the Mandela effect seem harmless — remembering “Berenstein” Bears instead of “Berenstain,” or believing Darth Vader said “Luke, I am your father” — but when we consider collective false memory in connection with misinformation and disinformation, it raises much graver concerns.

In the case of COVID-19, we’ve seen an alarming number of people consuming medications not approved for its treatment, and we continue to see additional spikes in COVID cases in part because of misinformation about vaccine safety. 

Collective false memory works in tandem with another phenomenon arising from our media-saturated environment: the illusory truth effect. When information is repeated frequently, people become more likely to believe it is true — often regardless of the source’s credibility or evidence to the contrary.

In a recent TED Radio Hour segment, Dr. Jen Gunter, an OB/GYN and women’s health advocate who addresses misinformation about health and wellness, noted that social media and the 24-hour news cycle represent a “toxic combination that lets misinformation just metastasize. … We all mistake repetition for accuracy.” In other words, when people hear an idea repeatedly in an echo chamber created by their social media networks, they are even more likely to believe it. With nearly half of all American adults seeking news on social media, there are ample opportunities for illusory truths to become entrenched collective false memories.

As a rhetoric and writing studies professor, I’d like to say that I know the solution to collective false memory and illusory truth: research. In my classes, I encourage students to seek multiple forms of information before drawing conclusions about an issue. 

And certainly in our current networked environment, getting facts seems easy enough. We have no shortage of information. But ironically, information overload makes sorting and processing the millions of sources generated by a simple search especially challenging. The disturbing truth is that even when we want reliable, high-quality information, we may end up sharing misinformation.

Worse, we tend to think we’re not doing so. According to a recent Pearson/AP-NORC poll, 95 percent of Americans believe misinformation is a problem, but only 20 percent are very or extremely worried they’ve personally spread it. 

As a first step, we should acknowledge that we are all part of the problem. After all, it takes significant energy, knowledge and time to fact-check something or investigate questionable ideas. Since we already struggle to sort through the overabundance of data we encounter on a regular basis, we are more likely to accept repeated claims as truth, especially if they align with our beliefs. It’s much easier and more emotionally satisfying to travel the same neural pathways than it is to form new ones, and research has shown our brains are not wired to take the road less traveled.

Second, if misinformation is part of our new information economy, we must account for collective false memory, the illusory truth effect, and other phenomena that make us susceptible to deceptive sources. And we should be aware that many media platforms take advantage of our susceptibility.

To combat these forces, we can be proactive and use resources such as the Bad News Game and Go Viral! to learn about how misinformation and disinformation spread. We can also be reactive and use the participatory nature of social media in positive ways. Point out misinformation when you see it and support others doing the same. If you have expertise in a particular area, share your perspective and direct people to credible sources. 

Pushing back on misinformation could mean we simply resist getting swept up by the constant flow of information. Pause, look past the snazzy graphics or catchy memes, and view news feeds with a bit of skepticism. Slow down at least long enough to ask, “How do I know this is true? Why do I want to share it? What impact will it have?” If any of these questions is difficult to answer in the moment, maybe hold off on sharing. 

The burden, however, cannot be placed solely on consumers to prevent the spread of misinformation. We must call on our policymakers to hold media outlets accountable for how information circulates on their platforms and to expand existing efforts to protect users from disinformation and manipulation. 

We might not be able to change the cognitive biases underlying our susceptibility to misinformation. But we can take control of our media habits and avoid fueling the illusory truth effect or reinforcing collective false memories. For my part, I suppose I’ll start by getting my colleague what Mr. Monopoly actually wears: a bow tie.

Dr. Jessica Lyn Bannon is an associate professor of English and director of the writing lab at the University of Indianapolis. She is a Public Voices Fellow of the OpEd Project.