Like most Americans, I consider disinformation to be a serious threat to our democracy and economy. Efforts to intentionally mislead by polluting the overall information environment undermine the capacity of individuals and organizations to engage in informed decision-making.
Ninety-one percent of Americans say fabricated news causes confusion about basic facts (Pew Research Center), and more than half of those surveyed view misinformation and disinformation as a “major problem” (Institute for Public Relations). The root causes of disinformation are extremely complex, but – and here’s some good news – the tools to slow its spread and mitigate its effects may be simpler than many of us think. I believe that each of us plays a critical role as a stopgap against efforts and materials designed to divide.
Unlike pollution in our physical environment, which typically originates from a small number of bad actors, all of us likely bear some responsibility for polluting the information environment. It’s important to realize that with a simple click and share, any of us can become polluters. For example, an MIT study indicates that false news stories are 70 percent more likely to be retweeted than true stories, and research conducted by McMaster University shows that repetition of misinformation increases perceptions of its accuracy.
While the media spotlight on disinformation intensifies, segments of society remain especially vulnerable. A survey of 21,196 people in all 50 states and the District of Columbia, conducted by researchers from Harvard University, Rutgers University, Northeastern University and Northwestern University, determined that those in the 18-to-24 age range had an 18 percent likelihood of believing a false claim, as compared with only 9 percent for those over 65.
As the parent of three young adults, ages 18 to 23, I regularly hear them and their peers repeat false information about COVID-19 vaccines. People of this age are not only among the most susceptible to disinformation, they’re also among the groups most likely to take a “wait and see” approach to being vaccinated. In fact, fully 25 percent are undecided about getting the vaccine.
One of the most toxic falsehoods about vaccination, which, for the record, already has been thoroughly debunked by the medical community, is that it can lead to infertility. This claim originated on 4chan, the anonymous, anything-goes online forum, and has since been amplified by fringe trolls and conspiracy theorists. And according to data from the Austin-based AI software company Yonder, which discovers the hidden groups controlling and spreading online narratives, foreign factions have now weaponized this “link” between vaccination and infertility. The information environment becomes polluted when falsehoods leap from the fringes to the mainstream, thanks to a host of likes, forwards and shares.
A lengthy, complex list of behavioral, political, economic, communication and media issues shapes the formation and proliferation of disinformation. When information “tastes good” and is repeated with sufficient frequency, some will believe it – regardless of whether it’s accurate. In many instances, according to a recent study, people don’t share false information out of politics or pleasure. They do it because they don’t stop to think about what they are reading and sharing.
As someone who works in the public relations profession, I have a responsibility to help create a healthier information environment by supporting the dissemination of truthful, honest and accurate information, in accordance with the Public Relations Society of America (PRSA) Code of Ethics. In this capacity, I and my fellow public relations practitioners can help educate the public on this topic and help create information advocates. By committing to a few simple steps, consumers of news and information can act to weaken the influence of disinformation. Studies indicate that if every American simply paused before sharing information or, better still, took the time to verify it, we would significantly curtail the spread of inaccuracies and falsehoods. It may be difficult to believe that a simple nudge to pause could actually alter the information landscape, but that’s exactly what multiple studies indicate.
Just as we take annual precautions against influenza, we can inoculate ourselves against misinformation. And in so doing, we build up our mental and media muscles. My professional association, PRSA, is helping connect news consumers with resources, including a free media literacy program and various free simulations and games, which have proved effective at building disinformation immunity. Consider a study from Cambridge University, which showed that playing just one session of the game Bad News reduced participants’ perceived reliability of fake news by an average of 21 percent. These tools are accessible at https://voices4everyone.prsa.org/.
When deception goes unchecked, civil discourse – including voices that are diverse, discerning and challenging – becomes muted, and the marketplace of ideas is compromised. The true danger of disinformation isn’t a decrease in factual accuracy. Rather, it’s that people can be deceived into acting on a falsehood or, in some cases, confused into a state of indecision and apathy.
While we may not be able to stop bad actors from seeding the environment with falsehoods, data show we can limit the spread of disinformation. With the help of a simple pause, a media literacy course, and a five-minute game, we can make better decisions about what we share and begin to improve the information environment. Each of us can make a difference in altering the disinformation landscape. I challenge each of you to commit to these three simple steps and become an information advocate.