Vaccinate against misinformation superspreader

in Features by Kayla Rubenstein
2020 brought to light more than just a global pandemic: it also highlighted the manipulation of information currently plaguing the media. (Graphic/Georgetown University)

This past year, a pandemic ravaged the world. Events and people became COVID-19 superspreaders, transmitting the virus rapidly. However, there’s another disease plaguing the 21st century: opinion-based misinformation.

On social media this year, infographics centered on the pandemic, election and social justice scattered across many users’ feeds. While a graphic’s aesthetic may appeal to viewers and make the information seem more believable, a Harvard study found that people who consume news from social media are less likely to believe facts regarding COVID-19.

To try to combat this, social media sites began to label factually false information as such. However, an MIT study presents an underlying problem associated with the terms “fake news” and “truth.” MIT Sloan Professor David Rand and co-author Gordon Pennycook investigated how this label alters perception. 

The pair found something they call the “implied truth effect,” in which people assume all information without a label is true. If headlines featuring faulty information don’t carry a misinformation warning, consumers automatically assume they’re true.

“Fake news” and misinformation peak during election years, according to MIT Sloan. On Twitter, falsehoods are 70% more likely to be retweeted than the truth. Yet Statista finds that only 26% of Americans are very confident in their ability to recognize fake news, even though 61% of millennials read political news on Facebook.

What all these numbers add up to is the digital pandemic of social media superspreading misinformation. A “vaccine” does exist, though, and it begins at an individual level. 

Rand and Pennycook’s 2018 study found that “people who share false information are more likely distracted or lazy, rather than biased.” In other words, readers can “wear a mask” and help stop the spread of false information simply by researching before sharing.

Similarly, recognize how personal bias shapes perception. Social media bots are designed to draw from your search history to display specific content that appeals to their target audience. As such, social media ads can exploit bias to further skew opinion.

Signs of misinformation include clickbait headlines, such as “20 ways to bypass Apple Classroom (you won’t believe #19!),” and satirical or propaganda-based articles. These types of pieces employ, respectively, subtle or excessive irony and emotional appeals to encourage readers to feel a certain way, then capitalize on that with skewed statistics and half-truths. 

If you encounter misinformation, don’t keep scrolling; report it. Although the process on each platform varies, most include an easy-to-follow “report” button. Verizon offers step-by-step instructions on how to do so. 

Keep in mind that you control the content you see. The ads on your page derive from your internet history, so avoid and unfollow sources that contaminate your feed with lies.

While social media has connected the world, it has also provided a network in which a digital pandemic has arisen. Recognize the dangers of social media, and do your part in ending social media manipulation.

As a senior, Kayla Rubenstein spends her fourth (and heartbreakingly final) year on staff as Online Editor-in-Chief, Business Manager and Social Media Correspondent. Wanting to make the most of her senior year, Kayla serves as the President of Quill and Scroll, Historian of Rho Kappa and Co-Historian of NHS, while also actively participating in EHS and SNHS. Outside of school, Kayla contributes to Mensa’s publications and volunteers with different organizations within her community. An avid reader, Kayla can often be found with her nose in a book when not working on an article for The Patriot Post or developing a project for iPatriot Post.