A recent investigation by researchers at Penn State has shed light on a disconcerting behavior among Facebook users. The team sifted through more than 35 million public posts containing widely shared links posted between 2017 and 2020.
Astonishingly, around 75% of these shares occurred without users clicking on the links first.
This trend transcended partisan lines, with both politically charged content and neutral information being shared extensively without prior engagement.
The findings underscore an alarming reality: political posts, regardless of their stance, tend to circulate on the strength of headlines and brief summaries rather than their full content.
The lead author of the study, S. Shyam Sundar, a distinguished professor of media effects at Penn State, expressed surprise at the results.
Contrary to his expectations that those sharing content had taken the time to read and consider it, the data painted a different picture.
The research relied on Facebook data obtained through Social Science One, a research partnership hosted at Harvard University, which also provided insight into user demographics and behavior patterns.
To deepen their understanding, the researchers created a political page affinity score that placed users along a spectrum from very liberal to very conservative, based on the pages they followed.
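The article does not give the study's exact scoring formula, so the following is only a minimal sketch of one plausible construction, assuming every followed page already carries an ideology rating; the page names, the -2 to +2 scale, and the simple averaging are illustrative assumptions, not the researchers' actual method.

```python
# Hypothetical sketch of a page-affinity score: average the ideology
# ratings of the pages a user follows, then map the result onto the
# liberal-to-conservative spectrum described in the study.
from statistics import mean

# Made-up page ratings on a -2 (very liberal) .. +2 (very conservative) scale.
PAGE_IDEOLOGY = {
    "liberal_news_page": -2.0,
    "centrist_news_page": 0.0,
    "conservative_news_page": 2.0,
}

def affinity_score(followed_pages: list[str]) -> float:
    """Average the ideology ratings of the pages a user follows."""
    rated = [PAGE_IDEOLOGY[p] for p in followed_pages if p in PAGE_IDEOLOGY]
    return mean(rated) if rated else 0.0

def affinity_bucket(score: float) -> str:
    """Discretize the continuous score into five illustrative bands."""
    if score <= -1.5:
        return "very liberal"
    if score <= -0.5:
        return "liberal"
    if score < 0.5:
        return "neutral"
    if score < 1.5:
        return "conservative"
    return "very conservative"

# A user following one liberal and one centrist page lands at -1.0, i.e. "liberal".
print(affinity_bucket(affinity_score(["liberal_news_page", "centrist_news_page"])))
```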
The researchers also used machine learning to classify the shared links as political or non-political, based on the presence of political language in the linked material.
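The coverage does not name the model the researchers used, so the snippet below is only an illustrative baseline for flagging political language in link text, assuming a TF-IDF bag-of-words representation and a logistic-regression classifier from scikit-learn; the training headlines and labels are invented for the example.

```python
# Illustrative baseline: learn to separate political from non-political
# link text from a handful of labeled examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny made-up training set: link text labeled 1 (political) or 0 (not).
texts = [
    "Senate passes sweeping election reform bill",
    "Governor vetoes controversial immigration law",
    "Ten easy weeknight dinner recipes",
    "Local team wins championship in overtime",
]
labels = [1, 1, 0, 0]

# TF-IDF unigram/bigram features feeding a logistic-regression classifier.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

# Score an unseen headline; with vocabulary overlap ("Senate", "bill"),
# the model should flag it as political.
print(clf.predict(["Senate votes on new immigration bill"]))  # expected: [1]
```

In the study's setting, a classifier along these lines would be applied to each shared link's text to produce the political labels that were then compared against users' affinity scores.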
The study highlighted a troubling trend: users were more likely to share political content that aligned closely with their beliefs, often without engaging with the information beforehand.
This habit raises concerns about the unintentional spread of misleading information: people seem inclined to propagate content that resonates with their views without verifying its accuracy. Alarmingly, conservative users accounted for nearly 77% of the shares of links to false information identified in the study.
In response to these findings, Sundar proposed that social media platforms could take proactive steps by implementing features that require users to confirm they have read the entire content before sharing.
He also suggested the introduction of alerts regarding potentially misleading information.
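As a rough illustration of how such friction could work, here is a toy sketch of a share gate, assuming the platform can record whether a user actually opened a link and whether a fact-checking pipeline has flagged it; every field name and message below is hypothetical.

```python
# Toy sketch of the sharing friction Sundar proposes: block shares of
# unread articles and warn on links flagged as potentially misleading.
from dataclasses import dataclass

@dataclass
class LinkPost:
    url: str
    user_opened_link: bool    # did the user click through before sharing?
    flagged_misleading: bool  # did a fact-check pipeline flag this URL?

def attempt_share(post: LinkPost) -> str:
    if not post.user_opened_link:
        return "blocked: open and read the article before sharing"
    if post.flagged_misleading:
        return "warning: this link may contain misleading information"
    return "shared"

# A share attempt without reading first is held back.
print(attempt_share(LinkPost("https://example.com/story",
                             user_opened_link=False,
                             flagged_misleading=False)))
```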
Moreover, he underscored the importance of people taking responsibility for discerning the quality of shared content.
The researchers hope the study will cultivate greater awareness and improve media literacy among social media users, since misinformation thrives in environments where people do not engage deeply with information.