
Reading beyond the headlines is rare for most on social media, study finds

FRIDAY, Nov. 22, 2024 (HealthDay News) — Three out of four times, your Facebook friends don’t read beyond the headline when they share a link to political content.

Experts say that’s surprising, and downright terrifying.

People who share without clicking may be unwittingly helping hostile adversaries whose goal is to sow division and mistrust, warned S. Shyam Sundar, professor of media effects at Pennsylvania State University.

“Surface processing of headlines and advertisements can be dangerous if false data is shared and not investigated,” said Sundar, corresponding author of the new study, published Nov. 19 in the journal Nature Human Behaviour.

“Misinformation or disinformation campaigns aim to sow the seeds of doubt or dissent in a democracy, and the extent of these efforts came to light in the 2016 and 2020 elections,” he added in a Pennsylvania State University press release.

To learn more about content shared on social media, his team analyzed more than 35 million public posts containing links shared on Facebook between 2017 and 2020. The links included political content from both ends of the spectrum, and that partisan content was shared without clicking more often than politically neutral content.

Although the study was limited to Facebook, the researchers said their findings are likely to apply to other social media platforms as well.

Data for the analysis was provided in collaboration with Facebook’s parent company, Meta.

It included user demographics and behaviors, including a “political page affinity score.” This was determined by identifying the pages that users follow.

Users were divided into one of five groups: very liberal, liberal, neutral, conservative, and very conservative.

The researchers then used AI to find and classify political terms in the linked content, scoring the content on that same scale, based on the number of shares from each affinity group.

The researchers manually classified 8,000 links as political or non-political, then used that data to train an algorithm that analyzed the 35 million links shared more than 100 times by Facebook users in the United States.
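The article describes this pipeline only at a high level, but the two steps above, a classifier trained on hand-labeled links and an alignment score derived from which affinity groups shared each link, can be sketched roughly as follows. This Python sketch is purely illustrative: the toy data, the function names, and the use of scikit-learn are assumptions made for the example, not the researchers' actual methods or code.

```python
# Minimal illustrative sketch (not the study's actual code): a text classifier
# trained on hand-labeled links, plus a share-weighted political-lean score.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Step 1: train a political / non-political classifier on manually labeled links.
# (The study hand-labeled roughly 8,000 links; these toy examples merely stand in.)
labeled_text = ["senate passes new tax bill", "ten easy weeknight dinner recipes"]
labels = ["political", "non-political"]
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(labeled_text, labels)

# Step 2: score a link's political lean from who shared it.
# Affinity groups run from very liberal (-2) to very conservative (+2).
AFFINITY_VALUES = {
    "very_liberal": -2, "liberal": -1, "neutral": 0,
    "conservative": 1, "very_conservative": 2,
}

def lean_score(shares_by_group: dict[str, int]) -> float:
    """Share-weighted average of group affinities, on the same -2..+2 scale."""
    total = sum(shares_by_group.values())
    if total == 0:
        return 0.0
    return sum(AFFINITY_VALUES[g] * n for g, n in shares_by_group.items()) / total

# Example: a link shared mostly by conservative-leaning users scores positive.
print(classifier.predict(["senate passes new tax bill"]))  # expected: ['political']
print(lean_score({"conservative": 800, "very_conservative": 150, "neutral": 50}))  # 1.1
```

Scoring a link by who shared it, rather than by its text alone, is what lets the researchers compare a link's political lean with the lean of the users passing it along.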

From this analysis, a pattern emerged that was maintained at the individual level.

“The closer the political alignment of the content was with the user, both liberal and conservative, the more it was shared without clicks,” said study co-author Eugene Cho Snyder, an assistant professor of humanities and social sciences at the New Jersey Institute of Technology. “They are simply forwarding things that seem on the surface to agree with their political ideology, without realizing that sometimes they may be sharing false information.”

Meta also provided data from a third-party fact-checking service, which flagged more than 2,900 links to fake content.

In total, these links were shared more than 41 million times, without being clicked, according to the study.

Of those shares, 77% came from conservative users and 14% from liberal users, and 82% of the links to false information came from conservative news domains, the researchers found.

Sundar said social media platforms could take steps to curb clickless sharing, for example by requiring users to acknowledge that they have read the content in full before sharing it.

“If platforms implement a warning that the content could be false and make users recognize the dangers of sharing it, that could help people think before sharing,” Sundar said.

However, it would not stop intentional disinformation campaigns, he added.

“The reason this happens may be because people are just bombarded with information and don’t stop to think about it,” Sundar said. “Hopefully, people will learn from our study and become more media literate, digitally savvy, and ultimately more aware of what they are sharing.”

More information

The American Psychological Association has more on misinformation and disinformation.

SOURCE: Penn State, press release, November 20, 2024


## World-Today News Interview: The Peril of Clickless Sharing

**Introduction:**

Welcome to World-Today News. Today, we’ll be discussing a recent study published in Nature Human Behaviour that sheds light on the alarming trend of “clickless sharing” on social media, particularly of political content. Joining us today are Dr. S. Shyam Sundar, Professor of Media Effects at Pennsylvania State University and corresponding author of the study, and Ms. Emily Jones, a digital literacy expert and social media consultant.

**Section 1: Understanding the Phenomenon**

* **Dr. Sundar, your study reveals that a startling 75% of political links shared on Facebook are passed along without the sharer ever clicking on them. What were the most striking findings of your research?**

* **Ms. Jones, from a digital literacy perspective, what makes this “clickless sharing” behavior so concerning?**

* **What are the potential consequences for both individuals and society as a whole when political information is disseminated this way?**

**Section 2: The Role of Political Affinity**

* **Dr. Sundar, your research indicates that people are more likely to share content that aligns with their political beliefs without clicking. Why do you think this is the case?**

* **Ms. Jones, how can users develop a more critical approach to information online, particularly when it reinforces their existing viewpoints? What strategies can be used to combat confirmation bias?**

**Section 3: The Spread of Misinformation**

* **Dr. Sundar, your study found that a significant portion of clicklessly shared links led to demonstrably false information. What are the implications for the spread of misinformation in the digital age?**

* **Ms. Jones, how can social media platforms be more proactive in tackling the problem of misinformation? Are there any specific measures they could implement to discourage clickless sharing of potentially harmful content?**

* **What role can users play in identifying and reporting misinformation? What resources are available to help people verify the accuracy of information online?**

**Section 4: Fostering Responsible Online Engagement**

* **Dr. Sundar, what advice would you give to individuals who want to be more discerning consumers and sharers of online content?**

* **Ms. Jones, what are some practical steps that educators and policymakers can take to promote digital literacy and critical thinking skills among younger generations?**

* **Looking ahead, what are your hopes for the future of online discourse? How can we create a more informed and responsible digital environment?**

**Conclusion:**

Thank you both for sharing your valuable insights on this crucial issue. We hope this discussion has shed light on the dangers of clickless sharing and encouraged viewers to engage with online information more critically and responsibly. For more information on the study and digital literacy resources, please visit our website at world-today-news.com.
