Tiny number of ‘supersharers’ spread the vast majority of fake news
Less than 1% of Twitter users posted 80% of misinformation about the 2020 U.S. presidential election
30 May 2024 | By Kai Kupferschmidt, Science News
About 7% of all political news about the 2020 presidential election shared by a sample of 660,000 U.S. users of X (formerly Twitter) came from fake news websites. SPENCER PLATT
Did you see the article claiming Kamala Harris joked about killing Mike Pence and Donald Trump? Or the one about large numbers of Trump votes being secretly switched to Joe Biden? If stories like these, run by fake news sites such as Infowars or Gatewaypundit, popped up in your social media feed during the 2020 U.S. presidential election, they probably came from a tiny group of people with a massive impact.
A mere 2000 or so “supersharers” spread 80% of content from fake news sites in a sample of more than 600,000 U.S. voters on X (formerly Twitter), according to an analysis published today in Science. The posters were more likely to be women and older—challenging the stereotype of social media manipulators as young, alt-right men—and they had a huge reach: More than one in 20 users in the data set followed at least one of these supersharers.
The research “is a valuable addition to our understanding of who shares unreliable news on social media,” says Brendan Nyhan, a political scientist at Dartmouth College who was not involved in the work. It also points to a possible solution, he says: “Simple limits on retweets would constrain the spread of this information while having little effect on the vast majority of users.”
The new findings back up previous studies. In 2019, for example, Nir Grinberg, a computational social scientist at Ben-Gurion University of the Negev, and colleagues showed that in a sample of more than 16,000 Twitter users taken around the 2016 U.S. presidential election, 80% of tweeted news from untrustworthy websites came from just 16 users. But who were these supersharers?
To find out, Grinberg’s team dove into a far bigger data set comprising 660,000 U.S. X users who used their real name and location, allowing the researchers to match them with voter registration data. About 7% of all political news shared by these users on any given day came from untrustworthy websites such as Infowars and Gatewaypundit, the researchers found. And just 2107 users were spreading 80% of the fake news.
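The article doesn't show the team's code, but the concentration measure behind the "2107 users, 80% of the fake news" figure is simple to reproduce in principle: rank users by how much fake news they share, then count how many accounts it takes to cover 80% of all shares. Here is a minimal sketch of that calculation on invented, heavy-tailed data (not the study's dataset):

```python
# Minimal sketch (hypothetical data, not the study's pipeline): rank users
# by fake-news shares and count how many cover 80% of the total.
import numpy as np

rng = np.random.default_rng(0)
# Invented per-user counts of fake-news shares, drawn from a heavy-tailed
# distribution to mimic the skew the study describes.
shares = rng.pareto(a=1.2, size=660_000)

ranked = np.sort(shares)[::-1]                 # heaviest sharers first
cumulative = np.cumsum(ranked) / ranked.sum()  # running share of all posts
n_supersharers = int(np.searchsorted(cumulative, 0.80)) + 1

print(f"{n_supersharers} of {len(shares)} users "
      f"({100 * n_supersharers / len(shares):.2f}%) account for 80% of shares")
```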
The average supersharer was 58 years old, 17 years older than the average user in the study, and almost 60% were women. They were also far more likely to be registered Republicans (64%) than Democrats (16%). Given their frenetic social media activity, the scientists assumed supersharers were automating their posts. But they found no patterns in the timing of the tweets or the intervals between them that would indicate this. “That was a big surprise,” says study co-author Briony Swire-Thompson, a psychologist at Northeastern University. “They are literally sitting at their computer pressing retweet.”
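The article doesn't say exactly how the researchers tested for automation, but a common heuristic is to check whether an account's posts arrive at machine-like regular intervals; scheduled bots tend to post on a clock, humans do not. A hypothetical sketch of that idea (the function, threshold, and data are illustrative assumptions, not the study's method):

```python
# Hypothetical automation check: flag accounts whose inter-post intervals
# are suspiciously regular (low coefficient of variation).
import numpy as np

def looks_automated(timestamps, cv_threshold=0.1):
    """Return True if posting intervals look machine-scheduled.

    timestamps: 1-D array of post times in seconds.
    cv_threshold: illustrative cutoff; a near-zero coefficient of
        variation means the gaps between posts are almost identical.
    """
    gaps = np.diff(np.sort(np.asarray(timestamps, dtype=float)))
    if len(gaps) < 10:                  # too few posts to judge
        return False
    cv = gaps.std() / gaps.mean()       # 0 for a perfectly regular schedule
    return cv < cv_threshold

# A bot posting exactly once an hour vs. a human with ragged gaps:
bot = np.arange(0, 50 * 3600, 3600)
human = np.cumsum(np.random.default_rng(1).exponential(3600, size=50))
print(looks_automated(bot), looks_automated(human))   # True False
```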
“It does not seem like supersharing is a one-off attempt to influence elections by tech-savvy individuals,” Grinberg adds, “but rather a longer term corrosive socio-technical process that contaminates the information ecosystem for some part of society.”
The result reinforces the idea that most misinformation comes from a small group of people, says Sacha Altay, an experimental psychologist at the University of Zürich not involved with the work. “Many, including myself, have advocated for targeting superspreaders before.” If the platform had suspended supersharers in August 2020, for example, it would have reduced the fake election news seen by voters by two-thirds, Grinberg’s team estimates.
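The article reports only that headline estimate, but the logic of such a counterfactual can be made concrete: tally how much fake-news exposure (posts weighted by audience size) traces back to supersharers, then remove it. A hypothetical sketch extending the earlier one, with invented follower counts rather than the study's actual network data:

```python
# Hypothetical counterfactual: what fraction of fake-news exposure would
# disappear if supersharer accounts were suspended? All data are invented.
import numpy as np

rng = np.random.default_rng(3)
n_users = 660_000
fake_posts = rng.pareto(a=1.2, size=n_users)                  # invented share counts
followers = rng.lognormal(mean=5.0, sigma=2.0, size=n_users)  # invented audiences

# Exposure proxy: each post is seen by the poster's followers.
exposure = fake_posts * followers

# Define supersharers as the top posters covering 80% of fake-news posts.
order = np.argsort(fake_posts)[::-1]
cumulative = np.cumsum(fake_posts[order]) / fake_posts.sum()
cutoff = int(np.searchsorted(cumulative, 0.80)) + 1
is_super = np.zeros(n_users, dtype=bool)
is_super[order[:cutoff]] = True

reduction = exposure[is_super].sum() / exposure.sum()
print(f"Suspending {cutoff} supersharers removes {reduction:.0%} of exposure")
```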
Another way to limit supersharing would be to cap users' daily number of retweets. If set at 50, such a cap would affect close to 90% of the fake news supersharers in the study, the researchers found, whereas only 1% of users overall would run into the limit. "I do not see a lot of benefit in allowing people to send unrestricted amounts of retweets in a day," Grinberg says. And the limit would not have to be absolute, says Stephan Lewandowsky, a psychologist at the University of Bristol who was not involved in the work. Instead, X could simply ask users whether they really want to retweet something, making the process a little more cumbersome.
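To make that arithmetic concrete, here is a hypothetical sketch of the cap analysis on invented data; the distributions and parameters are assumptions chosen only to echo the study's reported figures (roughly 90% of supersharers versus 1% of all users hitting the limit):

```python
# Hypothetical sketch of the 50-retweets-per-day cap analysis.
# The daily retweet maxima below are invented, not the study's data.
import numpy as np

rng = np.random.default_rng(2)
all_users = rng.lognormal(mean=1.0, sigma=1.2, size=660_000)   # light activity
supersharers = rng.lognormal(mean=5.0, sigma=0.8, size=2107)   # heavy activity

CAP = 50
print(f"supersharers hitting the cap: {(supersharers > CAP).mean():.0%}")
print(f"all users hitting the cap:    {(all_users > CAP).mean():.1%}")
```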
Whether that kind of intervention works will depend on how motivated these misinformation spreaders are. “After the [2019] paper, the big question was: ‘Who are these supersharers?’” Swire-Thompson says. “Now the big question is: ‘Why are they doing what they’re doing?’”