Saturday 25 August 2018

Russian trolls and Twitter bots exploit vaccine controversy

A nurse prepares to administer the measles, mumps and rubella vaccine at Children’s Primary Care Clinic in Minneapolis. (Courtney Perry for The Washington Post)

By Carolyn Y. Johnson, Reporter
August 23 at 3:54 PM

Public health experts battling dangerous misinformation about the safety of vaccines have a new foe: Twitter bots and Russian trolls.
Researchers found bots and Russian trolls mentioned vaccines more often than the average Twitter account over a three-year period, but for different reasons. Russian trolls stoked the debate by tweeting pro- and anti-vaccine messages in an apparent attempt to sow division, while bots that spread malicious software appeared to use anti-vaccine messages that inflame strong responses from both sides to attract clicks.
“Apparently only the elite get ‘clean’ #vaccines. And what do we, normal ppl, get?! #VaccinateUS,” a Russian troll account tweeted in one of the messages that stood out to researchers because of the unusual line it drew between vaccine fearmongering and income inequality.
“That’s not something you see from an antivaxxer,” said David Broniatowski, an engineer at George Washington University who led the research published in the American Journal of Public Health. “Elites getting clean vaccines — we thought that was very unique to the Russian trolls and could be interpreted as an attempt to link vaccination to a specific division within American society.”
The trolls were users connected to a Russian propaganda effort run by the Internet Research Agency, a troll farm. They were identified by the researchers through a list released by Twitter to Congress that was further refined by NBC. Twitter bots were identified through bot repositories and spanned a range including spambots that are clearly non-human and “content polluters” that spread malware or try to scam people.
Between 2014 and 2017, Twitter accounts identified as Russian trolls were 22 times more likely to tweet about vaccines than the average user. The tweets fell on both sides of the vaccine debate and appeared aimed at sowing discord by amplifying the debate — even though the vast majority of Americans believe vaccines’ benefits outweigh their risks. The tweets also appeared to link the debate to other divisions within American society, such as class and racial divisions.
But these didn’t appear to be anti-vaccine accounts mounting a deliberate attack on American public health; only one out of 550 tweets by Russian trolls touched on vaccines, the researchers found.
These trolls may be flitting between divisive topics. Renee DiResta, who studies computational propaganda with the grass-roots collective Data for Democracy, said her own research suggests Russian trolls have fixated on the vaccine debate not to seed a public health crisis but to exploit its divisiveness.
Tweets she discovered from September 2017, for example, connect vaccine denial to racial anxieties: “Diseases Expert Calls for White Genocide Since Most Vaccine Deniers are White,” several Russian trolls tweeted.
“It’s opportunism — opportunistically amplifying controversial topics,” DiResta said.
She pulled up one Russian bot, @WadeHarriot, that bounced between various themes (anti-gay, pro-Ted Cruz, anti-Obama, pro-Trump) until, in early 2017, it began spouting anti-vaccination ideas. Only 38 of that user’s 6,000 tweets were about vaccines. They began right around the time the topic surfaced in the general news cycle, when vaccine critic Robert F. Kennedy Jr. announced that President Trump had asked him to head a new commission on vaccines. That commission never came into being.
Bots focused on “content pollution” — circulating malicious software — were more likely to talk about the harms of vaccines, suggesting they may be trying to take advantage of a high-profile topic to get people to click on a link and accidentally download malware or be sent to a fraudulent website that would ask for a password. Run-of-the-mill spambots that are clearly not human, however, were less likely to talk about vaccines than average users.
The researchers could not determine to what extent this stream of messages influenced people’s behavior or contributed to broader anti-vaccine sentiment.
Their work strikes a cautionary note about the best way to fight misinformation. Because Russian trolls may be trying to stir up debate, battling them directly could run the risk of “feeding the trolls,” Broniatowski said, polluting social media with angry messages that could amplify the discussion even more and give the false impression that the sides are evenly split.
Public health officials tend to focus on education to combat anti-vaccine messages, but given that this content isn’t always coming from real people, tech companies’ ongoing efforts to weed out bots and accounts misusing their platforms could be another powerful tool.
The vast majority of vaccine-related tweets analyzed in the study came from users that couldn’t conclusively be categorized as people, cyborgs or bots by a tool called “Botometer.” Those actors might be people who really believe vaccines are dangerous, trolls trying to exploit the controversial topic to deepen divisions in American society, or bots using controversy as clickbait.
In response to the study, Twitter pointed to a blog post describing its efforts to create tools that can identify “spammy or automated accounts automatically.” In May, the company said it challenged more than 9.9 million such accounts per week. The company also takes measures to ensure automated content can’t be discovered by general users.
Jon-Patrick Allem, a research scientist at the Keck School of Medicine at the University of Southern California, said that to design effective interventions, it will be necessary to understand how or whether this online behavior affects real people — and that much of the research into public health messages on social media is still stuck in the first phase of quantifying the problem.
“The research I’ve been exposed to — and a lot of it is in the works right now — shows that Russian-backed accounts were designed with the intention to create discord on very particular and very nuanced issues in American culture,” Allem said.
But the risk of false information doesn’t come only from foreign governments. Allem has found that social bots are much more likely than real accounts to put out messages claiming they quit smoking with the help of e-cigarettes, even though there isn’t convincing evidence that e-cigarettes help smokers quit. He also worries bots could disseminate incorrect information to serve a variety of interests, perhaps spreading misinformation about how effective medications are or promoting supplements and vitamins.
“It’s hard to identify the puppet master behind social bots,” Allem said.
