Social media 'bots' from Russia distorting global politics: study

A wave of "computational propaganda," largely driven by Russia, is impacting politics around the world by spreading misinformation designed to manipulate public opinion, researchers said Tuesday. The Oxford University team presented research in Washington on the use of automated programs or "bots" on social media aimed at influencing politics in nine countries, including the United States. "Computational propaganda is one of the most powerful new tools against democracy," said the research paper directed by Oxford's Philip Howard and Samuel Woolley. The research is not the first to note the existence of Twitter bots and other automated tools aimed at disrupting politics but offers insight into the global scale of efforts, which are traced mainly to Russia but also operate in China and in the target countries themselves. "We know that there is a building with hundreds of employees in St. Petersburg with a budget of millions of dollars dedicated to manipulating public opinion" in a number of countries, Howard said at a media presentation. Howard said the Russian style of propaganda involves "seeding multiple, conflicting and contradictory stories." Woolley said the goal of this effort "is to confuse, it's not necessarily to sell a fake story. It's to make people so apathetic about politics and policy in general that they don't really want to engage anymore." The research team analyzed tens of millions posts on seven different social media platforms during elections, political crises, and national security incidents between 2015 and 2017 in Brazil, Canada, China, Germany, Poland, Taiwan, Russia, Ukraine, and the United States. - Social media battles - While propaganda and fake news are longstanding tools in politics, the use of automation and algorithms to create bots on social media appears to have accelerated the spread of misinformation. Platforms like Facebook and Twitter have taken steps to curb the spread of fake news stories while also arguing it is not their role to edit or control content. The researchers said Twitter is more vulnerable to bots because it allows users to set up anonymous accounts and its programming platform is open. In the United States, the researchers said they concluded that bots had "measurable influence" during the 2016 election by affecting the flow of information. "Social media bots manufacture consensus by artificially amplifying traffic around a political candidate or issue," the researchers wrote. "Armies of bots built to follow, retweet, or like a candidate's content make that candidate seem more legitimate, more widely supported, than they actually are... the illusion of online support for a candidate can spur actual support through a bandwagon effect." In Russia, the researchers said they found 45 percent of the political conversation is dominated by "highly automated accounts." While Twitter was an effective tool for pro-democracy activists during the Arab Spring movements starting in 2010, the researchers say authoritarian governments now use these platforms to suppress social activism. Perhaps the most flagrant examples of computational propaganda are in Ukraine, they said, describing it as "the frontline of numerous disinformation campaigns in Europe." They said fake stories such as one about "a crucified boy" or another about Ukrainian soldiers being paid with "two slaves and a piece of land" have turned into "textbook examples of how propaganda works."