
Lofty aspirations for Facebook groups risk linking the wrong people

QAnon - SANDY HUFFAKER/AFP via Getty Images

It was half-time, February 2020, and the San Francisco 49ers were tied 10-10 with the Kansas City Chiefs; the biggest prize in American football was still all to play for.

While the teams regrouped, Facebook was making its own big play. In its first-ever Super Bowl advert, the globe-spanning social network showed off a medley of Facebook groups relating to the word “rock”: extreme climbers in Utah, replica Stonehenge builders and a Rocky Balboa fan club gatecrashed by Sylvester Stallone.

“Whatever you rock”, the closing titles promised, “there is a group for you”, before flashing the slogan: “More together.”

It was the culmination of a three-year plan by chief executive Mark Zuckerberg to fix the problems that had sown such chaos in the 2016 US election. Back then, Facebook’s passion-seeking content selection algorithms had filled voters’ feeds with unchecked partisan hoaxes and Russian trolling.

Zuckerberg’s answer was to rebuild the news feed around personal connections such as friends, family and groups. If Facebook could shift 900m more users into “meaningful” groups, he declared, it would “strengthen our social fabric and bring the world closer together”. What could go wrong?

The answer may not surprise you. The “boogaloo” militia subculture, linked to at least seven terror plots and attempts at violence; the QAnon conspiracy movement; an armed confrontation in Wisconsin that killed two people; rampant false wildfire rumours that overloaded first responders on the US west coast; and widespread Covid-19 denialism. Each of these fissures has, or is alleged to have, roots in Facebook groups.

“I’m trying not to overstate this, but I’m more nervous about Facebook groups than any other function on any other social media platform when it comes to this election,” says Jesse Lehrich, who managed the Hillary Clinton campaign’s response to Russian meddling in 2016 and now works for Accountable Tech.

“When I have my nightmares of all the worst-case scenarios around election day, or while the vote counting is happening, Facebook groups are the worst vector of disinformation and incitement.”

Previously, Facebook had shut down groups it considered dangerous or whose members had repeatedly broken its rules, and restricted those that hosted too much fake news.

Last week, however, it launched a more radical crackdown, removing all health groups from its automatic recommendations; de-prioritising content from potentially dangerous groups in users’ news feeds; and threatening to put any member who breaks a rule within a group on a kind of 30-day cyber probation.

The problem, Lehrich argues, is that this depends on Facebook catching rule-breakers in the first place. The company does much of its policing via artificial intelligence, which peers inside righteous and wicked groups alike and collared almost 12m pieces of hate speech and hate group propaganda over the past year. But Facebook does not release any estimates of how much it might have missed.

Compared to ordinary users, groups are highly resistant to censorship. Many watch and learn from Facebook’s interventions, constantly varying their code words and swapping tips on how to avoid the all-seeing algorithm. Some also use their join-up questionnaires to interrogate new users about their beliefs, or even make them pledge never to report group content to Facebook.

Once inside, safe in a bubble of people who think as they do, members can fall foul of age-old group dynamics: confirmation bias, groupthink and “group shift”, in which a group’s opinions become progressively more extreme over time. Much of this happens out of sight, leaving outside researchers in the dark.

“From the stories we have, the private groups seem to be particularly problematic, but we can’t measure that as researchers because we don’t have any access to them,” says Kate Starbird, a professor at the University of Washington in Seattle.

Facebook itself has often helped extremist groups grow via its own recommendation algorithms. One internal study in 2016 found that the company’s suggestions were responsible for 64pc of new members joining extremist groups. During the pandemic, as millions mouldered at home glued to their screens, these systems forged connections not only within movements but between them, linking anti-vaccine groups with QAnon, and QAnon with boogaloo.

It is an example, says Starbird, of how Facebook’s algorithms do not just recognise a pre-existing social reality but actually create new ones.

That is the crowning irony of Facebook groups. Zuckerberg has long proclaimed it his mission to “connect the world”. Now his service appears to be connecting like-minded paranoids en masse.

“It’s not just about the ideology, it’s about the community and the identity,” says Starbird, who began her career studying social media users who responded to crises, and who sees many similarities in online cults such as QAnon. “They feel like they’re doing something good for the world; that they are helping. This is a mission of theirs, and they’re doing it together.”

Or, as Facebook might put it, more together.