
Facebook's own research warned its algorithms exploit 'divisiveness'

But company executives reportedly resisted efforts to make it less polarizing.


Facebook’s executives resisted internal efforts to make its platform less divisive, according to The Wall Street Journal.

The company’s internal research found that its algorithms “exploit the human brain’s attraction to divisiveness,” and could make the social network a more polarized place. In another piece of research reported by the WSJ, Facebook found in 2016 that 64 percent of users joining extremist groups on the platform did so as a result of the company’s recommendation algorithms.

The following year, Facebook launched “Common Ground,” a wide-ranging effort to make the social network less polarizing and “increase empathy” among users.

Yet the company ultimately decided not to act on many of the group’s recommendations, fearing doing so could fuel claims that Facebook was biased against conservative viewpoints. Other proposed changes were watered down, according to the report.

Facebook’s policy chief, Joel Kaplan, was reportedly particularly concerned with whether changes to News Feed rankings could further anger conservatives who already believed the company was biased against them. The WSJ also reports that Zuckerberg’s interest in this work was only “fleeting,” noting that more recently he has become a more vocal proponent of free speech and has pushed back on the idea that social media makes people more divided.

In a statement, a Facebook spokesperson said the company has funneled $2 million into outside research on polarization. “We’ve learned a lot since 2016 and are not the same company today,” the spokesperson said. “We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve.”

Facebook has long grappled with questions over how it should police speech and what type of content it should be amplifying. The company recently introduced the first members of its content oversight board, a group of outside experts that will help Facebook deal with its most controversial content decisions. The company has been criticized internally for controversial policies that allow politicians to spread unchecked misinformation.

But Zuckerberg has made it clear he doesn’t care much about “being liked,” especially when it comes to issues of free speech. The CEO said at the beginning of the year he intended to spend the next decade “communicating our principles” even when they are unpopular.

Update 5/26 8:13pm ET: Updated to add Facebook’s statement.