Facebook has once again found itself stamping out false information that could foster violence, albeit not on a topic where that's typically a concern. Policy communications lead Andy Stone has revealed (via Gizmodo) that Facebook removed false claims that extremists had started Oregon's wildfires. Fire and police departments were being forced to divert resources away from firefighting to address the misinformation, Stone said.
A spokesperson talking to Gizmodo added that the social media giant was applying “strong” warnings to existing posts and limiting their distribution.
As with the company's response to a Wisconsin militia group in the wake of a deadly shooting, though, there are concerns that Facebook didn't move quickly enough. Law enforcement warned about wildfire misinformation on September 10th, roughly two days before the crackdown and around the same time that a key Law Enforcement Today article spurred the false claims. While it's unclear how much of an impact the delay made, it theoretically allowed the situation to escalate.
The article had over 70,000 shares and 360,000 interactions as of the evening of September 12th, CrowdTangle data suggested.
Whether or not Facebook acted reasonably quickly, the crackdown illustrates the challenges the company faces in dealing with misinformation. It now has to contend with potentially deadly falsehoods across a wide variety of subjects, not just hate speech or conspiracy theories. And in the current climate, its work isn't about to get any easier.
This is consistent with our past efforts to remove content that could lead to imminent harm given the possible risk to human life as the fires rage on. (2/2)— Andy Stone (@andymstone) September 12, 2020