Facebook to verify the identities of some user profiles behind viral posts

Lost in the news of the George Floyd protests against police brutality and racism in the U.S., Facebook last week quietly noted it will now require Facebook profiles with large followings in the U.S. to verify their identity. The company said that profiles with sizable audiences that also show a pattern of inauthentic behavior and whose posts rapidly go viral will be asked to verify their identity, or the distribution of their posts will be reduced.

If the profile's owner chooses not to verify their identity, or the ID provided does not match the linked Facebook account, the distribution of the profile's viral post will be reduced so fewer people will see it, the company explains.

In addition, if the profile that's posting is also a Facebook Page admin, they'll need to complete the Page Publishing Authorization and won't be able to post from their Page until the account is verified through Facebook's systems.

The company said the move to verify profiles is about transparency.

"We want people to feel confident that they understand who’s behind the content they’re seeing on Facebook and this is particularly important when it comes to content that’s reaching a lot of people," Facebook said in a Thursday announcement.

Identity verification is not new to Facebook, but the company's use of the process has increased in recent months, following the revelation that Russia-backed content reached as many as 126 million Americans on Facebook’s platform during and after the 2016 presidential election.

To address the issue, Facebook in August 2018 rolled out a new process that required Facebook Page managers to secure their accounts and verify their locations. In December 2018, the resulting "People Who Manage This Page" section rolled out to all Facebook Pages with a large audience. Also in 2018, Facebook began requiring ID verification from buyers of political "issue" ads on debated topics of national legislative importance.

Similar authentication and verification tools also rolled out to Instagram in 2018. And this April, both Facebook and Instagram began to display the location of the Facebook Page or Instagram account with a large audience on every post it shares. The company believes this transparency will allow users to better determine the reliability and authenticity of the accounts.

The timing of Facebook's announcement about profile verification is worth noting. It arrived on the same day that Trump signed an executive order taking direct aim at social media companies, targeting the legal protections they rely on to shield themselves from liability for user-created content. It was also the same day that Facebook CEO Mark Zuckerberg appeared on Fox News to explain why Facebook didn't follow Twitter in fact-checking Trump.

In a way, the change is an attempt by Facebook to showcase that it does in fact moderate its platform, by reducing the spread of viral posts from unverified sources. And that's something the company can later point to when questioned by regulators about how it's addressing problems with bots and other actors who conceal their identity while manipulating Facebook to spread viral content.