TikTok and Bumble join anti-revenge-porn initiative

Meta is also a partner in the fight against the spread of non-consensual intimate images.

Image: Dado Ruvic / Reuters

TikTok and Bumble are the latest tech companies to join an initiative aimed at reducing the spread of revenge porn: intimate images and videos shared without the subject's consent. They've partnered with StopNCII.org (Stop Non-Consensual Intimate Image Abuse), which hosts a tool developed in partnership with Meta. TikTok, Bumble, Facebook and Instagram will detect and block any images whose hashes match entries in StopNCII.org's bank of hashes.

The website enables people to create hashes (unique digital fingerprints) of the images and videos in question. The hashing takes place entirely on their own device: to protect users' privacy, the actual files are never uploaded to StopNCII.org, only the hash, a unique string of letters and numbers.
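
StopNCII.org doesn't publish its exact hashing code, but the general idea of on-device fingerprinting can be sketched in a few lines of Python. The example below uses the open-source Pillow and imagehash libraries as a stand-in for whatever algorithm StopNCII.org actually uses; the image is read locally, and only the resulting hash string would ever be submitted.

```python
# Minimal sketch of client-side image fingerprinting.
# Assumes the Pillow and imagehash packages are installed; StopNCII.org's
# real hashing algorithm and submission API are not shown here.
from PIL import Image
import imagehash

def fingerprint_image(path: str) -> str:
    """Compute a perceptual hash of a local image.

    Only this short hex string would be transmitted;
    the image itself never leaves the user's device.
    """
    with Image.open(path) as img:
        return str(imagehash.phash(img))  # 64-bit perceptual hash as hex

if __name__ == "__main__":
    h = fingerprint_image("private_photo.jpg")  # hypothetical local file
    print(f"Hash to submit: {h}")
```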

Hashes submitted to StopNCII.org are shared with the initiative's partners. If an image or video uploaded to TikTok, Bumble, Facebook or Instagram matches one of those hashes and "meets partner policy requirements," the file will be sent to the platform's moderation team. If moderators find that the image breaks their platform's rules, they'll remove it, and the other partner platforms will block it from being shared as well.
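
The platforms' matching step is likewise undocumented in detail, but a rough sketch (under the same stand-in hashing assumption as above) would compare each upload's hash against the shared bank. Perceptual hashes of near-identical images differ by only a few bits, so the comparison below uses a Hamming-distance threshold rather than an exact lookup; the bank contents, the threshold and the moderation hook are all illustrative assumptions.

```python
# Illustrative sketch of hash matching in an upload pipeline.
# The hash bank, distance threshold and moderation hook are assumptions,
# not any partner platform's actual implementation.
from PIL import Image
import imagehash

HASH_BANK = {imagehash.hex_to_hash("c31b3d1e0f8e9a70")}  # example banked hash
MATCH_THRESHOLD = 8  # max differing bits still treated as a match (assumed)

def matches_bank(path: str) -> bool:
    """Return True if the uploaded image's hash matches a banked hash."""
    with Image.open(path) as img:
        upload_hash = imagehash.phash(img)
    # Subtracting two ImageHash values yields their Hamming distance,
    # so near-duplicates match even after minor edits or recompression.
    return any(upload_hash - banked <= MATCH_THRESHOLD for banked in HASH_BANK)

def queue_for_moderation(path: str) -> None:
    # Hypothetical hook: a real system would route the file to human review,
    # where moderators decide whether it breaks the platform's rules.
    print(f"{path}: matched a banked hash, sending to moderators")

if __name__ == "__main__":
    upload = "incoming_upload.jpg"  # hypothetical uploaded file
    if matches_bank(upload):
        queue_for_moderation(upload)
```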

The tool has been live for a year, and more than 12,000 people have created cases to prevent intimate videos and images from being shared without consent. Users have created more than 40,000 hashes to date. As Bloomberg notes, Meta partnered with SWGfL, the UK nonprofit behind the Revenge Porn Helpline, to develop StopNCII.org. SWGfL hopes that many more platforms will sign up.

The initiative builds on a pilot Meta (then known as Facebook) started in Australia in 2017 that asked users to upload revenge porn images to a Messenger chat with themselves. Meta promised to delete the images after hashing them, but the approach raised obvious privacy concerns.

TikTok and Bumble are joining the initiative amid increasing regulatory scrutiny of the former and a broader crackdown on revenge porn. The UK, for instance, plans to force platforms that host user-generated content to take down non-consensual intimate images more swiftly, as laid out in the government's Online Safety Bill.