
Most online grooming offences in UK committed on Facebook-owned apps

More than half of online grooming offences recorded under a law that made it illegal to send sexual messages to children were committed on Facebook-owned apps, figures reveal.

The data, obtained by the NSPCC under freedom of information laws, show that 10,019 offences of sexual communication with a child have been recorded since the legislation was introduced in April 2017.

In incidents where the means of communication was recorded, more than half were committed on Facebook-owned apps: Facebook, Facebook Messenger, Instagram and WhatsApp.

The revelations come at a difficult time for Facebook, which is facing calls to postpone plans to implement end-to-end encryption across all its messaging platforms. This would mean messages – including text and imagery – being digitally scrambled so that only the sender and receiver can make sense of them. The company has been told this could risk increased sexual exploitation of children.
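As a rough illustration of what "only the sender and receiver can make sense of them" means in practice, the sketch below uses the open-source PyNaCl library to encrypt a message with the recipient's public key so that only the holder of the matching private key can read it. This is an assumption made purely for illustration; it is not the protocol Facebook's apps actually use.

```python
# Minimal sketch of end-to-end encryption, assuming the PyNaCl library
# (pip install pynacl). Illustrative only; not Facebook's implementation.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only public keys are ever shared.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts using their private key and the receiver's public key.
sending_box = Box(sender_key, receiver_key.public_key)
ciphertext = sending_box.encrypt(b"hello")

# Anyone relaying the ciphertext, including the platform itself, sees only
# scrambled bytes. Only the receiver's private key can recover the message.
receiving_box = Box(receiver_key, sender_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"hello"
```

The concern raised with Facebook is that, once messages are scrambled in this way, the platform can no longer scan their contents for grooming or abuse material in transit.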

The NSPCC has called on the prime minister to urgently press ahead with new legislation in the online harms bill, which it says would help prevent offenders from using social media to target children for sexual abuse.

The NSPCC’s chief executive, Peter Wanless, who met Boris Johnson at a “hidden harms” summit last week, said: “Child abuse is an inconvenient truth for tech bosses, who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom our kids.

“Last week the prime minister signalled to me his determination to stand up to Silicon Valley and make the UK the world leader in online safety. He can do this by committing to an online harms bill that puts a legal duty of care on big tech to proactively identify and manage safety risks.

“Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm.”

The number of offences under the 2017 law is accelerating, the figures show, with 23% recorded in the last six months of the period covered, which ran from April 2017 to October 2019.

There were 5,784 offences where the means of communication was recorded. Facebook, Messenger, Instagram or WhatsApp was used in 3,203 instances (55% of the total where the means of communication was known), while Snapchat was used 1,060 times.

In February, the then digital minister, Matt Warman, promised to publish an online harms bill during the current parliamentary session following proposals set out in a white paper. The proposals included independent regulation of social networks with potential criminal sanctions if tech directors fail to keep children safe on their platforms.

However, a full response to the consultation on the white paper is not now expected until the end of the year and the NSPCC has expressed concerns that a regulator may not be in place until 2023.

The charity wants to see a bill within 18 months that sets out a duty of care requiring tech firms to make their sites safer for children.

Susie Hargreaves, the chief executive of the Internet Watch Foundation, the UK charity responsible for finding and removing online child sexual abuse material, said: “The length of time it is taking is leading to uncertainty for us all which stalls progress. We support calls for the government to speed up the timetable for the introduction of the online harms legislation.”

Facebook said it had integrated "PhotoDNA" technology that scanned all images and videos on Instagram and Facebook and flagged known child exploitative material so it could be removed. It also said it used technology that detected child nudity and previously unknown child exploitative content when it was uploaded, and that 99% of child nudity content was detected and removed in this way.
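The matching step Facebook describes works by comparing a fingerprint ("hash") of each uploaded image against a database of fingerprints of previously identified abuse material. PhotoDNA itself is proprietary, so the sketch below stands in the open-source imagehash library's perceptual hash; the database and function names are hypothetical and this is not Facebook's actual system.

```python
# Illustrative sketch of hash-based matching against a database of known images.
# PhotoDNA is proprietary; the open-source imagehash library's perceptual hash
# (pip install imagehash pillow) stands in for it here. Assumption for
# illustration only, not Facebook's implementation.
from PIL import Image
import imagehash

# Hypothetical database of hashes of previously identified material,
# populated elsewhere with imagehash.ImageHash values.
known_hashes = set()

def is_known_image(path: str, max_distance: int = 5) -> bool:
    """Return True if the image's perceptual hash is close to a known hash.

    Perceptual hashes change little under resizing or re-compression, so a
    small Hamming distance between hashes is treated as a match.
    """
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in known_hashes)
```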

A spokesperson added: “There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it. We have a content and security team of over 35,000 people investigating reports from our community and working to keep our platforms safe. Our teams also work closely with child protection experts and law enforcement, reporting content directly to specialists such as CEOP [the Child Exploitation and Online Protection Command, part of the UK’s National Crime Agency] and NCMEC [the US National Center for Missing and Exploited Children].”