4 ways Ofcom wants to change the internet for children
Ofcom has published its proposals on how social media platforms will be expected to protect children in future.
As part of the Online Safety Act, the regulator says algorithms must stop recommending harmful content to children. Robust age-checking measures should also be put in place, Ofcom said. This will, it says, mean a reduction in pornography and other harmful content being accessed by children. It will also mean potential changes to how children can use group chat apps like WhatsApp and Snapchat.
Social media platforms face a series of new legal responsibilities under the act. Sites that can be accessed by children will be required to assess the risks their platform poses to those younger users and then put in place measures to mitigate them. Large fines are among the possible penalties for those found to be in breach.
Ofcom is the new regulator for the sector and has set out how platforms should handle different types of content ahead of the new rules coming into full force, which is expected towards the end of this year. The latest codes include more than 40 practical measures which Ofcom says will demand a step change from tech firms, compelling safer design and operating practices from the biggest sites.
Here are four ways Ofcom wants the internet to change for children:
1. Reduce access to pornography
Ofcom has outlined proposals to make it more difficult for children to access pornography. Currently, a quick search on some social media platforms can throw up pictures and videos without the need for the user to provide details of their age.
Under Ofcom’s plans, robust age checks would be expected, so that services know which of their users are children. These could include a rigorous ID check to determine the exact age of a user.
2. Limit access to other harmful content
Harmful content that is not banned outright should also be subject to “highly effective age checks” to prevent children from seeing it.
Algorithms that provide personalised or ‘recommended’ content should also be adjusted so that any potentially harmful content would be filtered out of the feed of any child user, Ofcom said.
Platforms would also be expected to put in place safe search settings, under their moderation guidelines, which child users would not be able to turn off. These steps should reduce children’s exposure to harmful content, which includes, as well as pornography, content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders. Content that incites hatred, bullying, violence or dangerous stunts would also fall under this category.
3. Group chat settings
Apps like WhatsApp and Snapchat allow young people to engage in group chats with friends, or anyone else who may be added. However, Ofcom says that children should in future have to give their consent instead of being added to any chat thread automatically.
Platforms should give users the option not only to decline group invites but also to block and mute user accounts or disable comments on their own posts. Ofcom says this is intended to prevent instances of online bullying, which can often spill over into the real world.
4. Complaints process
Many platforms already have reporting and complaint functions available to all users, but Ofcom says its research suggests that children do not find these easy to use or transparent. As a result, children are discouraged from making complaints about content they may have seen.
New measures put forward by Ofcom include improving the accessibility of reporting forms to offer “clear, straightforward, and accessible complaints procedures”. Platforms will have to respond to complaints promptly and provide information about the resolution, giving children assurance that their complaints are confidential and are seen and responded to.
Marking their own homework?
The Ofcom chief executive, Dame Melanie Dawes, claims the measures “go way beyond current industry standards”. She said the regulator “won’t hesitate to use our full range of enforcement powers to hold platforms to account”.
However, while charities like the NSPCC have cautiously welcomed the proposals, there has been some criticism that they essentially allow big tech companies to “mark their own homework”. Speaking on BBC Radio 4’s Today programme, Dawes denied this was the case, insisting Ofcom would do the marking itself, transparently.
Child online safety campaigner Ian Russell, the father of 14-year-old Molly Russell, who took her own life in November 2017 after viewing harmful material on social media, said more still needed to be done to protect young people from online harms. In his role as chair of the online safety charity the Molly Rose Foundation, Mr Russell said: “Ofcom’s task was to seize the moment and propose bold and decisive measures that can protect children from widespread but inherently preventable harm.
“The regulator has proposed some important and welcome measures, but its overall set of proposals need to be more ambitious to prevent children encountering harmful content that cost Molly’s life.”
He added that the next prime minister should commit to “finish the job and strengthen the Online Safety Act to give children and families the protection they deserve”.
Lisa Kenevan, who believes her 13-year-old son Isaac died after taking part in a dangerous TikTok challenge, told BBC Breakfast that Ofcom still does not have the power to look at individual complaints on social media platforms.
Hollie Dance, whose son Archie Battersbee died at the age of 12 in 2022 during a prank that she believed was part of a TikTok challenge, questioned whether Ofcom could ensure platforms were imposing rigorous age checks. Dawes admitted that the platforms “are not doing enough” but insisted Ofcom would hold to account any that do not follow age restriction rules.