Tech firms face demands to stop illegal content going viral



Tech platforms could be forced to prevent illegal content from going viral and to limit people’s ability to send virtual gifts to, or record, a child’s livestream, under further online safety measures proposed by Ofcom.

The UK regulator published a consultation on Monday seeking views on further protections to keep citizens, particularly children, safer online.

They could also include requiring some larger platforms to assess whether they need to proactively detect terrorist material.

Oliver Griffiths, online safety group director at Ofcom, said its proposed measures seek to build on existing UK online safety rules but keep up with “constantly evolving” risks.

“We’re holding platforms to account and launching swift enforcement action where we have concerns,” he said.

“But technology and harms are constantly evolving, and we’re always looking at how we can make life safer online.”

The consultation highlighted three main areas in which Ofcom thinks more could be done:

  • stopping illegal content going viral
  • tackling harms at source
  • giving further protections to children

The BBC has approached TikTok, livestreaming platform Twitch and Meta – which owns Instagram, Facebook and Threads – for comment.

Ofcom’s proposals target a range of issues – from intimate image abuse to the danger of people witnessing physical harm on livestreams – and vary in the type or size of platform they could apply to.

For example, proposals that providers have a mechanism to let users report a livestream if its content “depicts the risk of imminent physical harm” would apply to all user-to-user sites that allow a single user to livestream to many, where there may be a risk of showing illegal activity.

Meanwhile, potential requirements for platforms to use proactive technology to detect content deemed harmful to children would apply only to the largest tech firms, which present higher risks of relevant harms.

The new proposals seek to expand on the measures already in place to improve online safety.

Some platforms have already taken steps to clamp down on features, such as livestreaming, that experts have warned may expose children to grooming.

In 2022, TikTok raised its minimum age for going live on the platform from 16 to 18 – shortly after a BBC investigation found hundreds of accounts going live from Syrian refugee camps with children begging for donations.

YouTube recently said it would raise its minimum age for users to livestream to 16, from 22 July.

But some groups say the regulator’s potential new requirements highlight core issues with the Online Safety Act – the UK’s sweeping rules that Ofcom is tasked with enforcing.

“Further measures are always welcome but they will not address the systemic weaknesses in the Online Safety Act,” said Ian Russell, chair of the Molly Rose Foundation – an organisation set up in memory of his 14-year-old daughter Molly Russell, who took her own life after viewing thousands of images promoting suicide and self-harm.

“As long as the focus is on sticking plasters not comprehensive solutions, regulation will fail to keep up with current levels of harm and major new suicide and self-harm threats,” Mr Russell said.

He added that Ofcom showed a “lack of ambition” in its approach to regulation.

“It’s time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head on by fully compelling companies to identify and fix all the risks posed by their platforms.”

Leanda Barrington-Leach, executive director of children’s rights charity 5Rights, said the regulator should require companies to “think more holistically” about safeguards for children, rather than mandate “incremental changes”.

“Children’s safety should be embedded into tech companies’ design of features and functionalities from the outset,” she said.

But the NSPCC’s Rani Govender said Ofcom’s move to require more safeguards for livestreaming “could make a real difference to protecting children in these high-risk spaces”.

The consultation is open until 20 October 2025 and Ofcom hopes to get feedback from service providers, civil society, law enforcement and members of the public.

Additional reporting by Chris Vallance
