Social media sites must not be 'safe havens' for illegal content, Ofcom told
Technology Secretary Peter Kyle has set out new measures for Ofcom to make sure the UK keeps pace with tech as it evolves to create a safer internet, especially for kids
by Sophie Huskisson · The Mirror

There must be “no safe havens” for illegal content such as disinformation, fraud or child sexual exploitation on online platforms, Ofcom has been told.
The media regulator has been ordered by ministers to prioritise five key areas as it starts enforcing new online safety laws from next spring. Among them is checking whether safety is baked into the design of social media platforms so that harms are caught before they occur. It must also ensure transparency and accountability are upheld in the tech industry and that the digital world is inclusive and resilient to harms like disinformation.
Ofcom must report back on whether the five priorities can be effectively addressed through the Online Safety Act. The new law will see safety duties placed on social media and other platforms for the first time. It will require sites to protect users, and in particular children, from harmful content, with large fines for those who fail to follow the new rules.
The Act is expected to face resistance from US tech firms following Donald Trump’s presidential election win. Twitter/X owner Elon Musk, who stripped back moderation on the social media site, has been appointed Mr Trump’s government efficiency tsar. Research published last year found his site had the highest proportion of disinformation among the major social media platforms.
Alongside setting out instructions for Ofcom, ministers have announced an official study into the impact of social media and smartphones on kids. It comes as the grieving parents of a British teenager who took his own life made a public plea to criminals this week to stop “terrorising” vulnerable people. Murray Dowey, 16, is thought to have ended his life after being tricked into sending intimate photos of himself and then blackmailed - a crime known as sextortion.
Ian Russell, whose daughter Molly took her own life at age 14 after being bombarded with harmful content online, said Wednesday's announcement “outlines a much needed course correction, vital for improved online safety, and to prevent the new regulation falling badly short of expectations”.
But the campaigner, who is chair of trustees at suicide prevention charity the Molly Rose Foundation, added: “While this lays down an important marker for Ofcom to be bolder, it is also abundantly clear that we need a new Online Safety Act to strengthen current structural deficiencies and focus minds on the importance of harm reduction.”
Maria Neophytou, director of strategy and knowledge at the NSPCC, said: “Tech companies must be transparent about the harm happening on their platforms. They should be disrupting ‘safe havens’ for offenders by tackling the hidden abuse taking place through private messaging.”
Technology Secretary Peter Kyle said: “From baking safety into social media sites from the outset, to increasing platform transparency, these priorities will allow us to monitor progress, collate evidence, innovate, and act where laws are coming up short. While the Online Safety Act sets the foundation of creating better experiences online, we must keep pace with technology as it evolves to create a safer internet, especially for children.”
Mr Kyle's Statement of Strategic Priorities will be laid before Parliament for approval before coming into force in spring next year.