UK Regulator Ofcom Investigates Telegram for Potential Failures in Preventing Child Sexual Abuse Material
The UK media regulator Ofcom has initiated an investigation into the messaging app Telegram due to evidence suggesting the presence and sharing of child sexual abuse material on the platform. Telegram denied the accusations and stated it has measures in place to address such content. The probe is part of broader enforcement under the UK's Online Safety Act.
The UK media regulator Ofcom announced on Tuesday that it has launched an investigation into Telegram over concerns that the messaging service may not be adequately preventing the sharing of child sexual abuse material (CSAM). Ofcom stated it gathered evidence indicating CSAM was present and being shared on the platform.
Under UK law, user-to-user services must implement systems to prevent users from encountering CSAM and other illegal content, with potential fines for non-compliance. Telegram stated it categorically denies Ofcom's accusations. The company said that since 2018, it has used detection algorithms and cooperated with non-governmental organizations to eliminate the public spread of CSAM on its platform.
Telegram added that it is surprised by the investigation and concerned it may relate to broader actions against platforms supporting freedom of speech and privacy. Suzanne Cater, director of enforcement at Ofcom, said child sexual exploitation and abuse causes harm to victims and that addressing it is a high priority for the regulator.
She noted progress in tackling CSAM on smaller services but stated the issue affects larger platforms as well. The children's charity NSPCC welcomed the investigation. Rani Govender, associate head of policy at NSPCC, said recent research showed around 100 child sexual abuse image offenses recorded by police daily, and the organization supports increased action to address the issue.
The Internet Watch Foundation (IWF), which identifies and removes CSAM online, also welcomed the probe. Emma Hardy, communications director at IWF, said the organization shares concerns about networks distributing CSAM on Telegram and that while some actions have been taken, more safeguards are needed, including in end-to-end encrypted chats.
Related Investigations
Ofcom said it initiated the Telegram probe after being contacted by the Canadian Centre for Child Protection regarding alleged CSAM on the app. The regulator also announced investigations into Teen Chat and Chat Avenue over potential grooming risks identified through work with child protection agencies.
Suzanne Cater said teen-focused chat services are being used by predators to groom children and that firms must take further protective measures or face consequences under the Online Safety Act. The firm behind Teen Chat stated it disagrees with Ofcom's position and highlighted its systems, including human moderation, content reporting, and chat filters, to prevent illegal activity.
The company added it is working with Ofcom but has nearly reached the limit of what can be expected from a small platform. The Online Safety Act's illegal content duties, effective since March 2025, require services like messaging apps to address priority illegal content, including CSAM, terrorism, grooming, and extreme pornography.
Ofcom has the authority to fine non-compliant companies up to £18 million or 10% of global revenues, whichever is higher. The regulator has issued fines to other providers for failures in illegal content handling or age checks. One file-sharing service contacted by Ofcom made improvements to its systems for dealing with illegal content, according to the regulator.
Story Timeline
- Tuesday: Ofcom announced investigations into Telegram, Teen Chat, and Chat Avenue over CSAM and grooming concerns. (Source: BBC News)
- March 2025: The Online Safety Act's illegal content duties took effect, requiring services to address priority illegal content. (Source: BBC News)
- 2018: Telegram began using detection algorithms and cooperating with NGOs to eliminate the public spread of CSAM. (Source: BBC News)
Potential Impact
1. Telegram may face fines if found non-compliant with UK online safety rules.
2. Other messaging platforms could see increased regulatory scrutiny and required system improvements.
3. Child protection organizations may report more CSAM incidents, leading to further investigations.
4. Small platforms like Teen Chat might implement additional moderation to avoid penalties.
5. Public awareness of online safety could rise, prompting user behavior changes on affected apps.
Related Stories
Major Publishers and Author Sue Meta Over Alleged Use of Copyrighted Works in Llama AI Training (insurancejournal.com)
Five major publishing houses and author Scott Turow filed a lawsuit against Meta in Manhattan federal court, accusing the company of pirating millions of copyrighted works to train its Llama AI models. The suit claims Meta CEO Mark Zuckerberg personally authorized the infringement…
Brockman Testifies on Heated 2017 Dispute with Musk Over OpenAI's For-Profit Shift in Federal Trial (naturalnews.com)
OpenAI President Greg Brockman detailed a heated 2017 confrontation with Elon Musk during testimony in the federal trial Musk v. Altman. He described Musk storming around a table and grabbing a painting after rejecting shared control proposals. The lawsuit seeks $150 billion in d…
Trump Administration Explores Government Review of AI Models Before Public Release
The Trump administration is discussing measures to vet advanced AI models for safety and security risks prior to their release, marking a potential shift from its previous hands-off stance on AI regulation. Officials are considering an executive order to establish a working group…