In the UK, there has been a significant rise in reported cases of online sextortion among children, prompting advocates to call on technology companies to enhance their measures against this crime.
The Report Remove service, which enables minors to report intimate images or videos of themselves that may have been shared online, recorded 394 blackmail reports linked to sextortion in 2025, a 34% increase on 2024.
Sextortion refers to the coercion of individuals into sending explicit images or videos to predators, who subsequently threaten to release these materials unless the victims provide money or additional intimate content. Notably, 98% of the reported victims were boys aged between 14 and 17.
This form of exploitation has been linked to tragic incidents, including the suicides of several British teenagers who took their lives following extortion threats. The parents of one such victim, 16-year-old Murray Dowey from Dunblane, have initiated legal action against Meta, the parent company of Facebook and Instagram, for allegedly lacking adequate protective measures.
The Molly Rose Foundation (MRF), a charity focused on internet safety, emphasized that technology firms need to take more decisive action to counteract blackmail attempts and pressed the government to enforce stricter regulations on social media platforms, where grooming often begins. MRF also suggested that Apple and Google should implement technologies to detect nudity on their devices.
The Internet Watch Foundation (IWF), which co-manages the Report Remove service alongside the NSPCC’s Childline, echoed the call for nudity detection capabilities on devices.
“It is evident that if companies are unwilling to act independently, government intervention is necessary to ensure they do,” stated Hannah Swirsky, head of policy at the IWF.
The NSPCC remarked that the statistics indicate a need for mandatory anti-nudity detection features on devices.
Ros Dowey, Murray’s mother, expressed her dismay at the latest figures, stating, “It is shocking and disappointing. What will it take for social media companies to assume responsibility when they are fully aware that crimes are occurring on their platforms but still fail to implement necessary safety measures?”
Murray’s father, Mark, noted that they would continue to advocate for awareness around the issue until their case against Meta is resolved in court. “It seems public sentiment is shifting against social media platforms, yet these companies continue to enable harm, despite their claims of enhancing safety measures. If those measures were effective, we would see a decline in these figures.”
Sextortion perpetrators operate across multiple platforms, despite countermeasures taken by social media companies, which include sharing alerts about threats to child safety through a scheme coordinated by the Tech Coalition industry group.
When a child utilizes the Report Remove service to upload an explicit image, the service converts the image into a digital fingerprint, or “hash,” which is then shared with major tech companies to facilitate its removal or prevent future uploads. Importantly, the actual image is not shared with these companies.
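The fingerprinting step described above can be sketched in a few lines. In practice, Report Remove and its partner platforms rely on perceptual hash-matching technologies (such as Microsoft's PhotoDNA), which also match resized or re-encoded copies of an image; the sketch below uses a plain SHA-256 cryptographic hash purely as an illustration of the principle that only the digest, never the image itself, is shared.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that identifies a file without revealing it.

    Illustrative only: real systems use perceptual hashes (e.g. PhotoDNA),
    which survive resizing and re-encoding; SHA-256 matches only exact
    byte-for-byte duplicates.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in for image bytes; only the resulting hash would be shared
# with platforms so they can block re-uploads of the same file.
sample = b"\x89PNG\r\n\x1a\n" + b"\x00" * 64
print(fingerprint(sample))
```

The same input always yields the same digest, which is what lets a platform recognise and block a re-upload without ever receiving or storing the original image.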
Kerry Smith, the chief executive of the IWF, commented on the profound implications of the recent sextortion data on children’s lives, noting that many victims might remain unaware of the Report Remove service and have not sought help. “Criminals are casting a wide net and targeting young people with severe threats,” she stated. “They employ emotional manipulation and escalate their threats quickly once they receive intimate images.”
Shaun Friel, director of Childline, pointed out a positive aspect of the rising numbers: increased awareness among children about the Report Remove platform and their willingness to utilize it. This service empowers young people to “regain control,” he said.
The Report Remove data for 2025 showed a 66% rise in minors seeking assistance, with 1,175 of the 1,894 reports involving content classified as child sexual abuse material. The number of videos reported rose 27% to 509.
A spokesperson for Google affirmed the company’s commitment to combating sextortion, highlighting investments in “industry-leading” protective measures. “Addressing this issue requires ongoing efforts, and our strategies are constantly adapting based on input from survivors and experts,” they stated.
Apple declined to comment but operates a "communication safety" system that alerts minors when they receive or attempt to send potentially explicit content. Google Messages, the default messaging app on many Android devices, offers an option to enable warnings for sensitive content.
Meta did not comment on the situation but has introduced a nudity protection feature that blurs explicit images sent via direct messages. The company is also contesting the lawsuit brought by the Dowey family in the UK and other families in the US.
Jess Phillips, the minister for safeguarding and violence against women and girls, acknowledged the distressing reality that child sexual abuse and sextortion have led to suicides, expressing her commitment to enhancing online safety for children.
Last year, it was reported that the UK government was seeking to compel tech companies to block explicit images and was considering making this a requirement for devices sold in the country.
In the UK, individuals in need of support can contact the youth suicide charity Papyrus at 0800 068 4141 or via email at pat@papyrus-uk.org. The Samaritans can be reached in the UK and Ireland at freephone 116 123 or through email at jo@samaritans.org or jo@samaritans.ie. In the US, assistance is available through the 988 Suicide & Crisis Lifeline by calling 988 or through chat support.