Online Child Exploitation
Virtual Reality: The disturbing new tool paedophiles are using to perpetrate child abuse
Feb 22, 2023
The NSPCC says the figures are “incredibly alarming, but reflect just the tip of the iceberg” of what children are experiencing online, as paedophiles look to exploit new technologies such as virtual reality.
Paedophiles are turning to virtual reality headsets to view child abuse images, according to a recent report by the National Society for the Prevention of Cruelty to Children (NSPCC). The data shows that the use of virtual reality technology was recorded in eight cases during 2021/22, marking the first time this technology has been specifically mentioned in crime reports.
These findings come as police recorded a staggering 30,925 offences involving obscene images of children, the highest number of cases ever recorded by forces in England and Wales. Shockingly, social media or gaming sites were mentioned in 9,888 cases, with Snapchat accounting for 4,293 cases, Facebook 1,361, Instagram 1,363, and WhatsApp 547.
The tip of the iceberg
The NSPCC has collated this data and is now calling for a number of amendments to the Online Safety Bill to prevent more children from being exposed to abuse. The figures are alarming, but they only represent the tip of the iceberg of what children are experiencing online, according to Sir Peter Wanless, chief executive of the NSPCC.
“We hear from young people who feel powerless and let down as online sexual abuse risks becoming normalised for a generation of children,” Sir Peter says. “By creating a child safety advocate that stands up for children and families, the government can ensure the Online Safety Bill systemically prevents abuse.”
In addition to advocating for a child safety advocate, the NSPCC is also calling for a change in the law that would make senior managers of social media sites criminally liable if children are exposed to abuse. “It would be inexcusable if in five years’ time we are still playing catch-up to pervasive abuse that has been allowed to proliferate on social media,” Sir Peter warns.
Response from social media companies
A Meta spokesperson says, “This horrific content is banned on our apps, and we report instances of child sexual exploitation to the National Center for Missing & Exploited Children (NCMEC). We lead the industry in the development and use of technology to prevent and remove this content, and we work with the police, child safety experts, and industry partners to tackle this societal issue.”
Similarly, Snapchat has dedicated teams around the world working closely with the police, experts, and industry partners to combat the sexual abuse of children. “If we proactively detect or are made aware of any sexual content exploiting minors, we immediately remove it, delete the account, and report the offender to authorities,” a Snapchat spokesperson says. “Snapchat has extra protections in place that make it difficult for younger users to be discovered and contacted by strangers.”
These efforts are commendable, but the continued rise in recorded child abuse offences underscores the need for urgent action. It is crucial that the government and social media companies work together to implement effective measures to protect children from online abuse.