Almost 1,000 online grooming crimes have been recorded by police in Northern Ireland while children have been waiting for online safety laws, new figures published by the NSPCC reveal today.
Data from the Police Service of Northern Ireland (PSNI) shows 198 Sexual Communication with a Child offences were recorded last year (2022/23) – up 141% since 2017/18 when the offence came into force and the highest on record.
The new research shows that across the UK more than 5,500 offences took place against primary school children, with under-12s making up a quarter of known victims.
The new analysis of the scale and nature of child sexual abuse taking place on social media comes ahead of MPs and Lords making final decisions on the Online Safety Bill next month.
The NSPCC first called for social media regulation to protect children from sexual abuse in 2017 and has been campaigning for robust legislation ever since.
The charity said the number of offences and victims is likely to be far higher than those known to police. In response, it is urging politicians on all sides to support the Bill in its final stages and pass this vital legislation.
The UK-wide figures also reveal:
- The stark reality of sexual violence faced by girls on social media: where the gender was known, more than four in five (83%) of grooming cases over the six years took place against girls.
- Where the means of communication was known, Snapchat was used in more than a quarter (26%) of instances over the six years, while Meta-owned products were used in almost half (47%).
- 150 different apps, games and websites were used to target children, according to the police data analysed since 2017/18.
Sophia* was 15 when she was groomed by a man posing as a boy who she was chatting to on social media.
Sophia, now aged 19, said: “He started getting angry if I didn’t reply quick enough or when I wasn’t saying exactly what he wanted to hear. It felt strange, how he was being, so I tried breaking off the conversation with him on Yubo. He just found me on Instagram and moved to messaging me directly there.
“He had started asking for selfies of me, then asking me to take my clothes off and send photos. When he threatened me and started being angry, I was petrified. He used the images to control me. I wasn’t even allowed to use the toilet without his permission. I was afraid to tell anyone because of the photos and his threats. He threatened to share the images of me with friends and family he’d found through my social media if I stopped replying.”
A draft Online Safety Bill was published over two years ago, but regulation was first promised by the Government in 2018, following the NSPCC’s call for action and the launch of its Wild West Web campaign.
The charity has been campaigning for strong legislation ever since, working closely with survivors, Government, Parliamentarians, and other civil society groups to ensure it effectively tackles the way social media and gaming sites contribute to child sexual abuse.
The legislation will mean tech companies have a legal duty of care for young users and must assess their products for child abuse risks and put mitigations in place to protect children.
It will give the regulator Ofcom powers to address significant abuse taking place in private messaging and require companies to put safeguards in place to identify and disrupt abuse in end-to-end encrypted environments.
The NSPCC said these measures are vital to effectively protect children from the most insidious abuse and recent polling shows they are backed by more than seven in ten voters.
Caroline Cunningham, NSPCC Northern Ireland’s Acting Policy and Public Affairs Manager, said: “Our children’s online safety is under increasing threat and this is profoundly worrying. We need our Government to prioritise this issue and ensure that the Online Safety Strategy and Action Plan commissioned in 2015 continues to be fit for purpose.
“We also need to ensure that the Online Safety Bill progressing in Westminster is passed, to give children the protections they need to prevent abuse from happening in the first place.”
As well as winning the commitment to legislate, the NSPCC has helped shape significant gains for children in the Online Safety Bill as it has passed through Parliament, including:
- Senior tech bosses will be held criminally liable for significant failures that put children at risk of sexual abuse and other harm.
- Girls will be given specific protections as Ofcom will produce guidance on tackling Violence Against Women and Girls for companies to follow.
- Companies will have to crack down on so-called tribute pages and breadcrumbing, which use legal but often stolen images of children, and child accounts, to form networks of offenders and facilitate child sexual abuse.
- Sites will have to consider how grooming pathways travel across various social media apps and games and work together to prevent abuse spreading across different platforms.
The NSPCC is still seeking assurances that the legislation will effectively regulate AI and immersive technology and wants an online child safety advocacy body specifically to speak with and for children as part of the day-to-day regulatory regime. They argue that this will help spot emerging risks and fight for the interests and safety of children before tragedies arise.
The charity is asking campaigners to reach out to MPs with personal messages about why they should act to make the online world safer for children and pass a robust Online Safety Bill in the coming weeks.
*Name has been changed to protect identity.