Sexual grooming crimes in Yorkshire and the Humber have risen by nearly 60 per cent, with record numbers of children targeted on Instagram.
A total of 473 offences of sexual communication with a child were recorded in the region in the year to April 2019 compared with 297 in the previous year, police figures show.
Across England and Wales there were 4,373 offences of sexual communication with a child recorded in the year to April 2019 compared with 3,217 in the previous year. The offence came into force in April 2017, following a campaign by children’s charity the NSPCC, which requested the police data under the Freedom of Information Act.
Where the age of the child was known, more than one in five offences were against children aged 11 or under.
The NSPCC said it was crucial that Boris Johnson’s Government make a public commitment to draw up online harms laws and implement robust regulation forcing tech firms to protect children as a matter of urgency.
Peter Wanless, NSPCC Chief Executive, said: “It’s now clearer than ever that the Government has no time to lose in getting tough on these tech firms.
“Despite the huge amount of pressure that social networks have come under to put basic protections in place, children are being groomed and abused on their platforms every single day. These figures are yet more evidence that social networks simply won’t act unless they are forced to by law. The Government needs to stand firm and bring in regulation without delay.”
Police figures show instances of grooming on Instagram, which is owned by Facebook, have doubled.
Overall in the last two years, Facebook-owned apps (Facebook, Messenger, Instagram, WhatsApp) and Snapchat were used in nearly 75 per cent of the instances where police in Yorkshire and the Humber recorded and provided the communication method.
The Government has indicated it will publish a draft Online Harms Bill early next year, following the NSPCC’s Wild West Web campaign. The proposals would introduce independent regulation of social networks, with tough sanctions if they fail to keep children safe on their platforms.
A Facebook spokesperson said: “There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it. We also investigate reports from the community with a content and security team of over 30,000 people who respond to reports 24/7.”
The company added that 99 per cent of child nudity content is detected by its technology and removed automatically.
The Government has been approached for comment.