In response to the figures, the NSPCC has urged online regulator Ofcom to strengthen the Online Safety Act.
It said there is currently too much focus on acting after harm has taken place, rather than being proactive to ensure the design of social media platforms does not contribute to abuse.
The charity has also called on the Government to do more to disrupt child sexual abuse in private messages.
Sir Peter Wanless, NSPCC chief executive, said: “One year since the Online Safety Act became law and we are still waiting for tech companies to make their platforms safe for children.
“We need ambitious regulation by Ofcom who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.
“It is clear that much of this abuse is taking place in private messaging, which is why we also need the Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on the likes of Snapchat and WhatsApp.”
The minister for safeguarding and violence against women and girls, Jess Phillips, said: “Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims and the law is clear – the creation, possession and distribution of child sexual abuse images, and grooming a child is illegal.
“I met with law enforcement leads and the NCA (National Crime Agency) only last week to hear about the tremendous work they do to bring these offenders to justice.
“Social media companies have a responsibility to stop this vile abuse from happening on their platforms.
“Under the Online Safety Act they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.
“The shocking case involving Alexander McCartney, who alone groomed over 3,500 children, demonstrates more clearly than ever that they should act now and not wait for enforcement by the regulator.”
A Snapchat spokesperson said: “Any sexual exploitation of young people is horrific and illegal and we have zero tolerance for it on Snapchat.
“If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities.
“We have extra protections including in-app warnings to make it difficult for teens to be contacted by strangers, and our in-app Family Centre lets parents see who their teens are talking to, and who their friends are.”
An Ofcom spokesperson said: “From December, tech firms will be legally required to start taking action under the Online Safety Act, and they’ll have to do far more to protect children.
“Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children.
“We’re prepared to use the full extent of our enforcement powers against any companies that come up short when the time comes.”