Facts
Project: AI Roundtable
Where: Stockholm, Sweden
Contact: Susanne Drakborg, susanne.drakborg@childhood.org
Since 2003, Childhood has worked to strengthen the protection of children online by supporting research and methodology development. Today, anyone can download and distribute large amounts of child sexual abuse material, easily find ways to remain anonymous, and easily find and contact people on the other side of the globe. All of this is exploited by perpetrators who want to sexually abuse children.
Increase in distribution of documented child abuse
There has been an explosive increase in the distribution and sharing of images and films of documented child abuse. Perpetrators hide in closed forums on the darknet and on encrypted platforms. Detecting and accessing these environments requires advanced new technology, including Artificial Intelligence (AI). AI makes it possible to track patterns, go through massive amounts of material in a short time and recognize potential victims and perpetrators in detected images. Childhood is working intensively in this field.
AI solutions are not fully used to prevent abuse
Today, there are AI tools that search the internet for already identified images of child sexual abuse and that help identify victims and perpetrators. However, these efforts are neither sufficient nor coordinated. It is difficult for the police, civil society and researchers to stay up to date and understand how the different tools work together. Additionally, research and methodology development in the AI field have not been used sufficiently to prevent child sexual abuse. This is something we at Childhood seek to change.
Encourage AI-related initiatives
In 2019, Childhood convened an AI roundtable where 60 global experts discussed how AI could be used to protect children online. The conclusions from that meeting guide our continued work to encourage, coordinate and intensify AI-related initiatives to combat child sexual abuse. One result of the AI roundtable is the LIBRA project.