Source: IWF (Internet Watch Foundation), published on this site Friday 18 June 2021 by Jill Powell
A specialised new team will take ‘digital fingerprints’ of millions of images so companies and organisations around the world can spot them and have them removed.
A “vital” new taskforce will assess and grade millions of the most severe images and videos of child rape and sexual torture as analysts see record numbers of reports of illegal content.
The new team has been set up by the Internet Watch Foundation (IWF), the UK-based charity working internationally to identify and remove images and videos of child sexual abuse from the internet.
The analysts in this team will view, hash (create a digital fingerprint), and classify two million Category A and B images from the UK Government’s Child Abuse Image Database (CAID).
They will then distribute the hashes globally to tech companies, allowing them to be blocked or removed should anyone attempt to share them anywhere in the world.
Category A images involve penetrative sexual activity, sexual activity with an animal, or sadism, while Category B images involve non-penetrative sexual activity.
The IWF is the only non-law enforcement body with access to CAID. The work will boost the UK’s contribution to global efforts to stop the distribution of child sexual abuse images on the internet and help to keep the internet a safer place for all.
Hashing an image or video is a process which produces a unique code like a “digital fingerprint” so that it can be recognised and dealt with quickly by the IWF or its partners in the future.
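As a rough illustration of the idea, the sketch below computes an exact-match fingerprint of a file using a standard cryptographic hash. This is a simplified, hypothetical example, not the IWF's actual system: exact-byte hashing like this only catches identical copies, and production databases such as CAID also rely on perceptual hashes (such as PhotoDNA) that survive resizing or re-encoding.

```python
import hashlib

def hash_image(path: str) -> str:
    """Compute a SHA-256 'digital fingerprint' of a file.

    Illustrative only: a byte-exact hash identifies identical copies;
    real-world systems also use perceptual hashing to match altered
    versions of the same image.
    """
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()

# A hypothetical blocklist of hashes of known illegal images, as might
# be distributed to tech companies.
known_hashes: set[str] = set()

def is_known(path: str) -> bool:
    """Check an uploaded file against the blocklist of known hashes."""
    return hash_image(path) in known_hashes
```

A platform receiving an upload would hash it and check membership in the distributed hash set, blocking or removing the file on a match, without ever needing to store or view the original imagery.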
The work will enable tech companies to take swift action to prevent the spread of this abusive material, giving peace of mind to victims, who often live with the knowledge that footage of their abuse may still be circulating among criminals around the world.