The people fighting online child exploitation, one image at a time
Meet Susie Hargreaves and her team.
The Internet Watch Foundation (IWF) hunts down child sexual abuse images online and helps identify the children involved so that law enforcement can intervene. While the pandemic has driven a surge in child abuse imagery, CEO Susie Hargreaves and her team are fighting back with a new piece of tech.
Defenders of Digital episode one: Internet Watch Foundation
COVID-19 has fuelled a disturbing increase in child sex abuse material online. Our latest Defenders of Digital series begins by introducing Susie Hargreaves's team at Internet Watch Foundation (IWF) and explores their mission to make children safer. It also looks at how the pandemic has moved the goalposts and the new tech making a difference.
Where it all began
Formed in 1996 in response to a fast-growing number of online child abuse cases, IWF's 155 members include tech's biggest names, such as Microsoft and Google. They're united by the common goal to rid the internet of child sexual abuse images and videos.
Online child abuse is a growing issue
The pandemic has made the issue of online child sexual abuse material more acute. During lockdown in the UK alone, IWF says 300,000 people were looking at online child sexual abuse images at any one time. What's worse, the material is always changing.
Self-generated content: A dark twist
IWF has recently seen a worrying rise in self-generated sexual abuse material, chiefly involving girls aged 11 to 13. The victim is groomed or coerced into photographing or filming themselves, and the sexual predator then captures and distributes the material online. In the past year alone, the proportion of the content IWF removes that is self-generated has risen from 33 to 40 percent.
New tech making the difference
There are encouraging developments helping IWF with their work. Microsoft's PhotoDNA analyzes known child exploitation images, finds copies elsewhere on the internet, and reports them for removal. It helped IWF remove 132,700 web pages showing child sexual abuse images in 2019. How does it work?
PhotoDNA scours the web for matching images
First, PhotoDNA creates a unique digital fingerprint, called a 'hash', of a known child abuse image. It then compares that fingerprint against hashes of images found across the internet to spot copies, and reports any matches to the site's host so they can be taken down. It's a fast and ingenious way to shut down the spread of this material.
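To make the idea of image fingerprinting more concrete, here is a minimal sketch of perceptual hashing, the general technique that tools like PhotoDNA build on. It uses the open-source Python imagehash library rather than PhotoDNA itself, which is proprietary and available only to vetted organisations; the file names and the matching threshold are purely illustrative.

```python
# Sketch of perceptual image hashing (the general idea, NOT PhotoDNA itself).
# Requires: pip install pillow imagehash
import imagehash
from PIL import Image

# Fingerprint a known image. Unlike a cryptographic hash of the raw bytes,
# a perceptual hash survives resizing, re-compression and small edits.
known_hash = imagehash.phash(Image.open("known_image.jpg"))       # placeholder file
candidate_hash = imagehash.phash(Image.open("candidate_image.jpg"))  # placeholder file

# Subtracting two hashes gives the Hamming distance between fingerprints;
# a small distance means the images are almost certainly the same picture.
if known_hash - candidate_hash <= 5:  # threshold chosen for illustration only
    print("Likely match: flag for human review and report to the hosting site")
```

In a real system the candidate hash would be checked against a large database of fingerprints of known material, and any match would be reviewed by trained analysts before a takedown report is sent, which mirrors the workflow the article describes.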
Help stop child sexual exploitation: Report abuse images
Internet users who have stumbled across suspected child abuse images and reported them to IWF have helped start investigations that have led to many children in abusive situations receiving help. If you see an image or video you think may show child sexual exploitation, report it anonymously to IWF.