OUR ORIGINAL SERIES

Latest stories of our network

The Internet Watch Foundation (IWF) hunts down child sexual abuse images online and helps identify the children involved so that law enforcement can intervene. While the pandemic has triggered a surge in child abuse imagery, CEO Susie Hargreaves and her team are fighting back with a new piece of tech.


Defenders of Digital episode one: Internet Watch Foundation 

COVID-19 has fuelled a disturbing increase in child sex abuse material online. Our latest Defenders of Digital series begins by introducing Susie Hargreaves's team at Internet Watch Foundation (IWF) and explores their mission to make children safer. It also looks at how the pandemic has moved the goalposts and the new tech making a difference.

Where it all began 

Formed in 1996 in response to a fast-growing number of online child abuse cases, IWF's 155 members include tech's biggest names, such as Microsoft and Google. They're united by the common goal to rid the internet of child sexual abuse images and videos.

Online child abuse is a growing issue

The pandemic has made the issue of online child sexual abuse material more acute. During lockdown in the UK alone, IWF says 300,000 people were looking at online child sexual abuse images at any one time. What's worse, the material is always changing.

Self-generated content: A dark twist

IWF has recently seen a worrying rise in self-generated sexual abuse material, chiefly among girls aged 11 to 13. The victim is groomed or coerced into photographing or filming themselves, and the sexual predator captures and distributes the material online. In the past year alone, the proportion of the online content IWF removes that is self-generated has risen from 33 to 40 percent.

New tech making the difference

There are encouraging developments helping IWF with their work. Microsoft's PhotoDNA analyzes known child exploitation images, finds copies elsewhere on the internet, and reports them for removal. It helped IWF remove 132,700 web pages showing child sexual abuse images in 2019. How does it work?

PhotoDNA scours the web for matching images

First, PhotoDNA creates a unique digital fingerprint of a known child abuse image, called a 'hash.' It compares that fingerprint against other hashes across the internet to find copies. It reports copies it finds to the site's host. It's a fast and ingenious way to shut down child exploitation.
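PhotoDNA's actual algorithm is proprietary, but the fingerprint-and-compare workflow described above can be illustrated with a toy perceptual hash. This sketch uses a simple "average hash" (one bit per pixel, set when the pixel is brighter than the image's mean) and a Hamming-distance comparison; the images, threshold, and hash scheme are all illustrative assumptions, not PhotoDNA itself.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Number of bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

# A "known" 4x4 grayscale image and a slightly altered copy of it
# (small pixel-value changes, as re-encoding or resizing might cause).
known = [[10, 200, 30, 40], [50, 60, 220, 80],
         [90, 100, 110, 240], [130, 140, 150, 160]]
copy_ = [[12, 198, 31, 41], [49, 61, 221, 79],
         [91, 99, 112, 238], [129, 141, 151, 161]]

h_known = average_hash(known)
h_copy = average_hash(copy_)

# A small Hamming distance means the image is likely a copy of the
# known image, even though the files are not byte-for-byte identical.
is_match = hamming_distance(h_known, h_copy) <= 2
```

The key property, which a plain cryptographic hash lacks, is that visually similar images produce similar fingerprints, so copies survive minor edits and re-encoding and can still be matched against a database of known hashes.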

Help stop child sexual exploitation: Report abuse images

Internet users who stumble across suspected child abuse images and report them to IWF have been instrumental in getting help to many children in abusive situations. If you see an image or video you think may show child sexual exploitation, report it anonymously to IWF.



Latest Stories

Create Tomorrow

The next industrial revolution looks promising

Smart factories could fight climate change and save lives

When COVID-19 hit, manufacturers worldwide raced to build as many ventilators as possible for patients. But factories around the globe couldn't fulfill the demand fast enough. With conventional and automated manufacturing processes still not as efficient as we need, could autonomous factories be the next industrial revolution we've been waiting for?
