
Internet Watch Foundation (IWF) hunts down child sexual abuse images online and helps identify the children involved so that law enforcement can intervene. While the pandemic has driven a rise in child abuse imagery, CEO Susie Hargreaves and her team are fighting back with a new piece of tech.


Defenders of Digital episode one: Internet Watch Foundation 

COVID-19 has fuelled a disturbing increase in child sex abuse material online. Our latest Defenders of Digital series begins by introducing Susie Hargreaves's team at Internet Watch Foundation (IWF) and explores their mission to make children safer. It also looks at how the pandemic has moved the goalposts and the new tech making a difference.

Where it all began 

Formed in 1996 in response to a fast-growing number of online child abuse cases, IWF's 155 members include tech's biggest names, such as Microsoft and Google. They're united by the common goal to rid the internet of child sexual abuse images and videos.

Online child abuse is a growing issue

The pandemic has made the issue of online child sexual abuse material more acute. During lockdown in the UK alone, IWF says 300,000 people were looking at online child sexual abuse images at any one time. What's worse, the material is always changing.

Self-generated content: A dark twist

IWF has recently seen a worrying rise in self-generated sexual abuse material, chiefly among girls aged 11 to 13. The victim is groomed or coerced into photographing or filming themselves, and the sexual predator captures and distributes the material online. In the past year alone, the proportion of content IWF removes that is self-generated has risen from 33 to 40 percent.

New tech making the difference

There are encouraging developments helping IWF with their work. Microsoft's PhotoDNA analyzes known child exploitation images, finds copies elsewhere on the internet, and reports them for removal. It helped IWF remove 132,700 web pages showing child sexual abuse images in 2019. How does it work?

PhotoDNA scours the web for matching images

First, PhotoDNA creates a unique digital fingerprint of a known child abuse image, called a 'hash.' It compares that fingerprint against other hashes across the internet to find copies. It reports copies it finds to the site's host. It's a fast and ingenious way to shut down child exploitation.
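The hash-and-compare workflow described above can be sketched in a few lines. Note the heavy simplification: PhotoDNA's actual algorithm is proprietary and uses a robust perceptual hash that survives resizing and re-encoding, whereas this illustration uses an exact-match cryptographic hash purely to show the principle of fingerprinting an image and checking it against a database of known fingerprints. All function names and data here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Create a digital fingerprint ('hash') of an image.

    Simplified stand-in: SHA-256 only matches byte-identical copies,
    unlike PhotoDNA's perceptual hash, which tolerates minor edits.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abuse images.
known_hashes = {
    fingerprint(b"known-image-1"),
    fingerprint(b"known-image-2"),
}

def should_report(image_bytes: bytes) -> bool:
    """Return True if the image matches a known fingerprint
    and should be reported to the site's host for removal."""
    return fingerprint(image_bytes) in known_hashes
```

In this sketch, a crawler would call `should_report` on each image it encounters; a match triggers a removal report, so the original image never needs to be stored or viewed again, only its hash.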

Help stop child sexual exploitation: Report abuse images

Reports from internet users who stumble across suspected child abuse images have been instrumental to IWF's work, starting processes that have led to many children in abusive situations receiving help. If you see an image or video you think may show child sexual exploitation, report it anonymously to IWF.

Create Tomorrow

4 Star Wars-inspired films for May the 4th

Lives today are linked to the world of Star Wars more than many realize

Today is Star Wars Day! What better way to celebrate the iconic movies than by checking out these 4 documentaries that show how Star Wars technology is becoming a daily reality. These short films explore the amazing possibilities of this moment in robotics, cryonics and human augmentation.


Latest Stories

Create Tomorrow

Gamers against the clock: Speedrunning esports

Ultra-fast gaming and the sports of tomorrow, with Break the Record's Fredrik Lidholt

Speedrunning is an esport in which players compete to complete a game, whether Super Mario, Doom or any other title, as quickly as possible. This week we'll see which elite players can break the speed record playing Minecraft.
