Internet Watch Foundation (IWF)

Non-profit Organizations

Cambridge, Cambridgeshire 20,560 followers

Leading tech charity working globally to eliminate child sexual abuse images & videos from the internet.

About us

Protecting children is at the heart of everything we do. For over 25 years, since the early days of the internet, our job has been to help child victims of sexual abuse by hunting down and removing any online record of the abuse.

How we do this:

Tech-for-good. We build cutting-edge tech tools designed to make it easier to identify and remove online images and videos of child sexual abuse. In short, tech to protect kids.

Our team of human analysts. Tech companies and law enforcement worldwide trust the assessments, experience and knowledge of our extraordinary team of people.

Working together. With international partners in government, law enforcement, reporting hotlines, charities and the tech community, we work to stop illegal images of children being circulated again and again. We share vital information that could lead to the rescue of a child from terrible abuse.

IWF Hotline. This gives people a safe and anonymous place to report suspected online images and videos. When we started in 1996, 18 per cent of child sexual abuse imagery online was hosted in the UK. Today, thanks to our Hotline, it’s less than one per cent. We’re proud of that.

Website
https://www.iwf.org.uk/
Industry
Non-profit Organizations
Company size
51-200 employees
Headquarters
Cambridge, Cambridgeshire
Type
Nonprofit
Founded
1996
Specialties
tech, AI, machine learning, child protection, and online protection

Locations

  • Primary

    Vision Park, Chivers Way

    Histon

    Cambridge, Cambridgeshire CB24 9ZR, GB

Updates

  • Teachers… talking to your students about nudes can be awkward, even nerve-wracking. But it’s important that we talk frankly with young people about the dangers of sharing images online – with people they know, and people they don’t. We have resources available for use with secondary-aged students and as part of teacher training programmes. Download them at https://lnkd.in/dKGPr-VB

  • Children are reporting online sexual extortion attempts in record numbers in the UK: last year we received 394 reports from under-18s of blackmail attempts after sending sexual images to predators via Report Remove – 34% more than in 2024. Report Remove is a crucial self-reporting tool run in partnership with NSPCC's Childline that allows children to flag intimate images or videos of themselves that have appeared, or could appear, online, so the content can be removed. When a child contacts Report Remove and uploads an intimate image of themselves to the service, the image is turned into a “hash”, or digital fingerprint, which is shared with leading tech platforms so they can take the image down or prevent it from being uploaded. Report Remove does not share the image itself with any tech company. The true impact of the latest sexual extortion numbers on children’s lives is “difficult to fathom”, says our CEO Kerry Smith, as some victims may not be aware of the Report Remove service and have not come forward. “Criminals are casting their nets wide and are able to corner young people with the most violent and terrifying threats,” said Kerry. “They employ emotional manipulation and use intimidating, aggressive language and threats that escalate rapidly after nudes are taken.” We support calls to make anti-nudity detection technology mandatory on devices. “It is clear to us that if companies won’t do this by themselves, the government must step in to make sure they do,” said our Head of Policy Hannah Swirsky. https://lnkd.in/eSpbpnfc

  • Every day, our analysts see unidentified child victims. While they work hard to make sure the imagery cannot spread online, they do not always have enough information to identify the children being abused. Here is how one of our world-class analysts, through diligence, determination, and a bit of luck, helped give a name to a previously unknown victim of child sexual abuse – getting her the recognition and support she is entitled to, and giving her the peace of mind that the IWF is working to remove images and videos of her sexual abuse. https://lnkd.in/equyQ-3e

  • Amy was just three years old when IWF became aware of imagery showing her abuse. Even after she was rescued and safeguarded, that abuse continued to be shared online. In one three-month period alone, IWF assessed and hashed an image of Amy on average three times a day. Her story is a stark reminder that every image is a real child, and every removal matters. Learn more and support our vital work to find, remove and disrupt child sexual abuse material online at https://lnkd.in/d5rNBn3m.

  • 🔎 There is growing concern for how AI can be misused to create and share child sexual abuse material (CSAM), referred to as AI-CSAM. In partnership with National Crime Agency (NCA) CEOP Education we’ve created a resource for professionals working with children and young people to raise awareness about the risks of this emerging technology. ➡️ https://lnkd.in/dnkvJCTi #Resource #Professionals #Education #YoungPeople #AI

  • Our latest report reveals the full scale of AI-generated child sexual abuse images and videos and ‘unsettling’ insight into offender views. In 2025, we identified 8,029 AI-generated images and videos of realistic child sexual abuse – a 14% increase in criminal AI content on the previous year. Our Senior Analyst Natalia [not her real name. IWF analysts’ identities are protected] said, “Every new development in generative AI is extolled for its ability to enhance the realism, to heighten the severity, or make more immersive, any conceivable sexual scenario with a child. This could be through adding audio to video, being able to depict multiple people interacting or even being able to successfully manipulate imagery of a real child known to an offender. “We know this affects victims and survivors, as its creation and distribution is just as keenly felt as with traditional forms of child sexual abuse.” Read more at https://lnkd.in/eQ3KshsB

  • 3 weeks to go until the launch of our Annual Data & Insights Report, which offers key insights into how abuse is created, distributed and monetised – as well as the systemic challenges that make eradication difficult. The report also identifies persistent and emerging harms and explains how we work with our partners to disrupt these systems and reduce harm to children.

  • Whether it's a 5K, a marathon, a mud run or a charity cycle, our incredible fundraising heroes lace up their trainers and go the distance to help us find and remove child sexual abuse imagery online. Every pound raised helps train our analysts, fund their specialist welfare, and keep our Hotline running. Want to turn your next sporting challenge into something extraordinary?
    🏃 Browse 750+ events
    🎽 We'll support you every step of the way & send you a free IWF t-shirt
    💷 Set up your JustGiving page and start fundraising today.
    Visit https://lnkd.in/evu-xYg3.

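The hash-and-block flow described in the Report Remove update above can be sketched roughly as follows. This is a minimal illustrative sketch, not IWF's actual system: real services use robust perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding, whereas this example uses a plain SHA-256 digest from Python's standard library, and every function and variable name here is hypothetical.

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    # Illustrative only: a cryptographic hash matches only byte-identical
    # copies; production systems use perceptual hashes that tolerate
    # re-encoding, resizing and minor edits.
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical platform-side blocklist of shared hashes.
blocked_hashes: set[str] = set()


def share_hash(image_bytes: bytes) -> None:
    # Only the fingerprint leaves the reporting service;
    # the image itself is never shared with platforms.
    blocked_hashes.add(fingerprint(image_bytes))


def should_block_upload(image_bytes: bytes) -> bool:
    # A platform checks incoming uploads against the shared hash list.
    return fingerprint(image_bytes) in blocked_hashes


share_hash(b"example-image-bytes")
print(should_block_upload(b"example-image-bytes"))    # True
print(should_block_upload(b"different-image-bytes"))  # False
```

The key property, as the update notes, is that only the fingerprint is distributed: platforms can match and remove copies of the content without the original image ever being transmitted to them.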
