The UK government has announced that it is funding five projects to snoop on your device's content, supposedly in a quest to seek out child porn. But surely these technologies will have wider uses. The five projects are the winners of the Safety Tech
Challenge Fund, which aims to encourage the tech industry to find practical solutions to combat child sexual exploitation and abuse online, without impacting people's rights to privacy and data protection in their communications. The winners will
each receive an initial £85,000 from the Fund, administered by the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, to help them bring to market their technical proposals for new digital tools and applications to combat online child abuse. Based across the UK and Europe, and in partnership with leading UK universities, the winners are:
- Edinburgh-based Cyan Forensics and Crisp Thinking, in partnership with the University of Edinburgh and the Internet Watch Foundation, will develop a plug-in to be integrated within encrypted social platforms. It will detect child sexual abuse material (CSAM) by matching content against known illegal material (a hash-list approach; see the sketch after this list).
- SafeToNet and Anglia Ruskin University will develop a suite of live video-moderation AI technologies that can run on any smart device to prevent the filming of nudity, violence,
pornography and CSAM in real time, as it is being produced.
- GalaxKey, based in St Albans, will work with Poole-based Image Analyser and Yoti, an age-assurance company, to develop software focusing on user privacy, detection and prevention of CSAM and predatory behavior, and age verification, to detect child sexual abuse before it reaches an end-to-end encrypted (E2EE) environment, preventing it from being uploaded and shared.
- DragonflAI, based in Edinburgh, will also work with Yoti to combine their
on-device AI nudity-detection technology with age-assurance technologies to spot new indecent images within E2EE environments.
- Austria-based T3K-Forensics will implement its AI-based child sexual abuse detection technology on smartphones to detect newly created material, providing a toolkit that social platforms can integrate with their E2EE services.
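
For readers unfamiliar with the first approach, "matching content against known illegal material" generally means fingerprinting files on the device and comparing the fingerprints against a database of previously identified material, before or as the content enters the encrypted channel. The sketch below is a minimal illustration of that idea, assuming a plain SHA-256 digest and a made-up hash list; production systems such as PhotoDNA or the Internet Watch Foundation's hash list use perceptual hashes instead, so that resized or re-encoded copies still match.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a database of known-material fingerprints
# (e.g. an IWF-style hash list). A plain SHA-256 only matches
# byte-identical files; real systems use perceptual hashes so that
# re-encoded or resized copies still match.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large media files aren't loaded into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_material(path: Path) -> bool:
    """Return True if the file's fingerprint appears in the known-material list."""
    return sha256_of(path) in KNOWN_HASHES

if __name__ == "__main__":
    import sys
    for name in sys.argv[1:]:
        print(name, is_known_material(Path(name)))
```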