Rethinking the lifecycle of AI when it comes to deepfakes and kids
Podcast: Marketplace Tech
Published: Mon, May 6, 2024
Description: The following content may be disturbing to some listeners. For years, child sexual abuse material was mostly distributed by mail, and authorities used traditional investigative techniques to stem its spread. That became much harder when the internet came along, and AI has supercharged the problem. “Those 750,000 predators that are online at any given time looking to connect with minor[s] … they just need to find a picture of a child and use the AI to generate child sexual abuse materials and superimpose these faces on something that is inappropriate,” says child safety advocate and TikToker Tiana Sharifi. The nonprofit Thorn has created new design principles aimed at fighting child sexual abuse. Rebecca Portnoff, the organization’s vice president of data science, says tech companies need to develop better technology to detect AI-generated images and commit to not using this material to train AI models.