A 15-year-old girl from Aledo, Texas, became a victim of AI-generated deepfake pornography when a classmate altered her Instagram photo and shared the manipulated image on Snapchat. Elliston Berry’s mother, Anna McAdams, recalled the morning after Homecoming when her distraught daughter came to her in tears, recounting an ordeal that would leave the fake image circulating online for nine months. The incident highlights a growing problem: more than 21,000 deepfake pornographic videos were reported last year, a 460% increase over the previous year.
In response to similar cases, the San Francisco City Attorney’s office has filed a lawsuit against 16 websites that specialize in creating AI-generated nude images. The sites reportedly garnered 200 million visits in six months. City Attorney David Chiu emphasized the scale of the problem, noting that at least 90 such websites exist, while Chief Deputy City Attorney Yvonne Mere labeled the practice “sexual abuse.” Meanwhile, bipartisan efforts in Congress, including the Take It Down Act sponsored by Sen. Ted Cruz and Sen. Amy Klobuchar, aim to hold tech platforms accountable for promptly removing non-consensual, AI-manipulated content.
The bill recently passed the Senate and is awaiting a House vote. Platforms like Snapchat have reiterated their zero-tolerance policies but continue to face criticism from affected families. Elliston, now focused on advocacy, expressed her determination to prevent similar incidents: “I can’t undo what happened, but I can work to stop this from happening to others.”