New York (CNN) — “All we have to have is just a human form to be a victim.” That’s how lawyer Carrie Goldberg describes the risk of deepfake porn in the age of artificial intelligence.
While revenge porn, or the nonconsensual sharing of sexual images, has been around for nearly as long as the internet, the proliferation of AI tools means that anyone can be targeted by this form of harassment, even if they’ve never taken or sent a nude photo. Artificial intelligence tools can now superimpose a person’s face onto a nude body, or manipulate existing photos to make it look as if a person is not wearing clothes.
In the past year, targets of AI-generated, nonconsensual pornographic images have included prominent women such as Rep. Alexandria Ocasio-Cortez.
For someone discovering that they, or their child, have been made the subject of deepfake porn, the experience is typically scary and overwhelming, said Goldberg, who runs the New York-based firm C.A. Goldberg Law representing victims of sex crimes and online harassment. “Especially if they’re young and they don’t know how to cope and the internet is this big, huge, nebulous place,” she said.
But there are steps that targets of this form of harassment can take to protect themselves and places to turn for help, Goldberg told me in an interview on CNN’s new tech podcast, Terms of Service with Clare Duffy.
The show aims to demystify the new and emerging technologies that listeners encounter in their daily lives. (You can listen to the full conversation with Goldberg on the podcast.)
Goldberg said that for people targeted by AI-generated sexual images, the first step, however counterintuitive, should be to screenshot them.
“The knee-jerk reaction is to get this off the internet as soon as possible,” Goldberg said. “But if you want to be able to have the option of reporting it criminally, you need the evidence.”
Next, they can seek out the forms that many platforms provide to request removal of explicit images. Nonprofit organizations can also help facilitate the removal of such images across multiple platforms at once, although not all sites cooperate with them.
A bipartisan group of senators in August called on nearly a dozen tech firms, including X and Discord, to join such programs.
The fight to address nonconsensual explicit images and deepfakes has received rare bipartisan support. A group of people who had been affected by AI-generated porn testified at a hearing on Capitol Hill, where Republican Sen. Ted Cruz introduced a bill, supported by Democratic Sen. Amy Klobuchar and others, that would make it a crime to publish such images and require social media platforms to remove them upon notice from victims.
But, for now, victims are left to navigate a patchwork of state laws. In some places, there are no criminal laws preventing the creation or sharing of explicit deepfakes of adults. (AI-generated sexual images of children typically fall under child sexual abuse material laws.)
“My proactive advice is really to the would-be offenders which is just, like, don’t be a total scum of the earth and try to steal a person’s image and use it for humiliation,” Goldberg said. “There’s not much that victims can do to prevent this … We can never be fully safe in a digital society, but it’s kind of up to one another to not be total a**holes.”