Carrie Goldberg, a lawyer specializing in online harassment, explains that anyone with a human form can become a victim of deepfake porn. While revenge porn has existed for years, the rise of AI tools means anyone can be targeted, even people who have never shared intimate photos. AI technology can superimpose a person’s face onto explicit images or alter existing photos to create false nude depictions. Recent victims have ranged from public figures like Taylor Swift and Rep. Alexandria Ocasio-Cortez to high school students.
For victims, the experience can be traumatic, especially for younger people who may not know how to navigate the vastness of the internet. Goldberg, who represents victims of such harassment, stresses the importance of taking action. The first step for those targeted is to take a screenshot of the image, even though the instinct may be to get it removed immediately. The screenshot preserves evidence, which is essential if the victim wishes to pursue legal action.
Victims can also request the removal of these images through platforms like Google, Meta, and Snapchat, using their designated forms. Organizations such as StopNCII.org and Take It Down can help remove such content from multiple platforms, though not all sites cooperate. In August, a bipartisan group of U.S. senators urged tech companies like X and Discord to join these initiatives.
While there is growing bipartisan support to address deepfake porn, including proposed legislation to criminalize it, victims currently face a patchwork of state laws. In some areas, there are no laws protecting adults from the creation or distribution of explicit deepfakes. AI-generated sexual images of children are typically covered under child sexual abuse material laws.
Because victims have few ways to prevent such abuse, Goldberg directs her advice at would-be offenders: do not use another person’s image to cause harm. It is impossible to be completely safe in a digital society, she notes, but greater responsibility and decency from others would help curb this type of harassment.