A first-of-its-kind study highlights the stark gender disparity in AI-generated non-consensual intimate imagery and the evolving risks for women in politics and public life.
By Barbara Rodriguez and Jasmine Mithani for The 19th
More than two dozen members of Congress have been the victims of sexually explicit deepfakes, and the overwhelming majority of those affected are women, according to a new study that spotlights the stark gender disparity in this technology and the evolving risks for women’s participation in politics and other forms of civic engagement.
The American Sunlight Project (ASP), a think tank that studies disinformation and advocates for policies that promote democracy, released findings on Wednesday identifying more than 35,000 mentions of non-consensual intimate imagery depicting 26 members of Congress (25 women and one man) recently found on deepfake websites. Most of the imagery was quickly removed after researchers shared their findings with the affected members of Congress.
“We need to reckon with this new environment and the fact that the internet has opened up so many of these harms that disproportionately target women and marginalized communities,” said Nina Jankowicz, an online disinformation and harassment expert who founded the American Sunlight Project and is an author of the study.
Non-consensual intimate imagery, also known colloquially as deepfake porn (though advocates prefer the former term), can be created through generative AI or by overlaying headshots onto media of adult performers. There is currently limited policy restricting its creation and spread.
ASP shared the first-of-its-kind findings exclusively with The 19th. The group collected the data in part by developing a custom search engine to find members of the 118th Congress by first and last name, abbreviations, or nicknames on 11 well-known deepfake sites. Neither party affiliation nor geographic location affected the likelihood of being targeted for abuse, though younger members were more likely to be victimized. The largest factor was gender, with women members of Congress 70 times more likely than men to be targeted.
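The article does not publish ASP’s tooling, but the approach it describes, generating name variants for each member and querying each site for mentions, can be illustrated with a minimal sketch. Everything below is hypothetical: the member data, site list, and `count_mentions` stub are placeholders, not ASP’s actual code, data sources, or site list.

```python
# Hypothetical sketch of the name-variant search the study describes:
# build first/last-name, initialed, and nickname search terms per member,
# then tally mentions on each site. The fetch step is a stub.
from itertools import product

def name_variants(first: str, last: str, nicknames: list[str]) -> set[str]:
    """Search terms for one member: full name, initialed form,
    and each nickname combined with the last name."""
    variants = {f"{first} {last}", f"{first[0]}. {last}"}
    variants.update(f"{nick} {last}" for nick in nicknames)
    return variants

# Placeholder inputs; the real study covered every member of the
# 118th Congress and 11 deepfake sites it did not publicly enumerate.
members = [("Jane", "Doe", ["Janie"])]
sites = ["example-site-1", "example-site-2"]

def count_mentions(site: str, term: str) -> int:
    """Stub for querying one site; a real tool would scrape or hit a
    site-search endpoint and count matching results."""
    return 0  # placeholder

for (first, last, nicks), site in product(members, sites):
    for term in name_variants(first, last, nicks):
        hits = count_mentions(site, term)
        if hits:
            print(f"{term} on {site}: {hits} mentions")
```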
ASP did not release the names of the lawmakers depicted in the imagery, in order to avoid encouraging searches. The group did contact the offices of everyone affected to alert them and offer resources on online harms and mental health support. The study’s authors note that in the immediate aftermath, imagery targeting most of the members was entirely or almost entirely removed from the sites, a fact they are unable to explain. The researchers note that such removals do not prevent the material from being shared or uploaded again. In some cases involving lawmakers, search result pages remained indexed on Google even though much or all of the content had been removed.
“The removal may be coincidental. Regardless of what exactly led to removal of this content, whether ‘cease and desist’ letters, claims of copyright infringement, or other contact with the sites hosting deepfake abuse, it highlights a large disparity of privilege,” according to the study. “People, particularly women, who lack the resources afforded to members of Congress would be highly unlikely to achieve this rapid response from the creators and distributors of AI-generated NCII if they initiated a takedown request themselves.”
According to the study’s initial findings, nearly 16 percent of all the women currently serving in Congress, or about one in six congresswomen, have been victims of AI-generated non-consensual intimate imagery.
Jankowicz has been the target of online harassment and threats for her domestic and international work dismantling disinformation. She has also spoken publicly about being a victim of deepfake abuse, a fact she learned through a Google Alert in 2023.
“You can be made to appear in these compromised, intimate situations without your consent, and those videos, even if you were to pursue a copyright claim against the original poster, as in my case, proliferate around the internet without your control and without any consequence for the people who are amplifying or creating deepfake porn,” she said. “That continues to be a risk for anybody who is in the public eye, who is participating in public discourse, but in particular for women and for women of color.”
Image-based sexual abuse can have devastating mental health effects on victims, who include everyday people not involved in politics, including children. In the past year, there have been reports of high school girls being targeted for image-based sexual abuse in states including California, New Jersey, and Pennsylvania. School officials’ responses have varied, and the FBI has issued a new warning that sharing such imagery of minors is illegal.
Although the full extent of deepfakes’ impact on society is still coming into focus, research already shows that 41 percent of women between the ages of 18 and 29 self-censor to avoid online harassment.
“It’s a hugely powerful threat to democracy and free speech when almost half of the population silences itself for fear of the harassment it might experience,” said Sophie Maddocks, research director at the Center for Media at Risk at the University of Pennsylvania.
There is no federal law establishing criminal or civil penalties for those who generate and distribute AI-generated non-consensual intimate imagery. About a dozen states have enacted laws in recent years, though most provide civil penalties rather than criminal ones.
AI-generated non-consensual intimate imagery also poses a threat to national security by creating conditions for intimidation and geopolitical concessions. That can have ripple effects on policymakers whether or not they are the direct targets of the imagery.
“My hope here is that members are pushed into action when they recognize not only that this is affecting American women, but that it’s affecting them,” Jankowicz said. “It’s affecting their own colleagues. And this is happening simply because they are in the public eye.”
Image-based sexual abuse is a unique risk for women running for office. Susanna Gibson narrowly lost her competitive legislative race after a Republican operative shared non-consensual recordings of sexually explicit livestreams featuring the Virginia Democrat and her husband with The Washington Post. In the months following her loss, Gibson told The 19th that she heard from young women who were discouraged from running for office out of fear that intimate images would be used to harass them. Gibson has since founded a nonprofit dedicated to fighting image-based sexual abuse, along with an accompanying political action committee to support women candidates against violations of intimate privacy.
Maddocks has studied how women who speak out in public are more likely to experience digital sexual violence.
“We have this much longer pattern of ‘women should be seen and not heard,’ and it makes me think of Mary Beard’s writing and research on the idea that womanhood is antithetical to public speech,” Maddocks said. “So when women speak publicly, it’s almost like, ‘OK. Time to shame them. Time to strip them. Time to get them back in the house. Time to shame them into silence.’ And that silencing and that shaming motivation … we have to understand that in order to understand how this harm manifests as it relates to congresswomen.”
ASP is encouraging Congress to pass federal legislation. The Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, also known as the DEFIANCE Act, would allow people to sue those who create, share, or receive such imagery. The Take It Down Act would add criminal liability for such activity and require tech companies to take down deepfakes. Both bills have passed the Senate with bipartisan support but must still navigate concerns around free speech and definitions of harm, typical hurdles for tech policy, in the House of Representatives.
“It would be remiss of Congress to let this session lapse without passing at least one of these bills,” Jankowicz said. “It is one of the ways the harms of artificial intelligence are actually being felt by real Americans right now. It’s not a future harm; it’s not something we have to imagine.”
In the absence of congressional action, the White House has collaborated with the private sector on creative solutions to curb image-based sexual abuse. But critics are not optimistic about Big Tech’s ability to regulate itself, given the history of harm caused by its platforms.
“It’s so easy for perpetrators to create this content, and the signal is not just to the individual woman being targeted,” Jankowicz said. “It’s to women everywhere, saying, ‘If you take this step, if you raise your voice, this is a consequence that you might have to deal with.’”
If you have been a victim of image-based sexual abuse, the Cyber Civil Rights Initiative maintains a list of legal resources.