1 in 6 congresswomen is the target of sexually explicit deepfakes generated by AI – Brit Commerce

More than two dozen members of Congress have been victims of sexually explicit deepfakes, and an overwhelming majority of those affected are women, according to a new study that highlights the stark gender disparity in this technology and the evolving risks to women’s participation in politics and other forms of civic engagement.

The American Sunlight Project (ASP), a think tank that investigates misinformation and advocates for policies that promote democracy, released findings Wednesday identifying more than 35,000 mentions of nonconsensual intimate images (NCII) depicting 26 members of Congress (25 women and one man) that were recently found on deepfake websites. Most of the images were quickly deleted after researchers shared their findings with the affected members of Congress.

“We need to take into account this new environment and the fact that the Internet has opened up a lot of these harms that disproportionately affect women and marginalized communities,” said Nina Jankowicz, an expert on online harassment and misinformation who founded The American Sunlight Project and is an author of the study.

Non-consensual intimate images, known colloquially as deepfake porn (though advocates prefer the former term), can be created through generative AI or by superimposing headshots onto media of adult performers. There is currently limited policy to restrict their creation and spread.

ASP shared the first-of-its-kind findings exclusively with The 19th. The group collected data in part by developing a custom search engine to find members of the 118th Congress by first and last name, abbreviations, or nicknames on 11 known deepfake sites. Neither party affiliation nor geographic location had an impact on the likelihood of being targeted, although younger members were more likely to be victimized. The most significant factor was gender: female members of Congress were 70 times more likely than men to be targeted.

The ASP did not reveal the names of the legislators who appeared in the images, to avoid encouraging searches. Researchers contacted the offices of all those affected to alert them and offer resources on online harm and mental health support. The study’s authors note that immediately afterward, images targeting most members were completely or almost completely removed from the sites, a fact they cannot explain. Researchers note that such removals do not prevent the material from being shared or uploaded again. In some cases involving lawmakers, search results pages remained indexed on Google even though the content had been largely or completely removed.

“The removal may be coincidental. Regardless of what exactly led to the removal of this content (whether ‘cease and desist’ letters, claims of copyright infringement, or other contact with the sites hosting deepfake abuse), it highlights a large disparity of privilege,” according to the study. “People, especially women, who lack the resources afforded to members of Congress would be highly unlikely to achieve this rapid response from the creators and distributors of AI-generated NCII if they initiated a takedown request themselves.”

According to the study’s initial findings, nearly 16 percent of all women currently serving in Congress (about 1 in 6 congresswomen) are victims of AI-generated non-consensual intimate images.

Jankowicz has been subjected to online harassment and threats for her national and international work dismantling disinformation. She has also spoken publicly about being a victim of deepfake abuse, a fact she discovered through a Google alert in 2023.

“You can be made to appear in these intimate and compromising situations without your consent, and those videos, even if you pursue a copyright claim against the original poster, as in my case, proliferate on the Internet beyond your control and without any consequence for the people who are amplifying or creating deepfake pornography,” she said. “That remains a risk for anyone who is in the public eye, who participates in public discourse, but particularly for women and women of color.”

Image-based sexual abuse can have devastating effects on the mental health of victims, who include ordinary people not involved in politics, including children. In the past year, there have been reports of high school girls being subjected to image-based sexual abuse in states like California, New Jersey and Pennsylvania. School officials have responded with varying degrees of urgency, and the FBI has issued a warning that sharing such images of minors is illegal.

The full impact of deepfakes on society is still coming to light, but research already shows that 41 percent of women aged 18 to 29 self-censor to avoid online harassment.

“It is a hugely powerful threat to democracy and freedom of expression if almost half the population remains silent because they are afraid of the harassment they could face,” said Sophie Maddocks, research director at the Center for Media at Risk at the University of Pennsylvania.

There is no federal law establishing criminal or civil penalties for anyone who generates and distributes non-consensual AI-generated intimate images. About a dozen states have enacted such laws in recent years, although most impose civil, not criminal, sanctions.

AI-generated non-consensual intimate images also pose national security threats by creating conditions for blackmail and geopolitical concessions. This could have a ripple effect on policymakers, regardless of whether they are directly targeted by the images.

“My hope here is that members will be forced to act when they recognize not only that this is affecting American women, but that it is affecting them,” Jankowicz said. “It is affecting their own colleagues. And this happens simply because they are in the public eye.”

Image-based sexual abuse is a unique risk for women running for public office. Susanna Gibson narrowly lost her competitive legislative race after a Republican operative shared nonconsensual recordings of sexually explicit livestreams featuring the Virginia Democrat and her husband with The Washington Post. In the months following her loss, Gibson told The 19th that she heard from young women who were discouraged from running for public office out of fear that intimate images would be used to harass them. Gibson has since started a nonprofit organization dedicated to fighting image-based sexual abuse, along with an accompanying political action committee to support candidates against violations of intimate privacy.

Maddocks has studied how women who speak in public are more likely to experience digital sexual violence.

“We have this much longer pattern of ‘women should be seen and not heard’ that makes me think of the writings and research of Mary Beard on this idea that femininity is the antithesis of public discourse. So when women speak in public, it’s almost like, ‘Okay. It’s time to shame them. It’s time to undress them. It’s time for them to go back home. It’s time to shame them into silence.’ And that silencing and shaming motivation… we have to understand that in order to understand how this harm manifests itself when it comes to congresswomen.”

The ASP is encouraging Congress to pass federal legislation. The Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, also known as the DEFIANCE Act, would allow people to sue anyone who creates, shares or receives such images. The TAKE IT DOWN Act would include criminal liability for such activity and require technology companies to remove deepfakes. Both bills have passed the Senate with bipartisan support, but in the House they must navigate concerns around free speech and definitions of harm, which are typical hurdles for tech policy.

“It would be a dereliction of duty for Congress to let this session pass without passing at least one of these bills,” Jankowicz said. “It’s one of the ways real Americans are feeling the harm of artificial intelligence right now. It is not future damage. It’s not something we have to imagine.”

In the absence of action from Congress, the White House has collaborated with the private sector to devise creative solutions to curb image-based sexual abuse. But critics are not optimistic about Big Tech’s ability to regulate itself, given the history of harm caused by its platforms.

“It’s very easy for perpetrators to create this content, and the signal is not just directed at the individual woman who is being attacked,” Jankowicz said. “It’s for women everywhere, saying, ‘If you take this step, if you raise your voice, that’s a consequence you’re going to have to deal with.’”

If you have been a victim of image-based sexual abuse, the Cyber Civil Rights Initiative maintains a list of legal resources.

This article was originally posted on The Markup and was republished under Creative Commons Attribution-NonCommercial-NoDerivatives license.
