1 in 6 congresswomen targeted by sexually explicit deepfakes generated by artificial intelligence



More than two dozen members of Congress have been the victims of sexually explicit deepfakes — and the overwhelming majority of those affected are women, according to a new study that highlights the stark gender disparity in this technology and the evolving risks it poses to women’s participation in politics and other forms of civic engagement.

The American Sunlight Project (ASP), a think tank that researches disinformation and advocates for policies that advance democracy, released findings Wednesday identifying more than 35,000 mentions of non-consensual intimate imagery (NCII) depicting 26 members of Congress — 25 women and one man — that were recently found on deepfake websites. Most of the images were quickly removed after researchers shared their findings with the affected members of Congress.

“We need to kind of reckon with this new environment and the fact that the internet has opened up so many of these harms that are disproportionately targeting women and marginalized communities,” said Nina Jankowicz, an expert on disinformation and online harassment who founded The American Sunlight Project and is an author of the study.

Non-consensual intimate imagery, also known colloquially as deepfake porn (though advocates prefer the former term), can be created through generative AI or by overlaying a person’s headshot onto media of adult performers. There is currently limited policy to restrict its creation and spread.

ASP shared the first-of-its-kind findings exclusively with The 19th. The group collected the data in part by developing a custom search engine to look up members of the 118th Congress by first and last name, abbreviations or nicknames, on 11 well-known deepfake websites. Neither party affiliation nor geographic location had an effect on the likelihood of being targeted for abuse, though younger members were more likely to be victimized. The biggest factor was gender: women members of Congress were 70 times more likely than men to be targeted.

ASP did not publish the names of the lawmakers depicted in the images, to avoid encouraging searches. It contacted the offices of all those affected to alert them and offer resources on online harms and mental health support. The study’s authors noted that in the immediate aftermath, images targeting most members were entirely or almost entirely removed from the sites — a fact they could not explain. The researchers caution that such removals do not prevent the material from being shared or uploaded again. In some cases involving lawmakers, search result pages remained indexed on Google even though the content had been largely or completely removed.

“The removal may have been coincidental. Regardless of what exactly led to the removal of this content — whether ‘cease and desist’ letters, claims of copyright infringement, or other contact with the sites hosting deepfake abuse — it highlights a large disparity of privilege,” according to the study. “People, especially women, who lack the resources afforded to members of Congress would be highly unlikely to achieve this rapid response from the creators and distributors of AI-generated NCII if they initiated a takedown request themselves.”

According to the study’s preliminary findings, nearly 16% of all women currently serving in Congress — or about 1 in 6 congresswomen — are victims of non-consensual AI-generated intimate images.

Jankowicz has been the target of online harassment and threats for her domestic and international work dismantling disinformation. She has also spoken publicly about being a victim of deepfake abuse — a fact she discovered through a Google Alert in 2023.

“You can be made to appear in these compromising, intimate situations without your consent, and those videos — even if you were to, say, pursue a copyright claim against the original poster — as in my case, proliferate around the internet without your control and without any sort of consequence for the people who are amplifying or creating deepfake porn,” she said. “That continues to be a risk for anybody who is in the public eye, who is participating in public discourse, but especially for women and for women of color.”

Image-based sexual abuse can have devastating effects on the mental health of victims, including everyday people who are not involved in politics — including children. In the past year, there have been reports of high school girls being targeted with image-based sexual abuse in states like California, New Jersey and Pennsylvania. School officials have responded with varying degrees of urgency, though the FBI has also issued a new warning that sharing such images of minors is illegal.

The full impact of deepfakes on society is still coming into focus, but research already shows that 41 percent of women between the ages of 18 and 29 self-censor to avoid online harassment.

“That is a hugely powerful threat to democracy and free speech, if we have almost half of the population silencing themselves because they’re scared of the harassment they could experience,” said Sophie Maddocks, research director at the Center for Media at Risk at the University of Pennsylvania.

There is no federal law that establishes criminal or civil penalties for someone who creates and distributes AI-generated non-consensual intimate imagery. About a dozen states have enacted laws in recent years, though most include civil penalties, not criminal ones.

AI-generated non-consensual intimate imagery also opens up threats to national security by creating conditions for blackmail and geopolitical concessions. That could have ripple effects on policymakers regardless of whether they are the direct targets of the imagery.

“My hope here is that the members are spurred into action when they recognize not only that it’s affecting American women, but it’s affecting them,” Jankowicz said. “It’s affecting their own colleagues. And this is happening simply because they are in the public eye.”

Image-based sexual abuse poses a unique risk for women running for office. Susanna Gibson narrowly lost her competitive legislative race after a Republican operative shared non-consensual recordings of sexually explicit livestreams featuring the Virginia Democrat and her husband with The Washington Post. In the months after her loss, Gibson told The 19th she heard from young women discouraged from running for office for fear that intimate images would be used to harass them. Gibson has since launched a nonprofit dedicated to fighting image-based sexual abuse and an accompanying political action committee to support women candidates against violations of intimate privacy.

Maddocks has studied how women who speak out in public are more likely to experience digital sexual violence.

“We have this much longer, ‘women should be seen and not heard’ pattern that makes me think of Mary Beard’s writing and research on this idea that womanhood is antithetical to public speech. So when women speak publicly, it’s almost like, OK, time to shame them. Time to strip them. Time to get them back in the house. Time to shame them into silence. And that silencing and that shaming motivation — we have to understand that in order to understand how this harm is manifesting as it relates to congresswomen.”

ASP is encouraging Congress to pass federal legislation. The Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, also known as the DEFIANCE Act, would allow people to sue anyone who creates, shares or receives such imagery. The Take It Down Act would include criminal liability for such activity and require tech companies to take down deepfakes. Both bills have passed the Senate with bipartisan support, but must navigate concerns around free speech and definitions of harm, typical obstacles to tech policy, in the House.

“It would be a dereliction of duty for Congress to let this session lapse without passing at least one of these bills,” Jankowicz said. “It is one of the ways that the harm of AI is actually being felt by real Americans right now. It’s not a future harm. It’s not something that we have to imagine.”

In the absence of congressional action, the White House has collaborated with the private sector to devise creative solutions to curb image-based sexual abuse. But critics aren’t optimistic about Big Tech’s ability to regulate itself, given the history of harm caused by its platforms.

“It’s so easy for perpetrators to create this content, and the signal is not just to the individual woman being targeted,” Jankowicz said. “It’s to women everywhere, saying: if you take this step, if you raise your voice, this is a consequence that you might have to deal with.”

If you have been a victim of image-based sexual abuse, the Cyber Civil Rights Initiative maintains a list of legal resources.

This article was originally published on The Markup and is republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.
