Alexandria Ocasio-Cortez ‘shocked’ by porn deepfake in her likeness


Alexandria Ocasio-Cortez has demanded a crackdown on AI after viewing a deepfake pornographic image depicting her performing a sex act.

The New York congresswoman, 34, was “shocked” when she spotted the digitally altered image on X, the social media site. 

Ms Ocasio-Cortez has repeatedly been targeted with manipulated images and fake social media posts in her likeness since being elected in 2018 as the youngest woman to serve in Congress.

However, new AI tools have made such content much easier to create and far more realistic.

As a result, Ms Ocasio-Cortez has been involved in crafting a new law intended to bring an end to non-consensual, sexually explicit deepfakes. It would enable victims to take legal action against the producers and distributors of the content.

‘Get this off my screen’

Speaking about her own experience with the porn deepfake in February, she told Rolling Stone magazine her first thought was: “I need to get this off my screen.”

“There’s a shock to seeing images of yourself that someone could think are real,” she said. “As a survivor of physical sexual assault, it adds a level of dysregulation. It resurfaces trauma.

“It’s not as imaginary as people want to make it seem. It has real, real effects not just on the people that are victimised by it, but on the people who see it and consume it. Once you’ve seen it, you’ve seen it. 

“It parallels the same exact intention of physical rape and sexual assault, which is about power, domination, and humiliation. Deepfakes are absolutely a way of digitising violent humiliation against other people.”

AI images of Taylor Swift

Taylor Swift suffered a similar experience in January, when sexually explicit AI-generated images of the singer surfaced on social media.

Fabricated pornography accounted for 98 per cent of all deepfake videos posted online, according to a 2023 study by Home Security Heroes, a cyber security firm.

The proposed US law change, which has bipartisan support, would amend the Violence Against Women Act so that people can sue those who produce, distribute or receive the deepfake pornography, if they “knew or recklessly disregarded” that the victim did not consent to those images.
