Why Black women should be worried about AI revenge porn
OPINION: Taylor Swift was recently the victim of explicit deepfake images, but women, people of color and queer folks are the primary targets of these kinds of AI attacks, with Black women bearing the brunt due to their intersectional identities.
Editor’s note: The following article is an op-ed, and the views expressed are the author’s own. Read more opinions on theGrio.
When Taylor Swift became the victim of AI image-based sexual abuse through nonconsensual deepfakes, it revealed yet another distressing consequence of the relentless pursuit of advanced AI tools. According to one analysis, the photos likely originated on the misogynist 4chan platform before making their way to a mainstream platform like X, where they spread like wildfire. What's most disturbing is the ease with which users found loopholes in tools like Microsoft Designer to create these images using openly available techniques developed by AI researchers. These techniques allow anyone to become an abuser, swapping women's faces into explicit content; there are even downloadable apps that create AI revenge porn on demand.
But what if the victim wasn't Taylor Swift? Amid discussions on AI, insufficient attention has been paid to the use of AI to create nonconsensual pornographic content that harasses, abuses and pushes people offline. Research estimates that between 90% and 95% of deepfakes online since 2021 have been nonconsensual pornography. Women, people of color and queer folks are the primary targets of these kinds of AI attacks, with Black women bearing the brunt due to their intersectional identities.
Recent incidents of AI revenge porn have targeted female students with virtual rapes, used the likeness of journalists to spread fake news and created deepfake porn of female gamers. In the absence of Swifties’ mass reporting or pressure from the celebrity-focused media, there are often minimal consequences for those responsible or support for the victims. One female gamer even had to pay to have deepfake porn of her removed from sites.
In response to the Swift incident, one CNN columnist wrote, "We are in an era where it's not just our data that's up for grabs, it's our most intimate qualities: Our voices, our faces, our bodies can all now be mimicked by AI." However, Black women's voices, faces and, particularly, our bodies have always been "up for grabs." Beginning with the trans-Atlantic slave trade, rape and sexual exploitation became a tool of terror to control Black women's bodies and reproduction.
This tradition of exploiting Black women's bodies has been well documented, from the torture of the enslaved women on whose bodies modern gynecology was built, to the use of Henrietta Lacks' cells in modern medicine without her or her family's knowledge or consent, to Black AI influencers designed, controlled and profited from by their white creators. In each instance, the humanity of Black women has been erased or ignored in favor of scientific and technological progress purported to benefit humanity as a whole.
Furthermore, AI is known to recreate and even exacerbate historical gender and racial biases. In one study, although 34% of American judges are women, only 3% of AI-generated images for the word “judge” were women. Even though 70% of fast-food workers in the United States are white, 70% of the images produced by AI were of dark-skinned people.
Joy Buolamwini’s groundbreaking MIT study showed that facial recognition software was inaccurate 35% of the time for dark-skinned women as opposed to only 0.8% for lighter-skinned/white men. Despite improvements in creating images of Black women, concerns linger among Black artists about AI producing harmful and distorted representations.
While the explicit AI-generated images of Swift were removed following the public outcry, the prevalence of AI image-based sexual abuse through nonconsensual deepfakes intensifies the challenge faced by Black women, who often lack support or protection online. Black women are more likely to be the victims of hate speech and harassment online and more likely to face biased content moderation that silences our voices for speaking up about our lived experiences. It is only a matter of time before Black women also become the overwhelming victims of image-based sexual abuse through nonconsensual deepfakes.
Addressing this issue requires more than the moderation of pornographic material on social media platforms, most of which are ill-equipped and uninterested in proactively managing the audiovisual manipulation seen in AI deepfakes. Therefore, it is imperative not only to revise revenge porn laws but also to keep them updated to address evolving technology.
Although 48 states and D.C. have laws to combat revenge porn, most require malicious intent. Oftentimes, however, young boys and men create these nonconsensual images to monetize them, as a form of male bonding or simply because they don't see women and girls as human beings. The reason behind the creation of AI image-based sexual abuse should matter less than the fact that it is nonconsensual and harms the victim. The laws need to be updated to reflect that impact.
Secondly, laws need to go a step further to explicitly criminalize and prohibit the creation and distribution of image-based sexual abuse at the federal and state levels in order to provide legal recourse for victims. The bipartisan Preventing Deepfakes of Intimate Images Act is a good start. However, criminalizing an act alone doesn't necessarily deter it. Therefore, educating young boys and men about the consequences of this behavior for women, for their own futures and for society as a whole would help change behavior in the long term.
Lastly, creating policies without acknowledging who the primary victims are will not address the misogyny, patriarchy and anti-Blackness driving AI image-based sexual abuse and will fail to protect Black women. Black women deserve to feel safe both on and offline.
France François is a social impact leader who thrives at the intersection of tech, politics, and people. Find her online at @franceisacountry.