AI-generated nude images of Taylor Swift went viral on X

Sexually explicit AI-generated images of Taylor Swift went viral on X on Wednesday, accumulating over 27 million views and 260,000 likes within 19 hours before the account responsible for posting the content was suspended.

The deepfakes depicted Swift nude and in sexually explicit scenarios, and continued to circulate on X despite the initial account suspension. Such content is typically generated using AI tools that either fabricate images entirely or manipulate real photographs to simulate explicit scenarios.

The source of these deepfakes remains unclear, but a watermark on the images suggests an association with a long-standing website known for publishing fake nude celebrity images, particularly under a section labeled "AI deepfake."

Reality Defender, an AI-detection software company, analyzed the images and found a high probability that AI technology was used to create them.

That these deepfakes spread unchecked for nearly a day underscores the growing problem of AI-generated content and misinformation online. Despite the escalating severity of the issue, platforms like X, which has developed its own generative-AI products, have yet to implement or publicly discuss tools to detect generative-AI content that violates their guidelines.

The most widely shared deepfakes depicted Swift nude in a football stadium. Swift has been subjected to misogynistic attacks for attending NFL games in support of her partner, Kansas City Chiefs player Travis Kelce. In an interview with Time, Swift acknowledged the backlash, stating, “I have no awareness of if I’m being shown too much and pissing off a few dads, Brads, and Chads.”

X did not immediately respond to requests for comment. A representative for Swift declined to comment on the record.

While X bans manipulated media that could harm specific individuals, it has been slow to address sexually explicit deepfakes on its platform. Earlier in January, a 17-year-old Marvel star spoke out about finding explicit deepfakes of herself on X, having had limited success getting them removed.

As of Thursday, NBC News found similar content still present on the platform. In June 2023, an NBC News review revealed nonconsensual sexually explicit deepfakes of TikTok stars on X. After contacting X for comment, some of the material was removed, but not all.

According to some of Swift’s fans, it was neither Swift’s team nor X that got the most prominent images taken down; their removal was the result of a mass-reporting campaign.
