Taylor Swift deepfakes spread online, sparking outrage
Pornographic deepfake images of Taylor Swift are circulating online, making the singer the most famous victim of a scourge that tech platforms and anti-abuse groups have struggled to fix.
Sexually explicit and abusive fake images of Swift began circulating widely this week on the social media platform X.
Her ardent fanbase of "Swifties" quickly mobilized, launching a counteroffensive on the platform formerly known as Twitter and a #ProtectTaylorSwift hashtag to flood the social media site with more positive images of the pop star. Some said they were reporting accounts that were sharing the deepfakes.
The Screen Actors Guild released a statement on the issue Friday, calling the images of Swift "upsetting, harmful, and deeply concerning," adding that "the development and dissemination of fake images — especially those of a lewd nature — without someone's consent must be made illegal."
The deepfake-detecting group Reality Defender said it tracked a deluge of nonconsensual pornographic material depicting Swift, particularly on X. Some images also made their way to Meta-owned Facebook and other social media platforms.
"Unfortunately, they spread to millions and millions of users by the time that some of them were taken down," said Mason Allen, Reality Defender's head of growth.
The researchers found at least a couple dozen unique AI-generated images. The most widely shared were football-related, showing a painted or bloodied Swift that objectified her, and in some cases, inflicted violent harm on her deepfake persona.
This comes after an AI-generated video featuring Swift's likeness endorsing a fake Le Creuset cookware giveaway made the rounds online earlier this month. It was unclear who was behind that scam, and Le Creuset issued an apology to those who may have been duped.
Researchers have said the number of explicit deepfakes has grown in the past few years as the technology used to produce such images has become more accessible and easier to use. In 2019, a report released by the AI firm DeepTrace Labs showed these images were overwhelmingly weaponized against women. Most of the victims, it said, were Hollywood actors and South Korean K-pop singers.
Brittany Spanos, a senior writer at Rolling Stone who teaches a course on Swift at New York University, says Swift's fans are quick to mobilize in support of the artist, especially those who take their fandom very seriously and particularly in situations of perceived wrongdoing.
"This could be a huge deal if she really does pursue it to court," she said.
When reached for comment on the fake images of Swift, X directed the Associated Press to a post from its safety account that said the company strictly prohibits the sharing of non-consensual nude images on its platform. The company has sharply cut back its content-moderation teams since Elon Musk took over the platform in 2022.
"Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them," the company wrote in the X post early Friday morning. "We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed."
Meanwhile, Meta said in a statement that it strongly condemns "the content that has appeared across different internet services" and has worked to remove it.
"We continue to monitor our platforms for this violating content and will take appropriate action as needed," the company said.
A representative for Swift didn't immediately respond to a request for comment Friday.
Allen said researchers are 90% confident that the images were created by diffusion models, a type of generative artificial intelligence model that can produce new, photorealistic images from written prompts. The most widely known are Stable Diffusion, Midjourney and OpenAI's DALL-E. Allen's group didn't try to determine their provenance.
Microsoft, which offers an image-generator based partly on DALL-E, said Friday that it was in the process of investigating whether its tool was misused. Much like other commercial AI services, it said it doesn't allow "adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service."
Asked about the Swift deepfakes on "NBC Nightly News," Microsoft CEO Satya Nadella said Friday that there's a lot still to be done in setting AI safeguards and "it behooves us to move fast on this."
"Absolutely this is alarming and terrible, and so therefore yes, we have to act," Nadella said.
Midjourney, OpenAI and Stable Diffusion-maker Stability AI didn't immediately respond to requests for comment.
Federal lawmakers who've introduced bills to further restrict or criminalize deepfake porn said the incident shows why the U.S. needs to implement better protections.
"For years, women have been victims of non-consensual deepfakes, so what happened to Taylor Swift is more common than most people realize," said Rep. Yvette D. Clarke, a Democrat from New York, who's introduced legislation that would require creators to digitally watermark deepfake content.
Rep. Joe Morelle, another New York Democrat pushing a bill that would criminalize sharing deepfake porn online, said what happened to Swift was disturbing and has become more and more pervasive across the internet.
"The images may be fake, but their impacts are very real," Morelle said in a statement. "Deepfakes are happening every day to women everywhere in our increasingly digital world, and it's time to put a stop to them."