The White House on Friday (January 26, 2024) called the circulation of fake AI-generated explicit images of popular singer Taylor Swift on social media platforms ‘alarming’.
White House press secretary Karine Jean-Pierre said the circulation of false images of Taylor Swift was “alarming”.
She said the government will continue to work to find a solution to the problem.
“We know that incidences like this disproportionately impact women and girls. @POTUS is committed to ensuring we reduce the risk of fake AI images through executive action. The work to find real solutions will continue,” she said.
Reactions
US Representative Joe Morelle said the spread of AI-generated explicit pictures on social media was ‘appalling’.
He posted on X, “The spread of AI-generated explicit images of Taylor Swift is appalling—and sadly, it’s happening to women everywhere, every day.”
“It’s sexual exploitation, and I’m fighting to make it a federal crime with my legislation: the Preventing Deepfakes of Intimate Images Act,” he said.
Democratic Representative Yvette D. Clarke posted on X: “What’s happened to Taylor Swift is nothing new. For yrs, women have been targets of deepfakes w/o their consent. And w/ advancements in AI, creating deepfakes is easier & cheaper.”
“This is an issue both sides of the aisle & even Swifties should be able to come together to solve,” she said.
SAG-AFTRA expresses deep concern
Meanwhile, SAG-AFTRA, a labour union representing performers and broadcasters, described the images as ‘deeply concerning’.
“The sexually explicit, A.I.-generated images depicting Taylor Swift are upsetting, harmful, and deeply concerning,” the union posted on X.
What is a deepfake?
Deepfakes are synthetic media created by machine-learning algorithms; the name combines the deep-learning methods used to create them with the fake events they depict, according to the Discover Data Science website.
The term reportedly gained popularity in 2017 after a Reddit user going by the name "deepfakes" created a subreddit and began posting videos that used face-swapping technology to insert celebrities’ likenesses into existing pornographic videos.
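At a technical level, the face-swapping approach popularised by those early tools is often described as an autoencoder with one shared encoder and a separate decoder per identity. The following is a minimal, hypothetical sketch of that idea in PyTorch; the layer sizes, random stand-in “face” tensors, and training loop are illustrative assumptions, not code from any actual deepfake tool.

```python
# Toy sketch of the shared-encoder / per-identity-decoder idea behind early
# face-swap deepfakes. All shapes and data here are hypothetical placeholders.
import torch
import torch.nn as nn

def make_encoder():
    # Compress a 3x64x64 face crop into a small latent code.
    return nn.Sequential(
        nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
        nn.Flatten(),
        nn.Linear(64 * 16 * 16, 128),
    )

def make_decoder():
    # Reconstruct a 3x64x64 image from the shared latent code.
    return nn.Sequential(
        nn.Linear(128, 64 * 16 * 16), nn.ReLU(),
        nn.Unflatten(1, (64, 16, 16)),
        nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),     # 16 -> 32
        nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),   # 32 -> 64
    )

encoder = make_encoder()    # shared: learns identity-agnostic face structure
decoder_a = make_decoder()  # specialises in rendering person A
decoder_b = make_decoder()  # specialises in rendering person B

faces_a = torch.rand(8, 3, 64, 64)  # stand-in for a dataset of person A's face crops
faces_b = torch.rand(8, 3, 64, 64)  # stand-in for person B's face crops

params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(3):  # a real system would train for many thousands of steps
    optimizer.zero_grad()
    # Each decoder learns to reconstruct its own person from the shared encoding.
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()

# The "swap": encode person A's face, then decode it with person B's decoder,
# producing an image with A's pose and expression rendered as B.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a[:1]))
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```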