AI-Generated Pornographic Images of Taylor Swift Spark Concerns Over Misuse of Artificial Intelligence

(CNN) — AI-generated pornographic images of the world’s most famous pop star spread across social media this week, underscoring the harmful potential of mainstream AI technology: its ability to create convincingly real and damaging images.

The fake photos of Taylor Swift were mostly circulated on the social media site X, formerly Twitter. The photos – which show the singer in sexually suggestive positions – were viewed tens of millions of times before they were removed from social media platforms. But nothing ever truly disappears from the internet, and they will undoubtedly continue to be shared on other, less regulated channels.

A Swift spokesperson did not respond to a request for comment.

Like most major social media platforms, X’s policies prohibit sharing “manipulated or out-of-context synthetic media that may deceive or confuse people and lead to harm.”

The company did not respond to CNN’s request for comment.

The incident comes as the United States heads into a presidential election year, and concerns are growing about how misleading images and videos generated by artificial intelligence will be used to fuel disinformation efforts and ultimately disrupt voting.

“This is a prime example of the ways in which AI is being unleashed for a variety of nefarious reasons without adequate guardrails to protect the public square,” Ben Decker, who runs Memetica, a cyber threat analysis agency, told CNN.

He added that the exploitation of generative AI tools to create potentially harmful content targeting all types of public figures is rapidly increasing and spreading faster than ever across social media.

“Social media companies don’t necessarily have effective plans to monitor content,” he continued.

X, for example, has significantly reduced its content moderation resources since Elon Musk acquired the platform in 2022.

Meta has also made cuts to its teams that address disinformation, trolling campaigns and coordinated harassment on its platforms, people with direct knowledge of the situation told CNN, raising concerns ahead of pivotal 2024 elections in the US and around the world.

It is not clear where the images of Taylor Swift originated. Although some of them also appeared on sites such as Instagram and Reddit, the problem was most widespread on X.

The incident also coincides with the rise of mainstream AI generation tools such as ChatGPT and Dall-E. But there is also a much broader world of unmoderated AI models available on open-source platforms, Decker said.

He added: “This points to a larger kind of rift in content moderation and platform governance, because if all the stakeholders – AI companies, social media companies, regulators and civil society – are not talking about the same things and are not on the same page about how to address this, then this kind of content will continue to spread.”

However, Decker said targeting Swift could draw more attention to the growing issues around AI-generated images. Swift’s massive fan following expressed their outrage on social media this week, bringing the issue to the forefront.

In 2022, Ticketmaster’s meltdown during presales for her Eras Tour sparked online outrage, leading to several legislative efforts to eliminate consumer-unfriendly ticketing policies.

Decker believes the same could happen with the abuse of AI-generated images.

“When you have personalities as big as Taylor Swift being targeted,” he said, “maybe this is what prompts lawmakers and tech companies to take action.” He added, “I think they need to, because she probably has more influence than almost anyone else on the internet.”

This type of technology has long been used to create what is known as “revenge porn” – explicit images of someone posted online without their consent – but it is getting renewed attention because of the offensive images of Swift.
