X blocks searches for Taylor Swift after explicit deepfakes go viral

X has disabled searches for Taylor Swift on its platform in an attempt to curb the spread of fake pornographic images depicting the singer that began circulating on social media last week.

Since last Sunday, searches for “Taylor Swift” on X have returned the error message, “Oops, something went wrong.” X blocked the search term after pledging to remove the deepfake AI-generated images from the platform and take “appropriate actions” against accounts that shared them.

“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” X said Friday in a post on its official Safety account.

Still, some fake images of the pop star continue to circulate on the social network, with some bad actors bypassing the search block by manipulating search terms, such as adding words between the entertainer’s first and last name, CBS MoneyWatch observed in a test of X’s internal search engine.

Reached for comment by CBS MoneyWatch, X replied, “Busy now, please check back later.”

The deepfake images amassed 27 million views and roughly 260,000 likes in 19 hours last week, NBC News reported. They also landed on other social networks, including Reddit and Facebook.


The images’ massive reach lays bare an increasingly pressing issue facing tech companies: how to remove deepfakes, or “synthetic media” images, from their platforms. More than 95,000 deepfake videos were disseminated online in 2023, a 550% increase over the number circulating the internet in 2019, according to cybersecurity firm Home Security Heroes’ latest report.

