X blocks search for Taylor Swift over deepfake pictures

Alex Omenye

In response to a recent surge in explicit AI-generated images of recording artist Taylor Swift, X, the platform formerly known as Twitter, has blocked searches for the singer.

Searches for terms such as “Taylor Swift” or “Taylor Swift AI” on X now return a “Something went wrong” message, as confirmed by X’s head of business operations, Joe Benarroch.

This temporary measure is aimed at prioritizing user safety, according to a statement reported by The Wall Street Journal.

X subsequently issued a statement acknowledging the situation, saying it was actively removing all identified AI-generated images and taking action against the accounts responsible for posting them.

The platform explicitly prohibits non-consensual nudity and the dissemination of synthetic and manipulated media.

Meta, the parent company of Threads and Instagram, also appears to be addressing the issue on its platforms. While both apps suggest “Taylor Swift AI” when users begin typing “Taylor” into the search box, neither currently displays results for it. Instead, a message states that the term “is sometimes associated with activities of dangerous organizations and individuals.”

In response to these developments, reports suggest that Taylor Swift is contemplating legal action against the websites hosting the images, which were reportedly created largely using Microsoft Designer.

Microsoft CEO Satya Nadella expressed concern about these deepfakes, deeming them “alarming and terrible” and emphasizing the need for AI companies to swiftly implement stronger safeguards.

White House press secretary Karine Jean-Pierre has also urged Congress to enact legislation protecting individuals from the spread of non-consensual deepfake pornography.

