Search engines DuckDuckGo, Bing, and Google are facing criticism for including nonconsensual artificial intelligence deepfake pornography at the top of some search results.
Explicit deepfakes are made by taking existing sexual content and replacing the performer’s face with the likeness of a real person, such as a celebrity, without that person’s consent.
In tests by NBC News, searching a combination of a name and the term “deepfakes” for a sample of 36 female celebrities showed that Bing and Google nearly always displayed nonconsensual deepfake videos and photos at the top of search results: Google returned such top-ranked results for 34 of the celebrities, and Bing for 35. NBC News noted that DuckDuckGo has comparable problems but did not say how severe they are.
Additionally, NBC News reports that when searching the keyword “fake nudes,” news articles about the problem of nonconsensual deepfake pornography appear only below the offending content at the top of the results.
This is only one of many instances of AI being used maliciously or unethically. Microsoft is among the companies working on deepfake detection tools to curb abuse of the technology ahead of the elections.
In a related effort, the FTC is seeking practical methods to identify AI-generated phony audio recordings. Submissions for its Voice Cloning Challenge are due today, with a $25,000 grand prize for the winner.