Google and Bing, the world's most widely used search engines, are surfacing non-consensual celebrity deepfake porn at the top of their results pages. The main victims are famous women, shown in fake nudes or fabricated sexual scenes created with artificial intelligence.
Searches for generic terms such as "deepfake porn" or "fake nudes" return millions of results about actresses, singers and other artists. This happens on both Google and Bing. Some links lead to well-known pornographic portals, while others point to sites that specialize in generating this type of fake content.
NBC News ran a more thorough test: it searched the word "deepfakes" alongside the names of 36 female celebrities. Non-consensual deepfake porn using the likenesses of 34 of these women appeared in the top Google results, and fake photos and videos of 35 of them appeared on Bing. More than half of the top results were links to two web portals that market themselves as specialists in this AI-generated material.
Sensity, an Amsterdam-based company dedicated to detecting fake content, warned as early as 2019 that 96% of deepfake videos on the Internet were pornographic. Most of the victims were women who had not given their consent. This abuse has now been amplified by the launch of new artificial intelligence tools.
Deepfake porn in the midst of the artificial intelligence boom
The NBC News investigation reports that a Google search for "fake nudes" returns several links to applications and programs for creating deepfakes. These pages appear before articles denouncing the harm and danger of this trend.
Copilot, Microsoft's artificial intelligence chatbot, responded to a question on the topic by saying that "the use of deepfakes is unethical and can have serious consequences." Yet Bing also offers dozens of apps for creating deepfakes among its featured results.
Google allows victims to request the removal of this content from search results through a form, but it does not proactively search for or delete deepfake porn. "We only review the URLs that you or your authorized representative submit in the form," its help page states.
The Google Play Store prohibits "apps that promote or perpetuate demonstrably misleading or deceptive images, videos and/or texts." Even so, it still hosts FaceMagic, an app that was at one point promoted directly as a tool for making deepfake porn.
Content appearing in Bing's top image search results includes explicit fake photos of former Disney Channel teen actresses. "Some of the images use photographs of their faces that appeared to have been taken before they turned 18," the NBC News report says.
What do Google and Microsoft say?
"We understand how distressing this content can be for people affected by it, and we are actively working to bring more protections to Search," a Google spokesperson told NBC News. The technology company explained that, like any search engine, it indexes content that exists on the web. "But we actively design our ranking systems to avoid shocking people with unexpectedly explicit or harmful content they are not looking for," it added in a statement.
Microsoft clarified in August of last year that it treats deepfake porn as falling within its category of non-consensual intimate imagery (NCII). Like Google, it provides a form through which victims can report this type of content.
“The distribution of NCII is a serious violation of privacy and personal dignity with devastating effects for victims,” a company spokesperson said. “Microsoft prohibits NCII on our platforms and services, including soliciting NCII or promoting the production or redistribution of intimate images without the victim’s consent.”
The responsibility for stopping this abusive practice does not rest solely with search engines. Pornhub, the largest pornography website in the world, determined in 2018 that deepfakes constitute non-consensual porn, and this type of material has been banned from the site since then. And in Mexico City, for example, a bill is under discussion that would punish anyone who generates fake intimate content using artificial intelligence with up to 12 years in prison.