San Francisco, January 18: Popular search engines like Google and Microsoft's Bing are reportedly surfacing non-consensual deepfake porn at the top of search results, alongside tools that advertise the ability to create such content. Non-consensual deepfake pornography digitally manipulates images of a person to create the false impression that they are engaged in a sexual act.
An analysis by NBC News found that when searching for various women's names together with the word "deepfakes", as well as more general terms like "deepfake porn" or "fake nudes", Google and other major search engines returned deepfake pornographic images using the likenesses of female celebrities among the first results.
The researchers searched Google and Bing for 36 well-known female celebrities, combining each name with the term "deepfakes". For 34 of those queries on Google and 35 on Bing, the top results contained links to deepfake videos and non-consensual deepfake photos. Over half of the top results were links to a popular deepfake website or one of its competitors, the report noted.
Searching for "fake nudes" showed links to numerous applications and programmes to create and observe nonconsensual deepfake pornography. These links were among the first six results on Google. In a search for "fake nudes" on Bing, a number of nonconsensual deepfake websites and tools appeared. The report found that users could view and create this type of pornographic content before any news reports explained the harmful nature of non-consensual deepfakes.
Victims of deepfake porn can request removal of the content from Google and Bing by filling out a form. However, the report noted that Google and Microsoft's Bing do not appear to be proactively monitoring their results for this kind of misuse.
"We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search," a Google spokesperson was quoted as saying. "Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content that they aren’t looking for," it added. In recent times, deepfake videos of Bollywood stars like Rashmika Mandanna, Alia Bhatt, Priyanka Chopra, Katrina Kaif, etc went viral.
(The above story first appeared on LatestLY on Jan 19, 2024 09:32 AM IST.)