DeepNude, the Deepfake AI App That Converts Photos of Women Into Naked Pics, Shuts Down After Going Viral
San Francisco, June 29: The creators of DeepNude, a deepfake app that uses artificial intelligence to create fake nude pictures of women, shut down the app hours after it went viral, fearing "misuse".
The app had been on sale for a few months, but its popularity skyrocketed on June 26 after several media reports drew attention to it. People reported problems downloading the app, which was available for Windows and Linux, after the site experienced massive traffic.
The team did not expect such traffic and initially tried to keep up with the demand before deciding to shut the app down completely.
"Hi! DeepNude is offline. Why? Because we did not expect these visits and our servers need reinforcement. We are a small team. We need to fix some bugs and catch our breath. We are working to make DeepNude stable and working. We will be back online soon in a few days," the app said in a tweet on Thursday.
The app's Twitter bio describes it as "the superpower you always wanted". The software the team created took photoshopping to the next level: DeepNude made the ability to quickly and digitally manipulate photographs of women, in other words to "undress" them, easily available to everyone.
The team made the app available in two versions. The free version placed a large watermark across the images, but people could upgrade to the premium version for $50 to get images with a much smaller watermark.
"We created this project for user's [sic] entertainment a few months ago. We thought we were selling a few sales every month in a controlled manner. Honestly the app is not that great, it only works with particular photos," the creators said in a statement late on Thursday.
"We never though it would become viral and we would not be able to control the traffic. We greatly underestimated the request."
The app also generated a number of crash alerts on Thursday.
"Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don't want to make money this way," the creators said.
"Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones who sell it. Downloading the software from other sources or sharing it by any other means would be against the terms of our website."
"The world is not yet ready for DeepNude," they added.