What Is a ‘Deepfake Video’? Know Everything About Face-Swap AI Technology. Will Fake News Become Uncontrollable?
Deepfake videos become even more of a problem in an environment where plenty of fake news is already in circulation.
Any technological invention seems convincing and attractive when it is introduced, but overuse of the same can create a crisis that is difficult to manage. Face swapping with the help of artificial intelligence has given rise to what is called the ‘deepfake.’ Deepfake videos use artificial intelligence to make anyone appear to do or say anything, relying on a seamless face-changing ability. The problem came to the fore last week when a video surfaced online featuring Barack Obama calling current US President Donald Trump ‘a total and complete dipshit.’ It was actually a public service awareness campaign by BuzzFeed warning about such deepfake videos. Deepfakes become an even bigger problem in an environment that already has plenty of fake news in circulation.
What is a deepfake video?
Face-swap technology lets you swap the faces of two people. It is a very popular feature on the app Snapchat and has only grown more popular over the years. An extension of it made its way into video. Deepfake videos are those that feature realistic face swaps: a computer program finds corresponding features between two faces and stitches one over the other. If the footage is of good quality, it is hard to tell the difference. The technique became so popular that people started making such videos to showcase their skills, and it turned into something of a trend on social media.
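For readers curious about the mechanics, here is a minimal sketch of the classical landmark-based approach to face swapping, assuming the dlib and OpenCV libraries and dlib’s standard 68-point landmark model are available. Real deepfake systems train deep neural networks on many images of a target face, but the basic idea of finding corresponding features and stitching one face over the other is the same; the image and model file names below are placeholders.

```python
# Sketch of a classical landmark-based face swap (not the neural-network
# approach real deepfakes use, but the same basic idea: find corresponding
# facial features and stitch one face over the other).
# Assumes dlib, OpenCV and numpy are installed, and the standard
# "shape_predictor_68_face_landmarks.dat" model file has been downloaded.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmarks(image):
    """Return the 68 facial landmark points of the first face found."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if not faces:
        raise ValueError("no face detected")
    shape = predictor(gray, faces[0])
    return np.array([(p.x, p.y) for p in shape.parts()], dtype=np.float32)

def swap_face(source, target):
    """Paste the face from `source` onto the face region of `target`."""
    src_pts, dst_pts = landmarks(source), landmarks(target)

    # Align the source face to the target face with a similarity transform
    matrix, _ = cv2.estimateAffinePartial2D(src_pts, dst_pts)
    warped = cv2.warpAffine(source, matrix, (target.shape[1], target.shape[0]))

    # Mask out the face region in the target, then blend the warped source in
    hull = cv2.convexHull(dst_pts.astype(np.int32))
    mask = np.zeros(target.shape[:2], dtype=np.uint8)
    cv2.fillConvexPoly(mask, hull, 255)
    x, y, w, h = cv2.boundingRect(hull)
    center = (x + w // 2, y + h // 2)
    return cv2.seamlessClone(warped, target, mask, center, cv2.NORMAL_CLONE)

# Placeholder file names for illustration only
result = swap_face(cv2.imread("person_a.jpg"), cv2.imread("person_b.jpg"))
cv2.imwrite("swapped.jpg", result)
```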
At a time when fake news is already spreading about real issues, deepfake videos are a major concern. With elections approaching, people are being warned to be extra careful about what is being propagated and whether it is genuine. Morphed photos are nothing new on social media, and neither is face swapping, but things take a serious turn when whole videos are being churned out. A lot of ‘deepfake’ porn videos are also being made, which is where the technology is most heavily misused: celebrities’ faces have been morphed into sex scenes and uploaded to porn sites.
Can fake videos be detected?
The answer, disappointingly, is no. Current forensic tools are not developed enough to recognise fake videos; the technology for making the swaps is far better equipped right now, enabling a smooth blend of two faces. Cybersecurity expert Akash Mahajan was quoted by the Times of India as saying, “With deep learning, when you have recurrent multiple steps, it is hard to trace back the trail the machine took to reach the output.” Current tools simply cannot spot the fakes easily.
Technology to stop deepfakes is being worked on
Researchers at Germany’s Technical University of Munich are working on a way to stop the fakery that is now so prevalent on the internet. They are developing artificial intelligence tools of the same kind to beat facial-swap abuse, and have built an algorithm called XceptionNet that quickly spots faked videos posted online. It could be used to identify misleading videos on the internet so that they can be removed when necessary; at the very least, it can reveal whether a video has been manipulated in some way.
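As a rough illustration of how such a detector could be used, the sketch below runs an Xception-style image classifier over sampled frames of a video and averages the per-frame scores, assuming TensorFlow/Keras, OpenCV and numpy are installed. This is not the Munich team’s published code; the weights file name is a hypothetical placeholder for a model fine-tuned to distinguish real from manipulated faces.

```python
# Sketch of frame-level fake detection in the spirit of XceptionNet:
# an Xception-style CNN scores each sampled frame as real or manipulated,
# and the per-frame scores are averaged for a video-level verdict.
# "fake_detector_weights.h5" is a hypothetical placeholder for fine-tuned weights.
import cv2
import numpy as np
import tensorflow as tf

# Xception backbone with a single sigmoid output: probability that a frame is fake
base = tf.keras.applications.Xception(
    weights=None, include_top=False, pooling="avg", input_shape=(299, 299, 3)
)
model = tf.keras.Sequential([base, tf.keras.layers.Dense(1, activation="sigmoid")])
model.load_weights("fake_detector_weights.h5")  # hypothetical fine-tuned weights

def fake_probability(video_path, frame_step=10):
    """Average the per-frame 'fake' score over sampled frames of a video."""
    capture = cv2.VideoCapture(video_path)
    scores, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % frame_step == 0:
            rgb = cv2.cvtColor(cv2.resize(frame, (299, 299)), cv2.COLOR_BGR2RGB)
            batch = tf.keras.applications.xception.preprocess_input(
                rgb.astype(np.float32)[np.newaxis]
            )
            scores.append(float(model.predict(batch, verbose=0)[0, 0]))
        index += 1
    capture.release()
    return float(np.mean(scores)) if scores else 0.0

print("probability of manipulation:", fake_probability("suspect_clip.mp4"))
```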
“Ideally, the goal would be to integrate our A.I. algorithms into a browser or social media plugin,” Matthias Niessner, a professor in the university’s Visual Computing Group, told Digital Trends. “Essentially, the algorithm [will run] in the background, and if it identifies an image or video as manipulated it would give the user a warning.”
This technology is still in the making and will take some time to be implemented. Until then, one needs to be careful about what is being passed around. The tendency of gullible viewers to believe anything and everything they see is exactly what such video makers count on. A lot of misinformation is already floating around on platforms like Facebook and WhatsApp; fake news is a prevalent problem, and if deepfake videos start circulating in the same vein, it could reach an uncontrollable stage.
(The above story first appeared on LatestLY on Apr 22, 2018 11:45 AM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).