Europol Warns of Rise in AI Child Abuse Imagery

AI makes it more difficult to identify real-life victims and perpetrators, the agency said.


European policing agency Europol warned on Monday that it had seen a sharp increase in the number of artificial intelligence-created child sexual abuse images circulating online. AI makes it more difficult to identify real-life victims and perpetrators, the agency said. It also warned of the use of so-called "deepfake" technology to mimic real people.


"Cases of AI-assisted and AI-generated child sexual abuse material have been reported," the Hague-based agency said in a new report.


"The use of AI which allows child sex offenders to generate or alter child sex abuse material is set to further proliferate in the near future," the organization said in a 37-page report on the digital threats facing Europe.

AI-generated images complicate victim identification

The report added that the increase in AI-generated images makes it more difficult to identify real-life victims.

In May, a study by the University of Edinburgh in the UK found that some 300 million children a year were victims of some form of online sexual exploitation.

The study found that AI had added a new dimension to online abuse in the form of deepfakes of real people.

"Even in the cases when the content is fully artificial and there is no real victim depicted, AI-generated child sex abuse material still contributes to the objectification and sexualization of children," Europol said.

es/lo (AFP, dpa)

