
Deepfake Pornography: Apps That Use AI To Undress Women In Photos Gaining Popularity, Says Report

New Delhi: Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, part of a worrying trend in which advances in artificial intelligence enable the creation and distribution of non-consensual pornography, a type of fabricated media known as deepfake pornography. The proliferation of these services raises serious legal and ethical concerns, as the images are often taken from social media and distributed without the consent, control or knowledge of the subjects.

In September alone, 24 million people visited undressing websites, researchers at the social network analysis company Graphika found, Bloomberg reported.

Many of these undressing, or “nudify,” services use popular social networks for marketing, according to Graphika. For instance, the number of links advertising undressing apps on social media, including on X and Reddit, has increased more than 2,400% since the beginning of this year, the researchers said. The services use AI to recreate an image so that the person appears nude. Many of the services work only on women.

The rise in popularity corresponds to the release of several open-source diffusion models, artificial intelligence systems that can create images far superior to those made just a few years ago, Graphika said. Because the models are open source, they are available to app developers for free. “You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika, noting that earlier deepfakes were often blurry, Bloomberg reported.

In addition to the rise in traffic, the services, some of which charge $9.99 a month, claim on their websites that they are attracting a lot of customers. “They are doing a lot of business,” Lakatos was quoted as saying. Describing one of the undressing apps, he said, “If you take them at their word, their website advertises that it has more than a thousand users per day.”

Non-consensual pornography of public figures has long been a concern for privacy experts.

“We are seeing more and more of this being done by ordinary people with ordinary targets,” Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, was quoted as saying in the report. “You see it among high school children and people who are in college.”

Many victims never find out about the images, but even those who do may struggle to get law enforcement to investigate or to find funds to pursue legal action, Galperin said.

 

OB Bureau
