Apps That Use AI To Undress Women In Photos Proliferating: Report

Apps and websites that use artificial intelligence to undress women in photos are seeing a sharp surge in visitors, a report by social network analysis company Graphika has revealed.

In September alone, 24 million people visited undressing websites, according to the report. These undressing, or “nudify,” services rely on heavy advertising to reach users online, with ads on social media, including X and Reddit, increasing by more than 2,400%. The services use AI to generate nude images of the people in a photo, and many of them work only on images of women.

Experts have warned that these apps can upend ordinary people's lives by fuelling the spread of non-consensual, deepfake pornography: nude images can be generated from social media photos and look virtually identical to real ones.

“You can create something that actually looks realistic,” Bloomberg quoted Santiago Lakatos, an analyst at Graphika, as saying.

According to the report, an advertisement for one undressing service posted on a social media platform promoted harmful uses, such as intimidating people with fake photos. One of the apps has even run sponsored content on YouTube.

However, a Google spokesperson said the company removed the content because it violated its policies. The company doesn't allow ads “that contain sexually explicit content.”

“We've reviewed the ads in question and are removing those that violate our policies,” the company said, according to Bloomberg.

Reddit has also stepped up action against such ads, banning several domains. With subscriptions priced at up to $9.99, these nudify apps attract a large number of users; one of them claims on its website to have a growing number of premium users.

“They are doing a lot of business,” Lakatos said. Describing one of the undressing apps, he said, “If you take them at their word, their website advertises that it has more than a thousand users per day.”

“We are seeing more and more of this being done by ordinary people with ordinary targets,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. “You see it among high school children and people who are in college.”

TikTok has blocked the popular term “undress” associated with the services, warning anyone searching for the word that it “may be associated with behavior or content that violates our guidelines,” according to the app. Meta Platforms Inc. has taken similar action against undressing apps.

However, countries are yet to enact laws that ban these websites outright.