Apps and Web sites that use artificial intelligence (AI) to undress women in photos are soaring in popularity, researchers said.
In September alone, 24 million people visited undressing Web sites, the social network analysis company Graphika said.
Many of these undressing, or “nudify,” services use popular social networks for marketing, Graphika said.
For instance, since the beginning of this year, the number of links advertising undressing apps increased more than 2,400 percent on social media, including on X and Reddit, the researchers said.
The services use AI to alter an image so that the person appears nude. Many of the services work only on women.
These apps are part of a worrying trend of nonconsensual pornography being developed and distributed because of advances in AI — a type of fabricated media known as deepfake pornography. Its proliferation runs into serious legal and ethical hurdles, as the images are often taken from social media and distributed without the consent, control or knowledge of the subject.
One image posted to X advertising an undressing app used language that suggests customers could create nude images and then send them to the person whose image was digitally undressed, inciting harassment.
One of the apps, meanwhile, has paid for sponsored content on Google’s YouTube, and appears first when searching with the word “nudify.”
A Google spokesperson said the company does not allow ads “that contain sexually explicit content. We’ve reviewed the ads in question and are removing those that violate our policies.”
Neither X nor Reddit responded to requests for comment.
Nonconsensual pornography of public figures has long been a scourge of the Internet, but privacy experts are growing concerned that advances in AI technology have made deepfake software easier and more effective.
“We are seeing more and more of this being done by ordinary people with ordinary targets,” Electronic Frontier Foundation cybersecurity director Eva Galperin said. “You see it among high school children and people who are in college.”
Many victims never find out about the images, but even those who do might struggle to get law enforcement to investigate or to find funds to pursue legal action, Galperin said.
There is currently no federal law in the US banning the creation of deepfake pornography, though the government does outlaw the generation of such images of minors. In November, a North Carolina child psychiatrist was sentenced to 40 years in prison for using undressing apps on photos of his patients, the first prosecution of its kind under the law banning deepfake generation of child sexual abuse material.
TikTok has blocked the keyword “undress,” a popular search term associated with the services, warning anyone searching for the word that it “may be associated with behavior or content that violates our guidelines,” according to the app.
A TikTok representative declined to elaborate.
In response to questions, Meta Platforms Inc also began blocking keywords associated with searching for undressing apps. A spokesperson declined to comment further.