Alarming Rise in AI-Enhanced Undressing Apps
Researchers are reporting a disturbing surge in the popularity of apps and websites that use artificial intelligence to undress women in photos. The trend is raising alarms among privacy advocates and legal authorities.
September Stats Shock
In September alone, a staggering 24 million people reportedly visited undressing websites, according to the social network analysis company Graphika.
Nudify Services on the Rise
Graphika further notes that many of these services, often termed “nudify” apps, rely on popular social networks for marketing. Since the beginning of the year, promotion of undressing apps on platforms such as X and Reddit has skyrocketed by more than 2,400%, a troubling trend.
AI at the Core
These services use artificial intelligence to manipulate images, recreating them so that the subject appears nude. Notably, some of these apps work only on images of women.
Deepfake Pornography on the Rampage
The rise of such apps is seen as part of a troubling trend in the creation and distribution of non-consensual pornography, driven by advancements in artificial intelligence. This genre, known as deepfake pornography, poses serious legal and ethical challenges as it often involves using images taken from social media without the subject’s knowledge or consent.
One advertisement on X promoting an undressing app suggested that users could create nude images and send them to the person depicted, potentially inciting harassment. Another app has paid for sponsored content on Google’s YouTube and appears first in searches for terms like “nudify.”
Google responded by stating that it does not allow ads containing sexually explicit content and is actively removing violating material. X and Reddit, however, have not responded to requests for comment.
The Deepening Challenge
Privacy experts express growing worries that AI advancements are making deepfake software more accessible and effective, leading to an increase in non-consensual content creation. Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, notes that this trend is expanding beyond high-profile targets, with ordinary individuals becoming both victims and perpetrators.
Legal Gaps and Enforcement Challenges
The lack of federal laws specifically addressing the creation of deepfake pornography complicates efforts to combat this issue. Even when victims are aware of the images, many face challenges in getting law enforcement to investigate or securing the funds for legal action.
Platform Responses
In response, TikTok has blocked the keyword “undress,” warning users that it may be associated with content that violates its guidelines. Meta Platforms Inc. has also begun blocking keywords associated with searches for undressing apps.
The concerning rise of AI-driven undressing apps underscores the urgent need for comprehensive legal frameworks and proactive measures by online platforms to curb the proliferation of non-consensual and harmful content.