[ClothOff](https://999nudes.com/) remains one of the most prominent and controversial AI-powered "nudify" platforms. It employs deep learning models (including GANs and diffusion algorithms) to digitally remove clothing from uploaded photos and videos, generating hyper-realistic nude renderings. Available through dedicated apps for Android, iOS, and macOS, it offers a broad feature set: DeepNude-style image generation, customizable undress videos with lifelike motion and expressive detail, face swaps (standard, video-based, and specialized porn variants), multi-image uploads, granular body customization (breast and butt size adjustments, among others), curated sex-pose collections, priority queue skipping, and a developer API for automated adult content creation.

Marketed with slogans such as "Your TOP-1 Pocket Porn Studio" and "The New Porn Generator," ClothOff offers free trials of basic functionality, while premium enhancements (higher-resolution outputs, faster processing, and exclusive tools) are unlocked via one-time purchases of VIP Coins rather than a recurring subscription. Its marketing also touts purported health benefits of sexual activity and masturbation. The platform publishes privacy policies claiming no long-term data storage, immediate deletion of uploads, no sharing without consent, and automated systems that block the processing of images depicting minors (with instant account bans for attempts). It explicitly prohibits non-consensual use, illegal content, and material involving anyone under 18, and claims a partnership with Asulabel to direct donations toward victims of AI-related abuse.

Despite these stated measures, ClothOff has faced widespread ethical condemnation and legal challenges for enabling non-consensual deepfake pornography and child sexual abuse material (CSAM). A federal lawsuit filed in October 2025 in New Jersey (Jane Doe v. AI/Robotics Venture Strategy 3 Ltd., the British Virgin Islands-registered operator reportedly linked to Belarus) alleges that the platform facilitated the creation and distribution of hyper-realistic fake nudes of a minor, sourced from her social media images. The suit invokes the TAKE IT DOWN Act to seek mandatory image removals, data destruction, a ban on further AI training with the material, damages of up to $150,000 per image, and a potential platform shutdown. Supported by Yale Law clinics, the case highlights harms including bullying, harassment, and profound emotional distress.

Investigative reports from Der Spiegel, Bellingcat, Ars Technica, The Guardian, and others have traced the operation to the former Soviet Union (particularly Belarus), revealed its acquisition of multiple competing nudify services, and documented ClothOff's role in numerous real-world abuse cases, most notably school incidents in which ordinary photos of minors were turned into explicit fakes and shared among peers. Despite a block imposed by Italy's Data Protection Authority for unlawful data processing, advertising bans from Meta, UK restrictions, and the removal of its official Telegram bot, the platform continues to attract millions of monthly users and actively resists regulatory pressure. ClothOff consistently denies responsibility for user misconduct and remains fully operational as global demands escalate for stricter controls on non-consensual AI-generated intimate content.