Apple, Google host dozens of AI ‘nudify’ apps like Grok, report finds
Dozens of AI apps that generate non-consensual nude images have been downloaded more than 700 million times, raising concerns about policy enforcement and privacy risks, the Tech Transparency Project found.
- On Tuesday, the Tech Transparency Project reported finding 55 nudify apps on Google Play and 47 in the Apple App Store, all still available as of Jan. 21.
- Improved AI models have lowered the barrier to creating sexualized deepfakes; the Tech Transparency Project found the apps it identified have been downloaded more than 700 million times and generated an estimated $117 million in revenue, benefiting Apple and Google.
- To test the apps, the Tech Transparency Project searched for terms such as 'nudify' and 'undress' and uploaded AI-generated test images; many apps produced sexualized or nude outputs, and face-swap apps superimposed the test faces onto sexualized images.
- Apple told CNBC it removed 28 of the apps TTP identified after the group's outreach, though TTP reported only 24 removals; a Google spokesperson said the company suspended several apps while it investigates the violations.
- TTP director Katie Paul warned that "The fact that they are not adhering to their own policies..." and said concerns also include Chinese-developed apps rated as suitable for children and data-privacy risks.
29 Articles
It is estimated that together these apps generated almost $117 million, underscoring how lucrative the business is.
Apple and Google App Stores Offer Dozens of AI-Powered 'Nudify' Apps in Wake of Elon Musk's Grok Scandal
Tech giants Apple and Google are offering numerous AI-powered applications in their app stores that can generate non-consensual nude images from ordinary photographs, according to a new report from industry watchdog Tech Transparency Project. The report comes after a global scandal erupted when Elon Musk's Grok AI generated sexual deepfakes of women and children and posted them on X.
Coverage Details
Bias Distribution
- 40% of the sources lean Left, 40% of the sources are Center