
Apple and Google say they police sexual content, yet their own app-store search tools are still steering users—sometimes kids—straight to AI “nudify” apps.
Quick Take
- Investigators found Apple App Store and Google Play searches for “nudify,” “undress,” and “deepnude” surfaced apps that can generate non-consensual nude images of real people.
- The Tech Transparency Project reported these apps were promoted not just by search results, but also by autocomplete suggestions and paid ads.
- Research cited 483 million downloads and more than $122 million in revenue tied to the nudify-app ecosystem.
- Dozens of apps were rated as suitable for minors, amplifying concerns amid rising school deepfake scandals.
How App Stores Became a Roadmap to “Nudify” Tools
The Tech Transparency Project (TTP) reported that users searching Apple’s App Store and Google’s Play Store for terms commonly associated with AI “undressing” were quickly shown apps that can digitally remove clothing from photos of real people. TTP emphasized that this is not merely a case of a few bad apps slipping through review; the stores’ own discovery features—search rankings, autocomplete prompts, and ad placements—can function like a directory for abuse.
TTP said testing found a sizable share of top search results were capable of generating explicit outputs, including non-consensual nude images and other sexualized content. The report also described how some apps present themselves as generic photo editors or face-swapping tools, which can make enforcement harder if reviewers rely on surface-level descriptions rather than real-world use patterns. The practical effect is simple: the major mobile gatekeepers can unintentionally—or negligently—make harmful tools easy to find.
Why Parents and Schools Are Sounding the Alarm
The most politically unifying part of this story is also the most disturbing: minors are in the blast radius. TTP’s research highlighted dozens of apps that were rated as appropriate for younger users, even though the underlying capability can be used to generate explicit images of classmates, teachers, or other targets. That concern tracks with broader reporting about deepfake incidents in schools, where students have been victimized by synthetic nude images circulated for humiliation and harassment.
Families don’t have to be plugged into tech policy to understand what’s at stake. A child doesn’t need sophisticated skills to misuse these tools if the app stores deliver them through basic keyword searches. That reality undercuts the comforting assumption that app-store curation is a dependable “parental filter” by default. When Big Tech’s storefronts act as the distribution channel, parents are left trying to outmaneuver systems they don’t control.
Follow the Incentives: Revenue, Commissions, and Ads
TTP cited mobile analytics estimating the nudify-app ecosystem has accumulated roughly 483 million downloads and generated more than $122 million in revenue. That kind of money matters because it intersects with the business model of the platforms themselves. Apple and Google typically take a commission on in-app purchases and subscriptions, and both stores run advertising ecosystems that can place paid promotions directly in front of users searching for these terms. Even if a platform claims it prohibits the content, the incentives can collide with enforcement.
Apple, after press inquiries, removed a set of apps identified in reporting, while also indicating that some apps were not considered violations under its interpretation of the rules. Public reporting did not describe an equivalent, clearly documented removal wave by Google tied to the same inquiry window. The difference in responses does not, by itself, prove one platform is “worse,” but it does reinforce the central point: enforcement appears inconsistent, reactive, and dependent on outside investigators to force attention.
The Bigger Political Problem: Private Gatekeepers, Public-Scale Harm
Conservatives have long argued that concentrated corporate power can become a shadow regulator—setting rules, shaping culture, and controlling access—without the transparency Americans expect from public institutions. This episode adds a different twist: the same gatekeepers that aggressively police certain categories of speech and content can struggle to stop plainly predatory tools when those tools are packaged as profitable apps. Critics say this looks less like effective stewardship and more like a system that responds only when embarrassment arrives.
Policy responses are still developing, and available reporting leaves gaps about what specific technical changes Apple and Google will make to search, autocomplete, and ad review. What is clear from the investigation is that “ban it in the guidelines” is not the same as blocking it in practice. If lawmakers focus on measurable compliance—search suppression, ad prohibitions, stronger age-gating, and faster removals—Americans could see whether Big Tech is serious or simply managing public relations.
Sources:
Apple and Google Are Steering Users to Nudify Apps
Apple, Google Offer ‘Nudify Apps’ Despite Policies Against Them
Apple and Google host and promote “undressing” apps, report claims