TTP Report: Apple and Google Are Steering Users to Nudify Apps

FOR IMMEDIATE RELEASE: April 15, 2026

Contact: Michael Clauw, mclauw@campaignforaccountability.org, 202.780.5750

WASHINGTON, D.C. – Apple and Google ban “nudify” apps, yet a new Tech Transparency Project (TTP) investigation finds that their app stores are not only still approving some of these apps but, at times, directing users straight to them through their search and advertising systems.

TTP first revealed in January that the Apple and Google app stores each hosted dozens of nudify and undressing apps that can digitally strip the clothes off women. This new investigation found that the stores’ ads and search functions were actively promoting such apps, with the stores’ autocomplete suggestions in many cases recommending entirely new search queries that led to even more nudify apps.

Read TTP’s Report.

“These findings show that Apple and Google are not passive players in the proliferation of nonconsensual sexualized deepfake imagery. Their app stores are actively promoting and guiding users toward some of the most blatant apps,” said Michelle Kuppersmith, Executive Director of Campaign for Accountability, the nonprofit watchdog group that runs TTP. “As we hear more and more about nonconsensual nude images targeting women and girls, Apple and Google need to reckon with their role in this ecosystem.”

TTP searches for terms like “nudify,” “undress,” and “deepfake” surfaced 46 unique apps in the Apple App Store and 49 in the Google Play Store. Roughly 40% of the apps could render women nude or scantily clad, according to TTP’s tests. The apps can take images of real people and use AI to make them look naked, put them into pornographic videos, or turn them into sexually explicit chatbots.

In addition to finding ads for nudify apps in both stores, TTP noted that the platforms’ autocomplete functions—which anticipate and suggest queries before users finish typing—guided searchers to additional apps. For example, after TTP typed the letters “AI NS”—a partial spelling of “AI NSFW”—the Apple App Store recommended the search term “image to video ai nsfw.” Clicking on that term returned several nudify apps in the top ten results. Google autocomplete suggestions led to fewer—but still some—nudify apps.

The apps identified by TTP have been downloaded 483 million times and made more than $122 million in lifetime revenue, according to data compiled by a mobile analytics firm. The app stores profit from this activity as well, in two ways: when they run ads for the apps and when they take a cut of paid subscriptions or in-app payments. This revenue stream may be why Apple and Google have been less than vigilant about nudify apps that violate their policies.

Campaign for Accountability is a nonpartisan, nonprofit watchdog organization that uses research, litigation, and aggressive communications to expose misconduct and malfeasance in public life and hold those who act at the expense of the public good accountable for their actions.