TTP Report: Meta Creates Pages for ISIS, Undermining Anti-Terrorism Efforts

FOR IMMEDIATE RELEASE: February 16, 2023

Contact: Michael Clauw, 202.780.5750

WASHINGTON, D.C. – Today, Campaign for Accountability (CfA), a nonprofit watchdog group that runs the Tech Transparency Project (TTP), released a report revealing that Facebook has generated over 100 pages for groups like Islamic State and Al Qaeda, despite its ban on U.S.-designated terrorist groups. Some of the pages identified by TTP have lingered on the platform for years, accumulating “likes” and posts with terrorist logos and symbols. Because Facebook is creating, not simply hosting, these pages, the findings raise serious legal liability questions for the company—days before the Supreme Court is set to hear cases challenging Big Tech’s Section 230 immunity around terrorist content.

Read the report.

Campaign for Accountability Executive Director Michelle Kuppersmith said, “The findings in this report are stark: Facebook tools are helping ISIS and other terrorist organizations spread their identity and message. Even worse, Facebook has long been aware of this failure, and still refuses to do what’s needed to stop it.”

Since 2020, TTP has published several reports highlighting Facebook’s auto-generation of pages for white supremacist, militia, and terrorist groups. This is the result of a content-maximizing quirk in how Facebook runs its platform: If a user lists an employer, school, or location in their profile—or checks into a “place”—that does not have an existing page, Facebook automatically creates a page for that term, no matter how violent or hateful.

TTP’s new investigation goes deeper on Facebook’s creation of terrorist pages. Our research identified 108 auto-generated pages for groups or terms affiliated with the Islamic State and its regional offshoots. We also identified auto-generated pages for groups and terms affiliated with Al Qaeda, Al Shabaab, Boko Haram, and Iran’s Islamic Revolutionary Guard Corps—which, like ISIS, are U.S.-designated foreign terrorist organizations banned by Facebook.

The majority of the auto-generated ISIS pages identified by TTP (63) were created between 2014 and 2015, when the group was at the peak of its power. One of these pages, “ISIS (organización terrorista),” has nearly 7,000 likes. Pages like this have endured despite Facebook’s efforts to show it’s cracking down on terrorist content. In April 2018, Facebook boasted about its enforcement efforts against ISIS and Al Qaeda and said its detection technology focuses specifically on the two groups and their affiliates.

TTP’s findings come as the Supreme Court is set to hear two cases involving the hosting and amplification of terrorist content. In Gonzalez v. Google, a lawsuit brought by the family of an American woman killed in an Islamic State attack in Paris argues that YouTube’s amplification of ISIS content through recommendation algorithms is not covered by Section 230. In Twitter v. Taamneh, the Court will consider whether Twitter, Facebook, and Google—by knowingly allowing Islamic State to use their platforms—can be held liable for aiding and abetting international terrorism, regardless of Section 230.

Ms. Kuppersmith continued, “While the Supreme Court’s decision in these cases may have an impact on how Big Tech is held liable for harm, elected leaders must not rely solely on the courts to hold these companies accountable. Until Big Tech stops actively amplifying harmful content, lawmakers should look at all available tools to mitigate that harm.”

Campaign for Accountability is a nonpartisan, nonprofit watchdog organization that uses research, litigation, and aggressive communications to expose misconduct and malfeasance in public life and hold those who act at the expense of the public good accountable for their actions.