TTP Investigation: YouTube’s Filter Bubble Problem Is Worse for Fox News Viewers

FOR IMMEDIATE RELEASE: October 24, 2021

Contact: Michael Clauw, mclauw@campaignforaccountability.org, 202.780.5750

WASHINGTON, D.C. – Today, Campaign for Accountability (CfA), a nonprofit watchdog group that runs the Tech Transparency Project (TTP), released a report showing that YouTube’s recommendation algorithm pushes users into ideological filter bubbles that are stronger for viewers of right-wing content. Researchers found that accounts expressing an initial interest in MSNBC or Fox News were served a constant stream of content reinforcing their respective preferences, but while the MSNBC viewer was eventually served an ideologically mixed selection of news, the Fox News viewer never escaped the algorithm’s feedback loop. Additionally, for accounts that showed an interest in militant movements, YouTube suggested videos with titles like “5 Steps to Organizing a Successful Militia” and further content about weapons, ammunition, and tactical gear.

Read the report.

Campaign for Accountability Executive Director Michelle Kuppersmith said, “To feed its users more ads, Google’s algorithm appears to push YouTube content that each individual finds most personally gratifying – and there are few things on the internet more gratifying than having one’s own ideas reinforced. The fact that this polarization occurs without the user’s knowledge or input makes this feedback loop particularly insidious.”

To conduct this experiment, TTP researchers registered new Google accounts and used a clean browser to watch a collection of YouTube’s top news videos from either Fox News or MSNBC. They then watched the first 100 videos that YouTube recommended after visiting each of three jumping-off points on the platform: the first result in the trending news feed, the first search result for the keywords “Election Results,” and the second video on YouTube’s home feed.
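
The report does not publish the tooling behind this procedure, but the walk itself is simple to mechanize. The Python sketch below is a minimal illustration under stated assumptions, not TTP’s actual methodology: fetch_recommendations is a hypothetical placeholder that returns synthetic video IDs (a real study would capture YouTube’s actual suggestions while logged in to the conditioned account), and the walk is modeled as always clicking the top recommendation at each step.

```python
# Minimal sketch of the recommendation walk described above.
# fetch_recommendations is a hypothetical stand-in that returns
# synthetic IDs so the control flow can run end to end.

def fetch_recommendations(video_id: str, n: int = 20) -> list[str]:
    """Hypothetical placeholder: return the video IDs YouTube would
    recommend alongside `video_id`. A real study would collect these
    from the watch page of the conditioned account."""
    return [f"{video_id}-rec{i}" for i in range(n)]

def walk_recommendations(seed_video: str, depth: int = 100) -> list[str]:
    """Follow the top recommendation from each video `depth` times,
    mimicking a user who always clicks the first suggested video."""
    watched = []
    current = seed_video
    for _ in range(depth):
        recs = fetch_recommendations(current)
        if not recs:
            break
        current = recs[0]  # always take the top suggestion
        watched.append(current)
    return watched

# The three jumping-off points used in the study: the top video in the
# trending news feed, the first search hit for "Election Results", and
# the second video on the home feed (IDs here are placeholders).
entry_points = {
    "trending_news": "trending-top",
    "search_election_results": "search-first",
    "home_feed": "home-second",
}

for label, seed in entry_points.items():
    trail = walk_recommendations(seed)
    print(f"{label}: followed {len(trail)} recommendations")
```

Running each walk from a conditioned account and an unconditioned control account, then comparing the sources of the recommended videos, is what lets a study of this kind attribute the divergence to the algorithm’s personalization rather than to the entry point itself.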

Over time, the MSNBC viewer broke out of their filter bubble as the recommendation engine began to show them stories from a variety of news sources, including Fox News. The Fox News viewer, on the other hand, received a steady diet of conservative videos—many from Fox’s most incendiary commentators—through the end of the study.

In an experiment where a user started by watching videos about American militia movements, YouTube recommended additional extremist content, including videos purporting to teach tactical skills such as operational security, using military equipment, building homemade weapons, and using household tools for self-defense. YouTube’s algorithm also offered up general instructional videos like “5 Steps to Organizing a Successful Militia” and “So You Want to Start a Militia?” YouTube repeated these recommendations on multiple occasions, even though its community guidelines clearly state that it “doesn’t allow content that encourages dangerous or illegal activities that risk serious physical harm or death.”

The prevalence of such militia recommendations is not inevitable, as YouTube’s parent company Google has already demonstrated an ability to tweak its algorithm to address dangerous content. In 2016, amid concerns about ISIS recruitment online, Google said it would recommend counter-extremism content when it detected that a user had an interest in joining a terrorist organization. This suggests it is within the company’s power to deal more effectively with domestic militia content.

Ms. Kuppersmith continued, “Although YouTube holds the promise of providing its users a nearly limitless supply of perspectives and viewpoints, its prioritization of clicks above all else siloes its users into bubbles of their own choosing – even when those bubbles are rooted in dangerous extremist ideology.”

Campaign for Accountability is a nonpartisan, nonprofit watchdog organization that uses research, litigation, and aggressive communications to expose misconduct and malfeasance in public life and hold those who act at the expense of the public good accountable for their actions.