October 19, 2021
Critical Race Theory, often referred to as CRT, has exploded in interest over the past several months. Though it was developed in the 1980s as an analytical framework for scholars to understand the historical roots and modern impacts of systemic racism, it has catapulted into public and political conversations, becoming one of the most hotly debated political issues of the day, dominating conservative news coverage and derailing school board meetings across the country.
Search Interest in “Critical Race Theory”
(Source: Google Trends)
Some have argued that the modern firestorm around CRT is inherently racist, meant to silence discussion of the ways in which Black Americans have been systematically disadvantaged in American society. Critics of CRT, meanwhile, have insisted that their opposition is itself an effort against racism, asserting that the framework unfairly “categoriz[es] individuals into groups of oppressors and victims.” Countless posts on social media platforms like YouTube have earned millions of views litigating these and other questions about CRT.
Our goal in researching this topic was not to engage in that litigation ourselves, but rather to understand how such discussions of Critical Race Theory, extreme or otherwise, might lead people down increasingly radical content pathways. To that end, we designed a method for mapping related videos to model possible user journeys during a viewing session. What we found was predictable in some ways but surprising in others: CRT videos largely did not create pathways toward explicitly racist content, but they did connect hypothetical users to a different brand of extremism, anti-LGBTQ hate.
Methods
We decided to focus on YouTube as it is a popular platform for explaining concepts, and its recommendation algorithms have been implicated in potentially radicalizing some number of users by funneling them into so-called “rabbit holes” of increasingly extreme content. In other words, if someone has just heard about CRT, it’s not hard to imagine they might go to YouTube to find out what it is and, while there, continue watching videos that appear in the sidebar.
With that in mind, we set out to understand where a user journey on YouTube might take a hypothetical user starting with a highly popular video about CRT. Unfortunately, YouTube’s recommendation algorithms are opaque and highly personalized; there is no clean metric for understanding recommended videos, and the company provides no perfect proxy. But the company’s product team has said that, apart from a user’s previously watched videos, recommendations are driven largely by videos being “topically related.” We therefore surmise that related videos are liable to be recommended, particularly when they rank highly in YouTube’s list of returned videos.
Armed with that relatedness proxy and access to YouTube’s content API, we built a system to take in one “starting point” video and return the five most related videos, then iteratively repeat that process for every new video in the dataset. The end result is a tree-like map of every highly related video branching up to four degrees out from the starting point—as many as 625 new videos at the outermost degree if none were to repeat.
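For readers who want a concrete picture of how such a map is built, the sketch below shows one way the expansion could work. It is a minimal illustration rather than our production code: the helper fetch_related_videos() is a hypothetical stand-in for whatever API call retrieves the five most related videos, and the breadth-first loop simply mirrors the “five related videos per node, four degrees out, skip repeats” logic described above.

```python
from collections import deque

RELATED_PER_VIDEO = 5   # top related videos kept per node
MAX_DEGREE = 4          # degrees of separation from the starting video


def fetch_related_videos(video_id, limit=RELATED_PER_VIDEO):
    """Hypothetical stand-in for the API call that returns the `limit`
    most related video IDs for `video_id`."""
    raise NotImplementedError("Replace with a real YouTube API call.")


def build_relatedness_map(start_video_id):
    """Breadth-first expansion: each video branches to its five most
    related videos, out to four degrees, skipping videos already mapped."""
    edges = []                      # (parent_id, child_id, degree) tuples
    seen = {start_video_id}
    queue = deque([(start_video_id, 0)])

    while queue:
        video_id, degree = queue.popleft()
        if degree >= MAX_DEGREE:
            continue                # don't expand beyond four degrees
        for related_id in fetch_related_videos(video_id):
            edges.append((video_id, related_id, degree + 1))
            if related_id not in seen:
                seen.add(related_id)
                queue.append((related_id, degree + 1))

    return edges
```

Because related videos frequently repeat across branches, a real map contains far fewer videos than the theoretical maximum.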
Results
We ran our video retrieval and mapping system on the five most-viewed videos returned by YouTube when searching for “critical race theory” in a private browser. In analyzing the outputs, we found that videos two to four degrees of relatedness away from a number of popular CRT explainers included homophobic content. For example, our map of one of the earliest popular explainers about critical race theory, posted by the Heritage Foundation in December of 2020, led to homophobic and anti-trans channels—including content denying the existence of transgender identity or accusing gay people of engaging in “destructive sin”—over 50 times. The map below started with a video with over 2 million views from PragerU, a conservative organization that specializes in “explainers” often targeted at young people.
Content analysis of the videos in the dataset showed that of the five videos related to the PragerU explainer, one connected to homophobic content within two degrees of separation, and another was itself anti-LGBTQ. In all, 15 videos in the 336-video dataset contained homophobic rhetoric, all at varying degrees of relatedness to a single critical race theory explainer. Given that the average adult YouTube user spends over 40 minutes per viewing session and the average video length is just shy of 12 minutes (roughly three to four videos per sitting, enough to traverse the degrees of separation we observed), it is not hard to imagine some number of viewers opening a CRT video and ending up in a homophobic content pipeline.
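To make the degree-of-separation analysis concrete, a sketch along these lines could tally how far each flagged video sits from the starting explainer. It assumes the edge list produced by the mapping step above and a set of video IDs flagged as homophobic; the flagged set is a hypothetical input here, since our labeling was done through manual content analysis.

```python
from collections import Counter


def degrees_of_flagged_videos(edges, flagged_ids):
    """Count how many flagged videos appear at each degree of separation.

    `edges` is the (parent_id, child_id, degree) list from the mapping step;
    `flagged_ids` is a set of video IDs marked as homophobic (hypothetical
    input -- in practice this came from manual review)."""
    # Keep the shallowest degree at which each video appears in the map.
    shallowest = {}
    for _, child_id, degree in edges:
        if child_id not in shallowest or degree < shallowest[child_id]:
            shallowest[child_id] = degree

    return Counter(
        degree for video_id, degree in shallowest.items() if video_id in flagged_ids
    )
```

A tally like this simply shows at which degrees the flagged videos cluster, mirroring the two-to-four-degree pattern described above.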
Conclusion
It is worth noting that many branches of our YouTube relevancy maps did not lead anywhere nefarious. One cluster in the PragerU map appears heavily focused on dating, while another group of videos largely contained stand-up bits from comedians Bill Burr and Norm Macdonald. But it’s those seeming non sequiturs that remind us just how unknown YouTube’s algorithms are to the general public.
“Rabbit holes” do not always follow predictable, step-wise paths. Rather, they can meander and take us places we might never have gone through other means. Whatever the precise mechanism, our analysis suggests that a meaningful share of the PragerU video’s likely hundreds of thousands of viewers could find their way to hateful rhetoric in only a few clicks. It is up to all platforms to make sure that they don’t harbor hateful content and, failing that, that they don’t direct their users right to its doorstep.