YouTube's Recommendation Algorithm Has a Dark Side
It was 3 A.M., and the smoke alarm wouldn't stop beeping. There was no fire, so I didn't need to panic. I just had to figure out a way to quiet the darn thing and tamp down my ire. I had taken out the battery and pushed and twisted all the buttons to no avail.
Luckily for me, the possible solutions were all laid out in the YouTube tutorial I found. The video helpfully walked me through my options, demonstrating each step. And the fact that it had hundreds of thousands of views reassured me that this might work.
YouTube has become the place to learn how to do anything, from assembling an Ikea cabinet to making a Bluetooth connection with your earbuds. It is a font of tutorials, some very good, some meandering, some made by individuals who have become professionals at it and rake in serious sums through advertising. But many are uploaded by people who have solved something that frustrated them and want to share the answer with the world.
The native language of the digital world is probably video, not text—a trend missed by the literate classes that dominated the public dialogue in the predigital era. I've noticed that many young people start their Web searches on YouTube. Besides, Google, which owns YouTube, highlights videos in its search results.
“How do I” assemble that table, improve my stroke, decide if I'm a feminist, choose vaccinations, highlight my cheeks, tie my shoelaces, research whether climate change is real...? Someone on YouTube has an answer. But the site has also been targeted by extremists, conspiracy theorists and reactionaries who understand its role as a gateway to information, especially for younger generations.
And therein lies the dark side: YouTube makes money by keeping users on the site and showing them targeted ads. To keep them watching, it utilizes a recommendation system powered by top-of-the-line artificial intelligence (it's Google, after all). Indeed, after Google Brain, the company's AI division, took over YouTube's recommendations in 2015, there were laudatory articles on how it had significantly increased “engagement”: Silicon Valley–speak for enticing you to stay on the site longer.
These “recommended” videos play one after the other. Maybe you finished a tutorial on how to sharpen knives, but the next one may well be about why feminists are ruining manhood, how vaccinations are poisonous or why climate change is a hoax—or a nifty explainer “proving” the Titanic never hit an iceberg.
YouTube's algorithms will push whatever they deem engaging, and it appears they have figured out that wild claims, as well as hate speech and outrage peddling, can be particularly so.
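To make that incentive concrete, here is a minimal sketch in Python, with invented names and numbers (YouTube's actual recommender is a large, proprietary neural-network system, not this), of what ranking purely by predicted engagement looks like. The point is what the objective leaves out: nothing in the score asks whether a video is accurate or safe.

```python
# A toy illustration of engagement-driven ranking, NOT YouTube's real
# system: the actual recommender is a proprietary deep-learning model.
# Here each candidate video carries a hypothetical "predicted watch
# minutes" score, and the top scorers are what plays next.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical model output

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    """Return the k candidates predicted to be watched longest.

    Note what is absent: no term for truthfulness, safety or
    age-appropriateness. Only predicted engagement matters.
    """
    return sorted(candidates,
                  key=lambda v: v.predicted_watch_minutes,
                  reverse=True)[:k]

if __name__ == "__main__":
    candidates = [
        Video("How to silence a smoke alarm", 4.0),
        Video("Knife-sharpening basics", 6.5),
        Video("The Titanic never hit an iceberg (PROOF)", 12.8),
    ]
    for v in recommend(candidates):
        print(v.title)
```

Run it, and the conspiratorial "proof" video tops the list simply because the model expects people to watch it longest; accuracy never enters the calculation.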
Receiving recommendations for noxious material has become such a common experience that there has been some loud pushback. Google did ban a few of the indefensibly offensive high-profile “creators” (though not before helping them spread their views to millions of people), and recently the company announced an initiative to reduce recommending “borderline content and content that could misinform users in harmful ways.” According to Google, this content might be things like “a phony miracle cure for a serious illness” or claims that “the earth is flat.” The change, the company says, will affect fewer than 1 percent of all videos.
While it's good to see some response from Google, the problem is deep and structural. The business model incentivizes whatever gets watched most. YouTube's reach is vast. Google's cheap and nifty Chromebooks make up more than half the computers in the K–12 market in the U.S., and they usually come preloaded with YouTube. Many parents and educators probably don't realize how much their children and students use it.
We can't just scream at kids to get off our lawn, and we can't ignore the fact that children use YouTube for a reason: there's stuff there they want to watch, just as I really needed to figure out how to silence that beeping catastrophe at 3 A.M. We need to adjust to this reality with regulation, self-regulation and education. Transparency would help, too: people outside the company can't see how the recommendations work or how they're tuned to keep eyes hooked to the screen. And we could ask that Chromebooks sold to schools come with no YouTube, or at least no recommendations.
This is just the tip of the iceberg of the dangerous nexus of profit, global scale and AI. It's a new era, with challenges as real as the iceberg the Titanic did hit, no matter what the video claims.