
Blaming YouTube’s Algorithm Is A Cop-Out


It’s here, another profile of another young man corrupted by Big Tech’s unruly algorithms, and another opportunity for us to unload our blame on Silicon Valley.

This weekend, the New York Times chronicled 26-year-old Caleb Cain’s journey into and out of the alt-right, facilitated by YouTube’s recommendation algorithm, which takes users from zero to 100 on the ideological scale at jarring speed. It’s a fascinating trip, and well worth the profile for its representative value—and I say that as someone Caleb’s age, having been tempted down my fair share of YouTube rabbit holes.

It’s odd, though, that the chattering class’s impulse is to blame an algorithm rather than the human behavior that shapes it. I agree that YouTube’s recommendation algorithm steers vulnerable young minds in unfortunate directions, playing a bit part in radicalizing some unknown portion of them, but the real question is why our minds are vulnerable in the first place.

It’s a curiosity born of our weak moral bearings, exploited successfully by YouTube’s system to keep users on its website as long as possible. The algorithm gets its power from our moral confusion.

Sometimes this confusion finds a positive antidote on YouTube, in the form of popular personalities such as Ben Shapiro or Christina Hoff Sommers, who offer serious answers you won’t find on classroom reading lists. (Jordan Peterson’s popularity is explained well by this dynamic.) Sometimes, as in Caleb’s case, the website’s expansive library of darker content appeals more.

Caleb, who “successfully climbed out of a right-wing YouTube rabbit hole, only to jump into a left-wing YouTube rabbit hole,” is an instructive example. After documenting his YouTube-aided transition from left to right and back again, the Times observed, “It is still difficult, at times, to tell where the YouTube algorithm stops and his personality begins.”

Perhaps this shouldn’t be a surprise. Our political culture is now built largely on shapeshifting internet platforms, which have made flipping partisan allegiances as easy as changing hairstyles. It’s possible that vulnerable young men like Mr. Cain will drift away from radical groups as they grow up and find stability elsewhere. It’s also possible that this kind of whiplash polarization is here to stay as political factions gain and lose traction online.

Without a moral anchor, we’re swayed easily by the breeze—and YouTube is a windy place. For those who don’t wish to idle, it’s the go-to resource.

I think a lot of people our age don’t know what to believe, but feel a strong urge to believe something. It’s hard to know how many Caleb Cains are out there. It’s also hard to know how many people watch extremist videos out of sheer curiosity and never veer down that path themselves. The latter almost certainly outnumber the former.

Blaming the algorithm is easy. Blaming the person who clicks on the videos, or the culture that bred their curiosity—their craving for answers—is much more difficult. But it will also be much more effective.