19 October 2022

Why Am I Not Surprised

A study shows that, regardless of its users' choices and preferences, YouTube's recommendation algorithm steers them toward right-wing content.

I don't think this is the result of some nefarious political agenda, but rather that cruelty, hate, and selfishness tend to drive engagement, so the recommendations trend right:

YouTube's recommendation algorithm not only gently traps viewers in mild echo chambers, it is more likely to suggest conservative-leaning videos regardless of your political alignment.

That's according to a study out of New York University's Center for Social Media and Politics (CSMP) that was highlighted this month by Brookings.

Social networks live and die by their recommendation algorithms: they're designed to keep visitors on the sites and apps by feeding them content that keeps them hooked, and if that drives up engagement – sharing links with others, commenting, subscribing, upvoting, etc – all the better. Controversial or mind-blowing material is perfect for this, as it's likely to spark more shares and comments, and thus keep more people addicted.

To clarify, since the good folks at The Register are too delicate to say it: the platforms profit from creating unhappiness and mental illness.

Karl Marx would have predicted this, if he had ever dropped acid.
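
To make the incentive concrete, here is a minimal sketch of a pure engagement-maximizing ranker, in Python. Everything in it (the signal names, the weights, the numbers) is invented for illustration; it is not YouTube's system, only the shape of the objective the quoted paragraph describes.

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        title: str
        p_share: float              # predicted chance the viewer shares it
        p_comment: float            # predicted chance the viewer comments
        p_subscribe: float          # predicted chance the viewer subscribes
        expected_watch_minutes: float

    def engagement_score(c: Candidate) -> float:
        # A pure engagement objective: every term rewards keeping people
        # hooked, and nothing penalizes content that hooks them by making
        # them angry or miserable. All weights are made up for this sketch.
        return (3.0 * c.p_share
                + 2.0 * c.p_comment
                + 5.0 * c.p_subscribe
                + 0.1 * c.expected_watch_minutes)

    candidates = [
        Candidate("calm explainer", 0.02, 0.01, 0.005, 12.0),
        Candidate("outrage bait", 0.15, 0.20, 0.010, 9.0),
    ]

    # The outrage bait wins (1.80 vs 1.31): controversy drives shares and
    # comments, so a ranker trained only on those signals surfaces it first.
    for c in sorted(candidates, key=engagement_score, reverse=True):
        print(f"{engagement_score(c):.2f}  {c.title}")

Note that nothing in that objective cares what the content does to the viewer; it only cares that the viewer reacts.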

"We found that YouTube's recommendation algorithm does not lead the vast majority of users down extremist rabbit holes, although it does push users into increasingly narrow ideological ranges of content in what we might call evidence of a (very) mild ideological echo chamber," the academics disclosed in a report for the Brookings Institution.

"We also find that, on average, the YouTube recommendation algorithm pulls users slightly to the right of the political spectrum, which we believe is a novel finding."

The abstract of their paper makes clear that this bump to the right happens "regardless of the ideology" of YouTube viewers.

………

What is more interesting, perhaps, is that YouTube seems to overall lean toward recommending moderately conservative content to users regardless of their political orientation, at least according to the NYU center.
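
That "pull to the right regardless of ideology" is easy to picture as a fixed point. Below is a toy simulation, not the study's actual method (the dynamics and every number in it are invented), in which each recommendation mostly echoes the viewer's current position, the mild echo chamber, plus a small constant rightward nudge. Every starting ideology drifts toward the same spot.

    import random

    def recommend(ideology: float, rng: random.Random) -> float:
        # Invented dynamics: the next recommendation mostly mirrors where
        # the viewer already is (the echo chamber), plus a small constant
        # rightward nudge and some noise. Scores run -1 (left) to +1 (right).
        echo = 0.9 * ideology
        nudge = 0.05          # the hypothesized systematic pull
        noise = rng.gauss(0.0, 0.1)
        return max(-1.0, min(1.0, echo + nudge + noise))

    def mean_final_ideology(seed: float, steps: int = 50,
                            trials: int = 1000) -> float:
        rng = random.Random(0)
        total = 0.0
        for _ in range(trials):
            x = seed
            for _ in range(steps):
                x = recommend(x, rng)
            total += x
        return total / trials

    # Left, center, and right starting points all converge near +0.5,
    # the fixed point of x = 0.9x + 0.05: a drift right regardless of seed.
    for seed in (-0.8, 0.0, 0.8):
        print(f"seed {seed:+.1f} -> {mean_final_ideology(seed):+.3f}")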

Like I said, the algos generate profit from misery, so they are tuned to create misery.

This is not something that will be fixed through the profit motive.
