17 January 2023

This is My Shocked Face

Color me unsurprised that YouTube algorithms deliver conservative content.

This is not a surprise.  Misery, resentment, and rage boost engagement, and conservatism is built on misery, resentment, and rage:

YouTube's recommendation algorithm not only gently traps viewers in mild echo chambers, it is more likely to suggest conservative-leaning videos regardless of your political alignment.

That's according to a study out of New York University's Center for Social Media and Politics (CSMP) that was highlighted this month by Brookings.

Social networks live and die by their recommendation algorithms: they're designed to keep visitors on the sites and apps by feeding them content that keeps them hooked, and if that drives up engagement – sharing links with others, commenting, subscribing, upvoting, etc – all the better. Controversial or mind-blowing material is perfect for this, as it's likely to spark more shares and comments, and thus keep more people addicted.

These feeds are thus personalized to each netizen based on their interests; that's essential to drawing and locking them in. Echo chambers form where people post and share stuff they have a common interest in, and ignore anything contrary to those principles and attitudes. This leads to a positive feedback loop of engagement within the group, reinforcing beliefs and increasing time – and ad views – within the app or website.

The algorithms used by social media are extremely, and deliberately, harmful.  It's like Love Canal, with cat pix.
