22 October 2025

Headline of the Day

Treat Big Tech like Big Tobacco
Joel Wertheimer, noting the obvious: Big Tech is a problem because it actively markets a harmful product.

This ain't rocket science.

The tech bros are making their money by promoting body dysmorphia, bigotry, discrimination, ethnic cleansing, fraud, etc.

They want engagement (or in the case of crypto, marks), and they will actively harm people to get it.

The problem with Big Tobacco was not that it could charge excess prices because of its market power. The problem with Big Tobacco was that cigarettes were too cheap. Cigarettes caused both externalities to society and also internalities between the higher-level self that wanted to quit smoking and the primary self that could not quit an addictive substance. So, we taxed and regulated their use.

The fight regarding social media platforms has centered around antitrust and the sheer size of Big Tech companies. But these platforms are not so much a problem because they are big; they are big because they are a problem. Policy solutions need to actually address the main problem with the brain-cooking internet.

I love that bon mot, "brain-cooking internet."

………

For three decades, internet providers were merely passive hosts of third-party content, immune from liability faced by publishers. That grant of immunity was foundational, and the costs of such freedom were wildly outweighed by the benefits of the internet.

Both of these facts are no longer true. Social media companies are no longer passive hosts but active curators. And the costs of these products are now too high to ignore. They make us addicted to their apps with slot machine-style precision, and they are now helping creators fake reality with text-to-video generation.

The answer is not to destroy these companies or pull the government into the messy and probably unconstitutional world of directly regulating speech. The answer is to remove the special protections they have been granted and finally allow people harmed by these products to hold these companies liable.

………

Recommendation algorithms have allowed large platforms to turn our attention into a solved game. I say this with a lot of trepidation at a time when free speech is seriously threatened, particularly after seeing the harms that the Fight Online Sex Trafficking Act wreaked on sex workers. But the time has come to amend Section 230 of the Communications Decency Act (CDA 230). Specifically, lawmakers should remove protections for platforms that actively promote content using reinforcement learning-based recommendation algorithms.

The argument behind the safe harbor provisions was that websites that allowed users to share their opinions were like bookstores, and bookstores are not held liable for the content of their books.

On the other hand, newspapers ARE held liable for the content of the letters to the editor that they publish, because they make a conscious decision about which letters to publish.

Their algorithms, and soon their AI slop, are conscious editorial decisions, and those decisions are made to the detriment of their users.
