Also, in This Universe, Elon Musk Found His Highest Personal Calling as Dinner for a Family of Hogs Some 20 Years Ago
—Defector, describing a non-broken universe in which Elon Musk's child pr0n creation machine did not exist.
This made me laugh out loud.
Twitter, also called X, the social media network owned and constantly used by the world's richest man as well as virtually every powerful person in the American tech industry, and on which the vast preponderance of national political figures also maintain active accounts, has a sexual harassment and child sexual abuse material (CSAM) problem.

I disagree a bit. I do not think that this is a problem; I think that this was a deliberate choice.
This has been true more or less since Elon Musk took it over, but this problem's latest and most repellent efflorescence is the result of one of Musk's signature additions as owner. Grok, the network's embedded AI chatbot, will—or would, as recently as yesterday, and certainly did many, many times—generate graphically sexualized images of real people, including minors and non-consenting third parties, in response to any user's request. Another Twitter feature bearing Elon Musk's fingerprints is that the site filled with the kind of people who, when a photograph of a 14-year-old TV actress appears on their timeline, will ask Grok to generate an image of her without clothes on. As a result, for much of this week Twitter has been rife with AI-generated revenge porn, deepfake celebrity porn, and CSAM.
And then they go in for the kill:
………
Surely, however, to the extent that many laws prohibit the creation and proliferation of CSAM and revenge porn, someone is responsible for Twitter's embedded chatbot having spent much of the past week visibly and flagrantly breaking those laws. Yes? If a cineplex spends a week showing CSAM in one of its theaters, a posted apology purportedly in the voice of the screen seems unlikely to get that cineplex's owners and operators off the hook, legally speaking. If the owner of a roadside billboard lets strangers use the thing, and one of them tacks a huge sexualized image of a child—or an unconsenting celebrity, or an ex-girlfriend he's mad at for dumping him—on there, no one would accept a legal consequence which stopped at the assurance that the billboard's fucking support stanchions sincerely feel real bad about it.
So. Who's standing trial for this one? Who's going to jail?
I think it's important for a person to hold an imaginary sane and just society in their mind, against which to measure the actual society that exists instead. In a sane and just society ... well, Grok probably doesn't exist in the first place in a society like that. (Also, in this universe, Elon Musk found his highest personal calling as dinner for a family of hogs some 20 years ago.) But somewhere along the wide spectrum of social operability in between that sane and just place and the shambling, clattering junk heap where I wrote this blog and you're reading it, there is a theoretical place in which OK, yes, Grok exists, but maybe not absolutely everything has yet been motherfucked all the way out of function.
(emphasis mine)
I would note that the response of Ecch/Twitter/Grok has been to geofence the feature, not shut it down, which means that people in countries where fake revenge porn is legal, along with anyone with a sufficiently sophisticated VPN, can continue to generate this slash-fic bullshit.
This shows us something: This came from Elon.
If Musk were not actively supporting this, no one would have suggested it to him, for fear of being seen as setting him up for an Epstein-style take-down.
The creation of this feature, and its continued availability in some regions, had to come from the top.

