Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
So…a yacht named “Bayesian” just sank off the coast of Sicily. It was owned by British billionaire Mike Lynch, former CEO of Autonomy. Lynch just barely managed to stay out of trouble with US authorities over fraud charges and will likely owe HP Enterprise a hefty bag for misrepresentations before their multi-billion dollar acquisition. My heart goes out to the innocents and crew who are lost. (Edit: Lynch appears to still be missing)
Hell of a metaphor, isn’t it?
I’m fascinated by the fucking size of that yacht and did some more research and found more stories about obscenely expensive boats going to the bottom.
https://www.superyachttimes.com/news?filter=Casualty
Still feel bad about those missing though.
Ran across an impressively strong sneer whilst looking through Baldur Bjarnason’s link list for the week:
A JUST TRANSITION MEANS RESISTING A.I. (from Scottish Left Review)
Gonna copy-paste the quote Baldur used because god damn:
AI isn’t simply a problematic technology but an apparatus that is shaped by the injustices of our existing social relations and which, in turn, reshapes and intensifies them.
Also got an interesting Tweet from Ed Zitron:
This is entirely my gut instinct, but there is boiling resentment against big tech. Something is shifting, and it’s shifting violently, both in the general public and the media. Blood in the water. People are ready for a change.
I’ve had that same gut instinct before - I’ve kinda had it since Baldur noted tech’s disconnect from the public a month ago. Feels like we’re entering an era where working in tech is treated as a red flag, a sign you’re a money-hungry asshole willing to hurt innocent people to make a quick buck.
:( I just wanted to see how many electrons I could make dance on the head of a wafer, I didn’t mean to hurt anyone :(
HN shocked and appalled that a “founder” cannot make billions running a largely unmoderated social network without facing a few consequences:
https://news.ycombinator.com/item?id=41341353
edit @dgerard already posted this on the top level, I just got triggered by one of the many dupe submissions on HN
highlights include
- standard Signal-bashing
- comparison with other heroes like Kim Dotcom (lol) and Snowden
- outrage that the arrest order was issued while Durov was in the air, so he couldn’t evade justice (this is unconfirmed)
- outrage that being a citizen of a country allows that country to enforce its laws against someone
Durov is rich and accused in a country governed by the rule of law (in contrast to the other country which has issued him a passport, the UAE). He can afford the very best lawyers and will have to be content with staying in Paris while this works out, boo-hoo.
outrage that the arrest order was issued while Durov was in the air, so he couldn’t evade justice (this is unconfirmed)
Clearly arrest warrants should be made like hide-and-seek where the seeker has to count to 100 days before arresting the hider and publish “ready or not here I come!” to their social media account.
but if he’s just being held accountable for running a service where others did nefarious things, then this should be a chilling effect for all founders.
ok. chill them
Problematic developer with AI crush realizes she’s just dust under the wheels of progress
You know those polls that say fewer than 20% of Americans trust AI scientists? It shouldn’t be the case, because no group is doing more right now to elevate the universe to a higher state of complexity. You know who the public does trust? Open source developers.
Justine, LOL indeed.
This is the girl who went off the rails and started posting about neoreaction, right?
Problematic developer with AI crush
I thought this was gonna go in a completely different direction.
Setting: romantic sunset beach, without a crowd in sight. There is no sound but that of a gentle breeze, waves lapping at the shore, and seagulls in the distance. Ryan turns to his girlfriend Tiffany. He gets down on his knees, the sand muddying his pants. Tiffany clasps her hands over her mouth in disbelief. Ryan says: “Tiffany, you are the light of my life. You have made me a better man. I can’t see myself living with anyone else but you. Will… will you marry m-- bzzt. Thank you for using Virtua-Boyfriend; unfortunately we ran out of VC money, so we are shutting down.”
In classic Rationalist fashion, she noticed something she didn’t like, hypothesized a cause with no real evidence, and then proceeded to rant about the implications of that unproven hypothesis.
My suspicion, on the other hand, is that because Claude is just reproducing statistical patterns from its training data, its output simply reflects the fact that she is referred to as a Nazi coder far more often than as some kind of open-source luminary. Unless her GitHub metadata signs everything “Justine T the open-source developer”, that association isn’t reflected in the patterns extrapolated from the training data.
if only this had come a month and a bit sooner, she could have answered my question herself lol
OT: what’s the best way to explain rationalism et al. to completely normal people with no connection to tech?
Techish people (I would use the word techbro here, but even that needs an explanation) who want to make their fictional science fiction utopia real, but got so scared of their own science fiction ideas going wrong and killing everybody that they started a cult around rationality, sort of a Vulcans fan club. They have a pattern where they think they and their methods are smarter and better than actual experts.
They try to do their own research with an open mind, but they left their minds so open that all kinds of sexists and racists crawled in, who are welcomed as long as they are verbose enough.
And to close it off, Musk is a fan.
Their minds are open to all ideas, so long as the idea is a closed form solution that looks edgy.
Yes, there is a certain yearning for ‘secret forbidden’ knowledge and contrarianism, but I tried to keep it short and simple. Almost never do they find a solution in anything on the left side of politics.
decentralized cult which worships the concept of rational thinking as superior to evidence. has lots of little rituals which are supposed to invoke rational thinking. uses AI in the place of angels and demons. no core holy texts, but the closest things are a sequence of blog posts and a harry potter fanfic. very influential in silicon valley, very intermingled with various explicitly fascist groups
SV Scientology: they can’t land you a leading role in a summer blockbuster, but they sure as hell can put you in the running for AI-policy positions of influence or for the board of a company run by one of their more successful groomings. Their current most popular product is court philosophers for the worst kind of aspiring technofeudalist billionaire.
If this gets them interested you’ll eventually get your chance to do a deep dive to any details of cosmist lore you find relevant.
the problem of my life. I have literally found it easier to explain Scientology than Roko’s basilisk.
It’s a little bit like a tiny version of the Mormons if Joseph Smith had read the collected works of Isaac Asimov instead of the Bible and also his name was Yud.
Or to go with less of a sneer, the Rationalist/TESCREAL/Californian Ideology is a loose grouping of fringe beliefs rooted in old-school science/tech fetishism with a lot of science fiction overlays and libertarian/reactionary politics that effectively define “let ultrawealthy tech capitalists do whatever they want” as the only reasonable choice and make it a moral imperative.
Actually the Mormon thing might work given it’s a priest I’m talking with. Thank you!
ah, he’d understand people, then? Go heavy on the slipping into cultishness.
I tried to think of a smaller cult that’s still pretty well-known or influential but kept coming up with Heaven’s Gate or the Branch Davidians.
Longer than I’d intended, but the way I’d describe it is probably as follows:
1. A mystical Harry Potter based sex cult deeply embedded in the techbro scene. They want what many cults want: to commune with God, achieve immortality or enlightenment, and obtain power in the current world, but they dress it in the trappings of science and computer programming.
2. Due to demographic features, their desire to be clever, and a certain contrarian attitude, they will often seek to rationalise harmful social practices, which leads them to support anti-feminist and race realist positions with shocking frequency.
3. Because of their close connections to the tech scene, along with the personal relationship the cult founder had with Peter Thiel, and the fact that the cult has been indoctrinating kids since the aughts, they are shockingly influential in the AI scene.
4. Like most cults, they claim to want to teach people to think correctly (rationally), but they actually value the community of being in a cult (and the potential social networking and financial benefits) over thinking rationally.
5. In terms of style, they like long works with unclear arguments, being clever or witty over being right, and strongly signalling their rationality (sometimes even using good tools), but not allowing that to interfere with the core features of being a cultist.
(1-3) are what I’d consider core. (4-5) are what I’d add if the person seems interested. If they seem really interested, I’d also discuss other connections (e.g. to Effective Altruism, the Future of Humanity Institute, George Mason University, Future Perfect, neoreaction), their ideology in more specific terms (e.g. the Sequences, Roko’s Basilisk), and associated members (e.g. EY, SSC, Aella, SBF).
Personally I might try explaining some of the foundational stuff before going into the big R. Scientism and utilitarianism would be my starting points.
I first read this comment before having coffee and thought that “R. Scientism” was a joke about Asimov’s robot novels.