Thinking about how the arsing fuck to explain the rationalists to normal people - especially as they are now a loud public problem along multiple dimensions.
The problem is that it’s all deep in the weeds. Every part of it is “it can’t be that stupid, you must be explaining it wrong.”
With bitcoin, I have, over the years, simplified it to being a story of crooks and con men. The correct answer to “what is a blockchain and how does it work” is “it’s a way to move money around out of the sight of regulators” and maybe “so it’s for crooks and con men, and a small number of sincere libertarians” and don’t even talk about cryptography or technology.
I dunno what the one sentence explanation is of this shit.
“The purpose of LessWrong rationality is for Yudkowsky to live forever as an emulation running on the mind of the AI God” is completely true, is the purpose of the whole thing, and is also WTF.
Maybe that and “so he started what turned into a cult and a series of cults”? At this point I’m piling up the absurdities again.
The Behind The Bastards approach to all these guys has been “wow these guys are all so wacky haha and also they’re evil.”
How would you first approach explaining this shit past “it can’t be that stupid, you must be explaining it wrong”?
[also posted in sneer classic]
So… on strategies for explaining to normies, a personal story often grabs people more than dry facts, so you could focus on the narrative of Eliezer trying a big idea, failing or giving up, and moving on to a bigger one before repeating (stock bot to seed AI to AI programming language to AI safety to shut down all AI)? You’ll need the Wayback Machine, but it is a simple narrative with a clear pattern?
Or you could focus on the narrative arc of someone who previously bought into LessWrong? I don’t volunteer, but maybe someone else would be willing to take that kind of attention?
I took a stab at both approaches here: https://awful.systems/comment/6885617
I usually say the following. I’m paraphrasing a spiel I have delivered in person several times and which seems to get things across.
'there’s a kind of decentralized cult called rationalism. they worship rational thinking, have lots of little rituals that are supposed to invoke more rational thinking, and spend a lot of time discussing their versions of angels and demons, which they conceive of as all powerful ai beings.
rationalists aren’t really interested in experiments or evidence, because they want to figure everything out with pure reasoning. they consider themselves experts on anything they’ve thought really hard about. they come up with a lot of apocalypse predictions and theories about race mingling.
silicon valley is saturated with rationalists. most of the people with a lot of money are not rationalists. but VCs and such find rationalists very useful, because they’re malleable and will claim with sincerity to be experts on any topic. for example, when AI companies claim to be inventing really intelligent beings, the people they put forward as supporting these claims are rationalists.’
they’re racists
I just describe it as “computer scientology, nowhere near as successful as the original”.
The other thing is that he’s a Thiel project, different from but no more sane than Curtis Yarvin aka Moldbug. So if they’ve heard of Moldbug’s political theories (which increasingly many people have, because of, well, them being enacted), it’s easy to give a general picture of total fucking insanity funded by Thiel money. It doesn’t really matter what the particular insanity is, and it matters even less now that the AGI shit hit the mainstream entirely bypassing anything Yudkowsky had to say on the subject.
I think for the Yud variety specifically a good summary is: “Taking all the wrong lessons from science fiction, building a cult around its villains, and celebrating ‘Rational’ villains, plus a belief in the inevitability of world-changing technological progress.”