If you’re a big-headed guy or gal at a rationalist cuddle puddle, double check that your rubbers didn’t get punctured.

74 points

yet another programmer who imagines that having a computer science baccalaureate gives him an insight into human biology. sigh.

42 points

Much like a network, the brain is a series of tubes.

23 points

To have the confidence of a white CS undergrad…

50 points

Feels like I’ve heard this rhetoric before…

21 points

Aw, man… Guess it’s time to play some Disco Elysium again!

13 points

His time has come!

10 points

Thought Gained: Advanced Race Theory

4 points

Wait no they’re talking about Race Theory Criticality Events

46 points

Yay, let’s fight AI with eugenics!

33 points

shot:

The upper bound for how long to pause AI is only a century, because “farming” (artificially selecting) higher-IQ humans could probably create competent IQ 200 safety researchers.

It just takes C-sections to enable huge heads and medical science for other issues that come up.

chaser:

Indeed, the bad associations ppl have with eugenics are from scenarios much less casual than this one

going full “villain in a Venture Bros. episode who makes the Monarch feel good by comparison”:

Sure, I don’t think it’s crazy to claim women would be lining up to screw me in that scenario

17 points

Lol, the guy goes full weird high-school-level shitposter the moment he gets the slightest pushback.

25 points

Considering that the whole idea of the AGI singularity was the exponential function going straight up, I don’t think this person understands the problem. Lol, LMAO foomed the scorpion.

(Also that is some gross weird eugenics shit).

E: also, isn’t IQ a number that gets regraded every now and then, with a common upper bound of 160? I know the whole post is intended more as vague eugenics aspiration, but still.

Anyway, time to start the lucrative field of HighIQHuman safety research. What do we do if the eugenic superhumans’ goals don’t align with humanity’s?

24 points

Smh, why do I feel like I understand the theology of their dumb cult better than its own adherents? If you believe that one day AI will foom into a 10 trillion IQ super being, then it makes no difference at all whether your ai safety researcher has 200 IQ or spends their days eating rocks like the average LW user.

7 points

Oh absolutely! This is the entire delusion collapsing on itself.

Bro, if intelligence is, as the cult claims, fully contained self-improvement, then you could never have mattered, by definition. If the system is closed, and you see the point of convergence up ahead… what does it even fucking matter?

This is why Pascal’s wager defeats all forms of maximal utilitarianism. Again, if the system is closed around a set of known alternatives, then yes, nothing matters anymore. You don’t even need intelligence to do this; you can do it with sticks and stones by imagining away all the other things.


SneerClub

!sneerclub@awful.systems


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it’s amusing debate.

[Especially don’t debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

See our twin at Reddit
