l0l.jpeg (24.22 KB, 600x314)
https://medium.com/@davidgolumbia/the-great-white-robot-god-bea8e23943da
https://archive.fo/P7r7E

> Another of the major figures in promoting the superintelligent AGI god is Eliezer Yudkowsky, whom those of us who read about the alt-right may well be familiar with. Yudkowsky probably isn’t a member of the white supremacist alt-right, but he constantly butts right up against it. …
> Yudkowsky is another one of those obsessed with “rationality” (by which he means exclusively what he considers “logic”) as the only constitutive aspect of mind. This view is at the root of not just AGI, and not just race science, but the entire edifice known as the “alt-right.” While Yudkowsky is able to skirt direct membership in that group, it is no accident that Elizabeth Sandifer’s magnificent 2018 volume whose subtitle is On and Around the Alt Right has the main title Neoreaction a Basilisk, combining Yudkowsky’s fictional thought problem with one of the most poisonous parts of the alt-right, Neoreaction. As Sandifer notes, Mencius Moldbug, aka the computer programmer Curtis Yarvin, partly got his start as a commentator on Overcoming Bias, the blog where Yudkowsky first whetted his uber-rationalist appetites before going on to found the better-known rationalist site LessWrong. As Sandifer writes, Yudkowsky “is not on the alt-right but has a variety of interesting links to the topic” (3); “in some ways the most basic similarity between him and Moldbug” is that
> > they are both animated by an entirely sympathetic anger that people with power are making obvious and elementary errors. But what’s really important is how this sheds light on what exactly Yudkowsky is fleeing from, and in turn on why the Basilisk is the monster lurking at the heart of his intellectual labyrinth. Yudkowsky isn’t just running from error; he’s running from the idea of authority. The real horror of the Basilisk is that the AI at the end of the universe is just another third grade teacher who doesn’t care if you understand the material, just if you apply the rote method being taught. (Sandifer 2018, 60)
> This push-me-pull-you attitude toward authority is itself a hallmark of right-wing authoritarianism; a great deal of the writing surrounding Roko’s Basilisk and the Great Robot God is redolent of Wilhelm Reich’s account of “the individual who is adjusted to the authoritarian order and who will submit to it in spite of all misery and degradation,” which he puts at the root of fascism in his classic 1933 Mass Psychology of Fascism. So too is the mysticism of the Roko’s Basilisk story.

> In AGI, we see a particular overvaluation of “general intelligence” as not merely the mark of human being, but of human value: everything that is worth anything in being human is captured by “rationality” or “logic,” and soon enough, a quasi-religious revelation will occur that will make that undeniably — transcendentally — true. In other words, God will appear and tell us that white people have been right all along: the thing that they claim to have more of than anyone else will turn out to be the thing that matters more than anything else, the thing according to which we should ultimately be evaluated, the thing that will save our souls.