/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?


New Reply on thread #6273

(file: 53hfw74cferdf5gh655yt.jpg, 44.08 KB, 600x751)
What's your EA horror story, Ratanon? 

I've read and heard countless reports of toxic power games, nepotism, lack of transparency, skyrocketing burnout rates, fake empathy, sexual misconduct allegations, and exploitative practices. What's going on? 

Is it because every big movement attracts Normies (occasional contributors), Nerds (seekers of meaning), and Sociopaths (malevolent influencers/leaders), but EA particularly attracts and rewards game-playing Sociopaths, as its cold utilitarian framework gets (poorly) extrapolated to organizational and interpersonal dynamics?

How does EA culture compare to academia, freelance work, or the IT industry in these regards?
 >>/6273/
> Is it because every big movement
EA is a subculture at best. Actually, thinking about it, that probably explains a fair bit. Subcultures can be seen as lying on a spectrum, in terms of both whom they attract and how well they handle problems.

(file: mitoticcephalopod-tumblr-com-post-184520531241.png, 32.66 KB, 514x550)
(epistemic status: don't trust any of this)

I read somewhere the other day (in an old SSC post?) that communes work better when they require some significant personal sacrifice, but only if they're religious. It doesn't work for secular communes for some reason.

Maybe EA accidentally tapped into whatever makes religious communes tick, by really heavily leaning into the personal sacrifice part and believably presenting it as a moral obligation that's not discharged even by leaving the community.

And then all the misconduct is just the regular old thing you get in religious communities.



> toxic power games
> nepotism
> lack of transparency
Sounds like par for the course for most organizations.

> skyrocketing burnout rates
> exploitative practices
Same as above, but significantly worsened by the whole "it's my duty to save the world", "me having a Frappuccino a day means that every year a jolly PoC in Zambia will starve to death because I didn't donate that money", "Can't stop coding… basilisk will eat me…" logic that underlies the movement.

> sexual misconduct allegations
I wrote something edgy but then remembered getting flippant about CW issues is déclassé so I deleted it.



EA is worse than average because the legible philosophy preselects for people who are bad at game theory, which second-order selects for people who are good at game theory. Altruism is another word for playing CooperateBot in the prisoner's dilemma, and when you have a bunch of CooperateBots in one place, not just sociopathic DefectBots but also otherwise-cool PrudentBots will come in to consume the surplus.

"You'd ruin your shoes to cooperate locally in a context where the community will ensure reciprocity, therefore you should send money to far off places which will be spent by the fargroup, who is overwhelmingly unlikely to reciprocate." Uh, obviously not? This is a bad argument in general before we look at any of the specifics about the people we're subsidizing. And making it publicly is a beacon which attracts the sharks.

"But if everyone cooperated every turn, everyone would be better off over the long haul!" That is not causally entangled with your trading partner's decision function. And saying that out loud is going to attract the sort of people who want to value pump you. And lo and behold, EA is filled with sociopaths.

It is deeply ironic that one subculture over, MIRI spends part of their time formalizing when and under what conditions an agent should cooperate, in both the IPD and the one-shot. Protip: it isn't CooperateBot, and it requires conditioning your response on your trading partner.
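The CooperateBot/DefectBot dynamic above can be made concrete with a toy iterated-prisoner's-dilemma simulation. The payoffs are the standard Axelrod numbers (T=5, R=3, P=1, S=0); the bot names and the loop are just illustration, not anyone's actual formalism:

```python
# Toy IPD: a CooperateBot gets value-pumped by a DefectBot, while a
# reciprocating strategy (tit-for-tat) caps its losses.

PAYOFF = {  # (my move, their move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def cooperate_bot(partner_history):
    return "C"  # cooperates unconditionally

def defect_bot(partner_history):
    return "D"  # defects unconditionally

def tit_for_tat(partner_history):
    # conditions its response on the partner: copy their last move
    return partner_history[-1] if partner_history else "C"

def play(a, b, rounds=100):
    """Total payoffs for strategies a and b over `rounds` turns."""
    hist_a, hist_b = [], []  # each strategy sees its partner's past moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = a(hist_b), b(hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(cooperate_bot, defect_bot))  # (0, 500): free surplus for the defector
print(play(tit_for_tat, defect_bot))    # (99, 104): one sucker round, then punishment
```

The point of the sketch: unconditional cooperation hands over the full surplus every round, while even the simplest conditional strategy makes defection barely profitable.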


 >>/6286/
It's true that, given a perfectly selfish utility function that assigns no value to others and only cooperates for game-theoretic reasons, Effective Altruism is bad game theory. But have you considered that "all EAs are perfectly selfish and only care about others insofar as they expect to benefit through reciprocity" might be a faulty premise?


 >>/6289/
Let's take a step back and look at OP's question: they're asking if EA has a higher percentage of sociopaths than the base rate. I am answering that I believe it does and am giving an explanation of the mechanics of why.

I posit that the large presence of CooperateBots attracts the sociopaths. You have a group of people who are easy to exploit in one place, loudly signaling their self-sacrificing ideology. I'm sure their self-conception of their ideology doesn't involve any game theory, nor do they have a selfish bone in their bodies! This doesn't make them not prey, since they're playing the IPD with everyone around them, whether they acknowledge it or not. Their behavior attracts the sociopaths, who gleefully value pump the free undefended resources. The point of reciprocity is that it is a defense against bad actors; it is a defense against being value pumped.

If there's any comfort, EA is starting to see enough sociopaths that the hawks are now starting to fight amongst themselves due to a declining population of doves. OP noticed.
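The hawks-fighting-hawks endgame can be sketched with standard hawk-dove payoffs. The numbers here (prize V=4, fight cost C=6) are arbitrary illustrations, chosen only so that fights cost more than the prize is worth:

```python
# Toy hawk-dove calculation: as the dove share of the population falls,
# the expected payoff to playing hawk drops and eventually goes negative,
# i.e. the hawks start bleeding on each other.

V, C = 4, 6  # assumed: fight cost C > prize V

def hawk_payoff(dove_share):
    # vs a dove: take the prize V outright
    # vs another hawk: expected (V - C) / 2 from the costly fight
    return dove_share * V + (1 - dove_share) * (V - C) / 2

for p in (0.9, 0.5, 0.1):
    print(f"dove share {p:.0%} -> expected hawk payoff {hawk_payoff(p):+.2f}")
```

With these numbers, the hawk payoff turns negative once doves drop below 20% of the population, which is one way to read "the hawks are now starting to fight amongst themselves".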

 >>/6290/
This is true and I believe this, but I should have left it out of my post since it distracted from my main thesis.

 >>/6286/

Doesn't this also explain the extreme sociopathic tendencies in the leadership of communist parties? Albeit they're worse, because commies explicitly justify murders within their own group in the name of vague ideological purity and a far-off utilitarian heaven. I've been reading up lately on the early commies, the Bolsheviks, the Revolution, the Red Terror, and the subsequent Great Purge, and the dynamic seems comparable to what you posit. And this guy https://en.wikipedia.org/wiki/Sergey_Nechayev#Catechism_of_a_Revolutionary is basically an exemplary altruistic consequentialist.

Peter Singer promotes it and his manner of thinking disgusts me so I will never donate money to the nogs now.

EA is a big signalling activity: "I'm both smart AND charitable! I actually care."

Just do it in private and don't tell anyone. It's like girls on Tinder and Instagram teaching English to jungle gooks just for the photo op. See the FB page "Humanitarians of Instagram".

Doesn't do much for the image of Rats trying to create a religion either.


Everyone wants to tell me how bad the EA movement is.  Leftists think it is hopelessly right-wing.  Rightists think it is hopelessly left-wing.  Some say EAs are too altruistic.  Others say EAs are too selfish.  Some say EA is too elitist.  Others say EA has dumbed itself down too much.  Personally, almost all my experiences have been good, and I'm happy the movement has proceeded the way it has.

I'll bet the reason you hear so many EA horror stories is not because EA is especially full of horror stories relative to average.  Rather, I'll bet that you are "EA-adjacent" in a way that causes you to hear all the juiciest gossip without hearing about everything good in EA.

BTW, for most of the things you mentioned, I can think of an incident or two that fits the description, and in my opinion none of them is a huge deal, except maybe the burnout rates. In the sense that yeah, the juicy-gossip version of X might characterize it the way you're characterizing it, but there are other frames that make it sound more reasonable. Tumblr makes drama out of everything, doesn't it?

Anyway, maybe I'm just out of touch, but not a single EA sociopath comes to mind off the top of my head, and I've been to multiple EA Globals.


 >>/6353/
Those don't necessarily contradict each other - you've got a movement with a lot of libertarians, aimed at helping poor people. People make significant personal sacrifices that are ultimately just as driven by signaling as everything else, but perhaps more visibly so because of poor social skills. And maybe people dumb it down in an attempt to be more accessible without properly tackling elitism.
But I'm not involved in any EA communities so I wouldn't really know.


 >>/6348/
It doesn't deserve to be popular.
Rats and adjacents, by way of their rationalist outlooks, absolutely suck at aesthetic, holistic, and passion-driven things.

Tainted good isn't the same as untainted good. Surely you see the stratification.

 >>/6365/
Is this actually the case? Plenty of people who would be rat-adjacent today had decent-to-great aesthetics, like some golden age science fiction authors. At the same time, even the most anti-rationality blue tribe members suffer from bad aesthetics similar to those of Bay Area rats. I'm thus inclined to blame blue tribe memes for the aesthetics problem rather than assume it's inherent. You can't make an awe-inspiring religion substitute out of that stuff, which is why Humanism and EA are so stale.


22 replies | 2 files