/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?


aed09849dc12981cd348297dd974a107.png (23.36 KB, 240x240)
Polyamory = good, because it's just your obsolete primate programming that makes you upset about Tyrone fucking your gf
EA = good, because your enlightened primate programming makes you care about maximising the welfare of starving n*****s in Africa.
Am I missing something here, or is there a contradiction?


 >>/5505/
Certain primate heuristics are the direct cause of most things we do, but they're not usually considered goals, and they tend to conflict with other primate heuristics in some way.
The terminal goal of EA is roughly to increase global utility as much as possible, i.e. utilitarianism.
A more complicated but not much more useful explanation is that EA communities explicitly value utilitarianism, and doing things that are more utilitarian is a way to gain social status in those communities.
So if you want to get all technical and reductionist, it's caused by the drive for social status, which the group is collectively harnessing for the purpose of increasing global utility. But that way of framing things has limited use. In most contexts it's useful to skip all the parts with the words "status" and "signaling".

But that's getting away from the topic of the thread.
The reason polyamory is considered good by a lot of rationalists isn't that it suppresses obsolete primate programming, it's that it (presumably) makes the people involved happier. Suppressing primate heuristics is instrumental, it's not the goal.
The reason EA is considered good by a lot of rationalists is cynically that it lets you appear rational and ethical, or less cynically that it increases global happiness.


 >>/5507/
Some utilitarians would be on board with that. Some utilitarians think that doesn't maximize utility (e.g. preference utilitarians) or even happiness (e.g. certain objections about unique brain states).
There exist EA people who would be on board with that. I don't know how many. I would think people who think tiling the universe with hedonium is good wouldn't object to wireheading either.
There also exist EA people who aren't utilitarians, even though the overall sensibilities of EA are pretty utilitarian. I don't know how many.
Most EA work has much more mundane goals. Preventing malaria or improving the nutrition of poor rural laborers through low-cost interventions is good whether you subscribe to preference utilitarianism, negative utilitarianism, or hedonic utilitarianism.


 >>/5504/
> Obeying them or getting rid of them is always instrumental, never a terminal goal.
Why not? Did the Terminal Goal Fairy come down and bless your dear monkey brain with some sort of exogenous goals?

 >>/5506/
> The reason polyamory is considered good by a lot of rationalists isn't that it suppresses obsolete primate programming, it's that it (presumably) makes the people involved happier.
So it's not good because it suppresses the obsolete primate desire to pair bond, sure, but it is good because it achieves the primate desire to experience happiness. Really makes me think. And what I'm thinking is that you haven't thought this through all the way. You never answered the other anon's question, "What are some terminal goals that are not primate heuristics?"

 >>/5509/
Tyrone will copulate with whomever he damn well pleases.

 >>/5510/
> Why not? Did the Terminal Goal Fairy come down and bless your dear monkey brain with some sort of exogenous goals?
In the explicit justifications rationalists give, like the ones the OP seems to be criticizing, it's an instrumental goal. In the polyamory case it's not "this is primate programming so you should get rid of it" but "this is primate programming, so it's okay to get rid of it to achieve X".
> You never answered the other anon's question, "What are some terminal goals that are not primate heuristics?"
That's what the first paragraph was about. Explicitly stated terminal goals are not always primate heuristics. Direct causes are primate heuristics, but talking in terms of those is usually not useful.


