/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?


Thread #6299

limbic_system.jpg (117 KB, 300x300)
Beyond easily verifiable things, like how much toothpaste I have left or how fast objects fall, why should I even bother believing anything? The outside view is that there are people smarter than me that believe every ideology under the Sun, many mutually exclusive, and can effectively and in good faith support their beliefs. I can't possibly agree with them all, and even if I were somehow able to evaluate every argument completely impartially, I doubt I would be able to determine which are true.

I have a worldview, and my beliefs feel real and true from the inside. But this was equally true when I believed in Catholicism and when I believed in the necessity of communism. At any point in my life since age 17, I would have considered the version of myself from two years earlier cringe and wrongpilled. If I look at the path my beliefs have followed, it looks more like a random walk through some high-dimensional political compass than me converging on the truth. How am I supposed to take myself seriously?

The content of most "worldview" type beliefs seems largely irrelevant to my Gnon-given purpose of replication. They're there both to keep my conscious mind invested in an ultimately meaningless existence, and to help me credibly signal allegiance to the proper groups. It feels like I choose what I believe, but it's exactly as illusory as free will, and for the same reason. I know that my subconscious is pulling my strings, rewarding paths of thought that it deems useful and punishing those that risk my social standing. I can feel the flow of dopamine when I fit new evidence to my worldview. I can feel it cut off when I spend too long evaluating the outgroup's beliefs for truth content.

What am I to do? My monkey brain is probably better at winning social games than my analytical mind, so I think I should just stop resisting its guidance with this autistic desire to believe what's actually true. I guess this is giving up on the idea of epistemic rationality, but, like, at least I'm doing it on purpose.

You don't need to throw out all of epistemics. You need to be wary of ideology. Wrong ideological opinions are caused less by an impaired ability to find the truth than by bad incentives.

Even if communism is fundamentally correct, being communister and less willing to accept conventional economics than the next guy marks you as a more reliable ally to the cause.

Even if climate change is real and catastrophically dangerous, being more alarmist than the next guy will get you more attention.

Even if Python is the best general-purpose programming language today, playing down its performance problems suggests you're better at writing fast Python code.

Even if the rationalist approach to epistemics is correct and useful, overusing terms like "Chesterton's fence" and "planning fallacy" makes you sound rationalister.

So what do you do? Well, you may have to ignore some of the most heavily politicized areas of knowledge. But keep in mind that they seem more important than they really are, because they're politicized. They're overrepresented. They make up a smaller share of interesting uncertain knowledge than you'd intuitively think.

Giving in to social incentives for these areas is ok. You're going to do it whether you resist or not, and doing it deliberately maybe even makes it easier to occasionally notice that you're biased.

If you want to get accurate knowledge, try topics that aren't very politicized, and try to find people who are primarily rewarded for accurate predictions.

Prediction markets are generally alright to trust. A lot of scientific research is fine, or at least closer to the truth than your baseline; a lot of it isn't, but if useful, verifiable, true predictions are the norm in a subfield, that subfield is probably one of the good ones.
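
If you want something concrete behind "rewarded for accurate predictions", the standard scoring rule is the Brier score: squared error between the probability someone stated and the 0/1 outcome, lower is better. A minimal sketch in Python, with made-up forecasters and made-up numbers:

def brier_score(forecasts, outcomes):
    # Mean squared error between stated probabilities and 0/1 outcomes; lower is better.
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Two hypothetical forecasters on the same five events (1 = it happened, 0 = it didn't).
outcomes = [1, 0, 1, 1, 0]
confident_pundit   = [0.95, 0.90, 0.99, 0.10, 0.80]  # loud and certain, often wrong
hedging_forecaster = [0.70, 0.30, 0.80, 0.60, 0.40]  # boring probabilities, well calibrated

print(brier_score(confident_pundit, outcomes))    # ~0.45
print(brier_score(hedging_forecaster, outcomes))  # ~0.11

Nothing in the score cares how confident or ideologically convenient a claim sounded, only the stated probability and what actually happened.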

I think there are a lot of areas that are harder and more interesting than how much toothpaste you have left, but aren't ruined beyond tractability by bad incentives.

It's not easy, but it's not homogeneously hard. Only give up on the really hard stuff.

 >>/6300/
> Even if Python is the best general-purpose programming language today, playing down its performance problems suggests you're better at writing fast Python code.
I don't think this is how signalling works among programmers. Downplaying Python's performance problems tells me that you probably haven't used other languages enough and are defending Python either out of ignorance or to save face. A good signal is more like "Python is slow as shit, but we make it work". There, now I'm programmerer than you.
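
And the honest version of that signal usually comes with receipts. A minimal sketch of the "slow as shit, but we make it work" move, using a made-up workload (timings depend on your machine): admit the pure-Python loop is the slow part and push it into numpy instead of pretending it's fine.

import timeit
import numpy as np

data = list(range(1_000_000))
arr = np.array(data, dtype=np.int64)

def slow_sum_of_squares(xs):
    # Pure-Python loop: every multiply and add goes through the interpreter.
    total = 0
    for x in xs:
        total += x * x
    return total

def fast_sum_of_squares(a):
    # Same computation, but the loop runs inside numpy's compiled code.
    return int(np.dot(a, a))

print(timeit.timeit(lambda: slow_sum_of_squares(data), number=10))
print(timeit.timeit(lambda: fast_sum_of_squares(arr), number=10))

On a typical machine the loop version comes out around one to two orders of magnitude slower, which is exactly the thing you cop to before showing the workaround.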


