/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?


Thread #6406


4ad.jpg (58.09 KB, 540x960)
CMV: Suicide is a rational choice for many people

Life has no meaning.
Because of that, the only thing that really matters is the quality of the journey: do you enjoy the Dance/Game of life?
If life is shitty for you most of the time (high amounts of suffering), even if nothing terrible ever happens, you're probably better off dead. Why would you want to experience a shitty time?

Possible counterarguments – answers
Not everybody feels more suffering than joy in their life – True, but many do; I'd assume more than 10%. Many of these people choose to live only out of fear of death, or to serve ideologies, religions, or other spooks. They are being used by the system just like animals bred for their meat. That doesn't mean their lives are worth experiencing.

Even if it's rational to commit suicide, it's not easy, or sometimes even possible; survival instinct is a bitch – I agree. My argument only says it's the "rational" thing to do, the one that makes sense, not that it's easy or always possible.

Some people live for other people – Agreed again: if you have dependents (like small children), maybe it's better not to kill yourself and leave them alone. Your friends/parents/wife are also a reasonable reason, but a weaker one IMO – if they really love you, they should understand that death might be better for you.

Utilitarianism is retarded, ergo, all arguments stemming from utilitarian calculus are invalid.

Having said that, I concede that the value of life may have some relevance to the issue of suicide. So, here goes:

1. A person at a given moment has rights equal to those of other instances of that person in time, i.e. himself at other moments.
2. Sometimes even very unhappy people strongly prefer to exist, not in the sense that they fear death but rather that they consider life worthwhile.
3. Killing yourself when unhappy automatically kills those happy instances by precluding their existence.
4. It is immoral to kill people who have a preference to exist. It is, however, not immoral to deny people death when they don't have such a preference.
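
Schematically, in my own loose notation (P_t = the person-instance at time t; nothing standard, just making the structure explicit):

1. ∀t,t′: rights(P_t) = rights(P_t′)
2. ∃t > now: prefers_to_exist(P_t), even for some very unhappy people
3. suicide(now) → ∀t > now: ¬exists(P_t)
4. prefers_to_exist(x) ∧ kill(x) → immoral; ¬prefers_to_die(x) ∧ deny_death(x) → ¬immoral
∴ If any future P_t prefers to exist, suicide now kills someone who prefers to exist, hence is immoral; if no P_t does, denying death wrongs no one.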

>  Why would you want to experience a shitty time?
This isn't a kino, degenerate.



 >>/6413/
This. In the event that I want to kill myself, I intend to rig myself up with explosives that trigger based on some custom signal. Then I text some cute girl "be my gf" or similar, and I write a piece of software that reads her response before I do, performs sentiment analysis, and triggers the explosives to instantly terminate my conscious experience if it isn't a positive response. If I survive, I've figured out how to basically become the god of my own little universe, and I get to have sex with infinity cute girls and maybe no longer feel suicidal as a result. If I don't, well, I wanted to be dead anyway in this scenario.

 >>/6412/
It's just as immoral to force instances of yourself to suffer as it is to prevent them from existing and having joy, and if you believe Benatar's asymmetry it's even worse.

Besides, it's not a utilitarian view – morality is social-subjective BS anyway. It's from a completely selfish, hedonistic view.

 >>/6422/
Before you go through with this, consider that your going through with it would make non-negligible the probability that you're a character, maybe even the protagonist, in one of Scott's short stories. Which means your attempt would go wrong in some horrible but clever way. Trying to enumerate the horrible but clever ways it could go wrong before you go through with it would, of course, substantially raise the probability of your being such a character.

 >>/6427/
>  It's just as immoral to force instances of yourself to suffer as it is to prevent them from existing and having joy

I disagree; what are you gonna do? Hedonism boils down to utilitarianism; it's just a feeble-minded autistic American model where utility is conveniently put on a scale from negative to positive infinity. But qualitative differences do not form a linear scale.





 >>/6428/
It's impossible to be a character in a story. Characters in stories have no agency or internal experience; they're just words on a page.

I know that's fairly obvious, but we're a bunch inclined to miss the forest for the trees.

What you'd more realistically be concerned about is that you're in some kind of simulation. Which, yes, could be a simulation that's a post-singularity medium for narrative fiction, in which you're a protagonist in one of our caliph's short stories. But if it's actually Scott, if he's somehow ended up writing in a medium in which the main characters are sentient, he's not going to write those characters a Bad End.

But the actual realistic outcome is much more quotidian: she says yes, but either she didn't realise who the text was from and thought it was from Chad, or she's just lying to "prank" you. You have to rule out things going wrong in dumb ways before you're entitled to have them go wrong in clever ways.

 >>/6515/
You can acausally determine the behavior of characters in stories based on your own behavior. If you have behavior X, and that is because rationalist-y people tend to have behavior X, and Scott is writing a story about a rationalist-y person, then your decision to have behavior X makes the person in the story more likely to have behavior X.
The expected number of actual people trying the quantum billionaire trick is much smaller than the expected number of fictional people written by Scott trying it, so if you value the results of your decision system in fictional situations even a little bit (for example because fiction affects opinions) it may be rational to behave as if you're in a Scott story.
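
Hand-wavy toy version of that comparison, with every number invented purely for illustration:

# Toy expected-attempt comparison; all parameters are made-up guesses.
p_real_attempt = 1e-6      # chance a given rationalist-adjacent reader ever tries the trick
n_real_readers = 10_000    # rough size of that reader pool
p_scott_story = 0.05       # chance Scott ever writes a story featuring the trick
n_protagonists = 1         # tricksters per such story

e_real = p_real_attempt * n_real_readers        # 0.01 expected real attempts
e_fictional = p_scott_story * n_protagonists    # 0.05 expected fictional attempts
print(e_real < e_fictional)                     # True under these made-up numbers

Whether those expectations are big enough for the "even a little bit" clause to bite is exactly what the next reply pushes back on.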

 >>/6516/
> The expected number of actual people trying the quantum billionaire trick is much smaller than the expected number of fictional people written by Scott trying it
Scott's only one man, he likely won't still be producing the same kind of rationalist-y fiction you can hypothesise about from knowing his current writing decades from now, and he hasn't even written about the quantum billionaire trick, so the expectation that gives me is <1. What kind of assumptions are you making to get expectations that make a Pascal's-Mugging-adjacent "even a little bit" argument viable?


 >>/6545/
I think it's very unlikely that even one person is going to try the quantum billionaire trick, at least until ems are invented.
If you find yourself seriously considering it, then you're arguably pretty likely to be a fictional character. If you care in the abstract about how fictional rationalist-y people are portrayed, then you should take that into consideration, because there's a decent chance you'll influence that.
This consideration still isn't as important as the possibility of dying and/or becoming a billionaire, but it might be enough to change some specifics about the way you execute it, or be a tiebreaker.
It's not likely*important enough to be an imperative, but it's likely enough to take into consideration.
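
Rough sketch of that weighing, all magnitudes invented, just to show why it comes out as a tiebreaker rather than an imperative:

# Toy expected-utility split: main stakes vs. the "fictional portrayal" side term.
p_die, u_die = 0.5, -1_000.0
p_billionaire, u_billionaire = 1e-9, 1_000.0
p_fictional, u_portrayal = 0.1, -5.0   # you care a little how rationalist-y characters look

main_term = p_die * u_die + p_billionaire * u_billionaire
side_term = p_fictional * u_portrayal
print(abs(side_term / main_term))   # ~0.001: dwarfed by the main stakes, but nonzero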

 >>/6548/
I was completely serious about  >>/6422/, other than the implementation specifics, and as far as I can tell I'm not a fictional character. I'm not suicidal, but in general I have a hard time believing that the set of people who (believe they) have nothing to lose, from now until the invention of ems, contains zero people who consider quantum suicide trickery worth messing with.

Also your whole argument still relies on fictional characters having a conscious experience. My intuition is that it would be wasteful to simulate a universe at this level of complexity simply for storytelling, so it would be relatively easy to identify your world as created for narrative purposes from the inside. Our universe would need to meet that criterion before I'll bother worrying about whether I'm in a Scott masterwork or some plebeian mass entertainment.

 >>/6549/
> Also your whole argument still relies on fictional characters having a conscious experience.
It doesn't. If you're the kind of person who would consider the possibility of being in a story, then a fictional character based on people like you will be more likely to consider the possibility of being in a story.
That's the beauty of acausal decision theory. You can acausally influence the behavior of your simulation, often even if the simulation is insufficiently detailed to be conscious.
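
Toy version of that mechanism, with invented numbers; the "character" here is just a coarse statistical sketch of people like you:

# Acausal "influence": you never touch the story causally; you only change the
# statistics of the reference class the author samples the character from.
P_X_IF_YOU_DO = 0.6     # author's model of rationalist-y people, if people like you do X
P_X_IF_YOU_DONT = 0.2   # same model, if people like you don't

def p_character_does_x(you_do_x: bool) -> float:
    # No consciousness needed in the character: this is a property of the
    # author's (possibly very lossy) model, not of the character's experience.
    return P_X_IF_YOU_DO if you_do_x else P_X_IF_YOU_DONT

print(p_character_does_x(True) - p_character_does_x(False))   # 0.4 acausal swing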


19 replies | 1 file