Sam and Finley
Finley loves Sam and Sam loves Finley. Sam hits their head, losing all of their autobiographical memory; their skills and personality, however, remain as they were. The usual question at this juncture is: were you Finley, would you still love Sam? The idea is that if you answer yes, then at least on an emotional level you believe that Sam is still Sam. Thus, if your feelings are right, continuity of autobiographical memory is not required for continuity of personhood.
I want to ask a different question. It is harder to answer, and perhaps less philosophically illuminating, but still interesting. Suppose that you are Sam instead of Finley. You wake up and the concerned nurse explains many things to you, among them that you have a devoted partner who comes and visits every day. She gives some details of your life together.
My two questions are:
1. Do you think that, in this situation, you would immediately, or almost immediately, feel love for Finley, and not just the love you might feel for any kind stranger, but the love of a partner for a partner? If you wouldn’t immediately feel love, how quickly do you think it might develop? How likely would it be to develop?
2. Regardless of your answer to the above, do you think you would be obliged to “try to love” Finley? Does the concept of trying to love someone even make sense?
The debate at the end of time
Everyone who has ever died has been raised from the dead in a new and immortal body. Maybe your resurrectors used some of the technological options I discussed in this article, or maybe they used supernatural power; it doesn’t matter.
Your resurrectors explain that there is an important quandary: what should be done with the great wrongdoers of history? They have been raised alongside the rest of you. Does Idi Amin deserve an eternity in paradise? Should Temujin break bread at the seats of the blessed? The resurrectors have decided to leave these questions to a democratic decision of every human who has ever lived. A great debate begins: some argue the wrongdoers should be absolved along with everyone else, some that they should be imprisoned for a time, some that they should be killed (1), and some argue for even worse. Who, if anyone, among the resurrected should be punished? How severely, if at all? How far does this go down the chain of wrongdoing? Should ordinary murderers be punished? Fraudsters? Those guilty of assault? Regardless of what the resurrected as a whole would accept, would you want cultural context to be accepted as a defense? As a mitigating factor? Would the sufferings the wrongdoers endured in their own lives count as time served? It falls upon you to take a stance on all these questions, or failing that, to justify your abstention.
Double trouble
Imagine a world where each body contains two persons, each with a very separate personality. Which of these personalities is in the driving seat changes frequently. It’s very common for one personality to be cruel while the other is kind, or honest where the other is deceitful.
How would you deal with punishment and criminal justice in this world, given that punishing a guilty person inevitably also punishes an innocent one? What changes would ethics require of us? Assume you have a similar level of resourcing to a very well-resourced penal system today. Would you try to make prison abolition work? Would you reluctantly accept prisons, but try to greatly minimize their use?
(1) Assume there is some way to do this despite their immortal bodies.
5 thoughts on “New thought experiments for the backyard metaphysician to try at home”
Re: “Double trouble”
I have had my share of encounters with a batshit crazy cult targeting Friendly AI researchers whose dogma includes exactly that.
We live in a crazy world.
Obiter dictum: I’m a pro-friendly-AI-research guy, but is there an argument that the fact a cult has formed around it is an unsavoury sign? I find myself thinking about AI research in a religiously charged way, and it’s a mental trap I’m not sure how to beat.
I’m not sure it is even a mental trap!
One of my friends likes to argue that we need to treat Extropy/”rationality” as a religion to be able to tackle x-risk:
One more person in Ziz’s orbit, Jay, recently died, likely by suicide (though I don’t have that confirmed).
I believe they were driven to it by the same sort of escalating inner conflict between the “Good and No good ‘Hemispheres'” as Maia Pasek. I think at a minimum it would be worth adding them as one more life lost on zizians.info…
I would be interested in talking with you. I’m at the attached email if you are OK with contact.