
Artificial Intelligence and Life Beyond Death

The "I Am" is experienced by a highly specific material receptacle - namely ourselves. It is entirely meaningless without the feeling of being alive which is the product of that specific material receptacle and its life history.

If a person's mind (with all memories) were somehow transferred to "another human body with a blank brain" or a computer, I would think that it would experience continual confusion and possibly psychosis, because it is not contained in its own original body, nor does any of its remembered history match its current status.


Perhaps. Well, food for thought anyway.
 
One day it may be possible to download your memories, personality, and feeling of self-awareness into a machine but that would be a copy of you, not you. Death is final. You are annihilated. No more you ever.


Yes, well I have thought of this as I posted before.

Simultaneous downloading and original deletion may be the answer.
 
So how can we say that in this process of replacement, no step is experientially equivalent to making such a copy? You die and instantly a new thing believes it has been you all along.

Because whatever you are, you're not a single neuron, you're the process that is happening across your entire brain, or at least a large subset of it. If that's true then as long as that process is maintained, there's no way that replacing individual neurons can replace you with a copy.

I wouldn't step into a transporter, but I don't see a problem with replacement spread out over time. One interesting thing about this scenario is that it can't lead to multiple copies, which makes sense if the final thing is the original and not a copy.
 
Is it possible we may be able to transfer this sense of "I Am" to another material receptacle? This could be another being or perhaps a computer.


Sure. It won't actually be me, but I see no reason we couldn't make a machine eventually that thinks it's me.
 
Because whatever you are, you're not a single neuron, you're the process that is happening across your entire brain, or at least a large subset of it. If that's true then as long as that process is maintained, there's no way that replacing individual neurons can replace you with a copy.

I wouldn't step into a transporter, but I don't see a problem with replacement spread out over time. One interesting thing about this scenario is that it can't lead to multiple copies, which makes sense if the final thing is the original and not a copy.

So what's the role of time here?

I have the same instinct that replacing some arbitrary small percentage of my physical brain bit by bit seems to preserve that idea of the self as a continuous pattern. In the same way that we replace the cells of our body gradually.

But something about the role of time here seems a bit counterintuitive. Is there a pace of replacement which would break this continuity?

And the role of the units replaced seems arbitrary also. We can both agree that replacing one neuron preserves you, and replacing the whole brain in one go does not. What percentage of the brain can be replaced in a single action to preserve the self, and why that particular amount? It's a Sorites paradox situation. If we set a limit, then we need a good reason to say one neuron more than that is too much. And if we can't say that and there's no good reason to draw a line at one neuron more, then why not one neuron more than that?

When we draw such lines in naming conventions (like how many grains of sand make a "heap") or in laws (exactly what age should be the age of consent), we know we're being at least a little arbitrary for practical reasons. But if the question is "Where is the line that makes you truly yourself in terms of brain replacement?" then the line isn't just a practical arbitrary tool; it's saying something important about identity. Is there a gradient? Starting at X percent of your brain replaced, are you a certain percentage less "you," scaling up as you take larger chunks? That seems counterintuitive, because whether your experience is a continuous pattern should be a binary fact, no? You can't be sorta kinda continuous, or sorta kinda yourself. We don't even have any tools to possibly measure that.

Because, as a limit, the end result is physically identical to just making an artificial copy and plopping it in your skull. I think we mostly agree that doing that would not preserve the continuous pattern of "you". So it's something about this intermediary stage that's supposed to preserve "you".

What if, instead of days or months between operations, the procedure was enacted by some sort of nanobots? Is there some period of time that might serve as a hard line? If your whole brain switches over in the time it takes to snap your fingers, has continuity been preserved?

I can feel the time objection may seem less strong than the size-of-replacement-at-a-time issue, but let me add a wrinkle.

What if those clever nanobots, instead of destroying the old brain cells, carried them away from your body and reassembled them in their original configuration, maybe within a robot body? Now we have your original organic brain and this new artificial brain, both of them with all your memories and feelings, each thinking it's you. Are they both you now? Is the original reassembled organic brain less you because there was a microsecond gap between it being disassembled and put back together? You seemed pretty comfortable saying that the artificial brain replaced part by part was you for any meaningful purpose. Does that mean your real brain is now the "copy" and its recollections and experience are not continuous from yours? Which body are "you" in?

See, replacement doesn't actually free us from the copy issue. In fact, the whole transporter thought experiment comes from the Ship of Theseus which started as a question about replacement part by part and even included the question of what if the original parts were reassembled.
 
So what's the role of time here?
The role of time is that you and I are continuous processes.

I have the same instinct that replacing some arbitrary small percentage of my physical brain bit by bit seems to preserve that idea of the self as a continuous pattern. In the same way that we replace the cells of our body gradually.

But something about the role of time here seems a bit counterintuitive. Is there a pace of replacement which would break this continuity?
Yes, any pace that happens faster than the processes of the brain, such that it interrupts those processes enough to interrupt consciousness. Whatever consciousness is, it's happening across many neurons simultaneously. Interrupting a few of them doesn't stop the process from happening (so one neuron could die and you'd still be you), but replace too many at once, or too quickly, and that process wasn't conscious for that period. You've thus interrupted the continuous nature of consciousness.

And the role of the units replaced seems arbitrary also. We can both agree that replacing one neuron preserves you, and replacing the whole brain in one go does not. What percentage of the brain can be replaced in a single action to preserve the self, and why that particular amount? It's a Sorites paradox situation. If we set a limit, then we need a good reason to say one neuron more than that is too much. And if we can't say that and there's no good reason to draw a line at one neuron more, then why not one neuron more than that?
I think the above addresses that issue as well.

When we draw such lines in naming conventions (like how many grains of sand make a "heap") or in laws (exactly what age should be the age of consent), we know we're being at least a little arbitrary for practical reasons. But if the question is "Where is the line that makes you truly yourself in terms of brain replacement?" then the line isn't just a practical arbitrary tool; it's saying something important about identity. Is there a gradient? Starting at X percent of your brain replaced, are you a certain percentage less "you," scaling up as you take larger chunks? That seems counterintuitive, because whether your experience is a continuous pattern should be a binary fact, no? You can't be sorta kinda continuous, or sorta kinda yourself. We don't even have any tools to possibly measure that.
This is, I think, a somewhat valid concern. But like all things, I think there's a continuum here. When one neuron dies, you change slightly. If you consider the self to be not a single entity but an assemblage of smaller entities, then there can be some small unit of change that doesn't affect much, and some larger unit of change that leaves part of that conscious entity behind but not the whole thing. There is some evidence (see split-brain patients, for instance) that we are made up of multiple consciousnesses.

Because, as a limit, the end result is physically identical to just making an artificial copy and plopping it in your skull. I think we mostly agree that doing that would not preserve the continuous pattern of "you". So it's something about this intermediary stage that's supposed to preserve "you".
Yes, but it may not be a binary difference.

What if, instead of days or months between operations, the procedure was enacted by some sort of nanobots? Is there some period of time that might serve as a hard line? If your whole brain switches over in the time it takes to snap your fingers, has continuity been preserved?
I think, again, it would have to happen at a time scale longer than the time scale of conscious awareness, because that's the time scale over which the whole large scale processes of the brain are interacting. So I would probably feel safe if it happened over the course of hours, but not seconds.

I can feel the time objection may seem less strong than the size-of-replacement-at-a-time issue, but let me add a wrinkle.

What if those clever nanobots, instead of destroying the old brain cells, carried them away from your body and reassembled them in their original configuration, maybe within a robot body? Now we have your original organic brain and this new artificial brain, both of them with all your memories and feelings, each thinking it's you. Are they both you now? Is the original reassembled organic brain less you because there was a microsecond gap between it being disassembled and put back together? You seemed pretty comfortable saying that the artificial brain replaced part by part was you for any meaningful purpose. Does that mean your real brain is now the "copy" and its recollections and experience are not continuous from yours? Which body are "you" in?
I think it's pretty straightforward that the brain reconstructed from the physical pieces of your brain would be a copy whereas the one that maintained the continuous operation of the process of your brain would be the original.

See, replacement doesn't actually free us from the copy issue. In fact, the whole transporter thought experiment comes from the Ship of Theseus which started as a question about replacement part by part and even included the question of what if the original parts were reassembled.
I certainly agree that none of this is obvious. Probably many people will disagree with my viewpoint, but it seems clear to me and consistent with how I view the brain and the nature of the self.
 
Sure. It won't actually be me, but I see no reason we couldn't make a machine eventually that thinks it's me.
This reminds me of those endless arguments we had with Jabba; I'm half expecting him to turn up and suggest you would be looking out of two sets of eyes.
 
I think it's pretty straightforward that the brain reconstructed from the physical pieces of your brain would be a copy whereas the one that maintained the continuous operation of the process of your brain would be the original.

I think that's true if you go all in for a certain idea of continuity, but it leads to some pretty strange places.

Take this example:
Nanobots disassemble your brain, neuron by neuron, instantly replacing each neuron with an electronic nanocomputer. As they do this, they're reassembling the displaced neurons back into a separate working brain. The whole process takes about half a second.

Exactly when the process is done, the new, 100% computerized brain is inserted into a robot body, and the old brain is returned to your original biological body.

From the perspective of the biological brain, you've lived your whole life, you blink, and now there's a robot copy standing next to you. It seems a little absurd to say that the being whose every cell is yours, which experienced your whole life, is now the copy, and the complete robot is the continuous "you".

Or to simplify even further, we can just focus on disrupting continuity. What if we sent those nanobots swarming all over your brain, and instead of taking it apart, they just actively paused all activity for that same half a second, and then allowed it to restart? Is that a total death of the self? Is the person who resumes after that half second, who again thinks they just heard a snap and looked around to see everything is normal, not really "you"?

If disrupting continuity for a brief period of time DOESN'T end our continuous experience, then in that first example it's harder to justify that the robot is you and the biological brain isn't. And if that's the case, we're back to it being hard to justify that a robot can be "you".
 
Yep, I agree that those are implications of this viewpoint. It also makes one worry about things like sleep and anaesthesia...
 
I've used this sequence of hypothetical events in previous discussions of the philosophical teletransporter. At what point are you not still you afterward?

1. You go to sleep and wake back up.

2. You are knocked unconscious by a concussion, then fully recover.

3. You're exposed to a toxin that completely suppresses all your nerve impulses for two hours. You survive due to being on a heart-lung machine at the time, and you recover.

4. You're put into complete stasis for a week, by some combination of deep freezing and additional measures to prevent cell damage due to the freezing. (Or imagine some sort of Star Trek style stasis field, that fixes each individual atom within it in place, without actually stopping time.) Then you are revived.

5. Same as 4, but while you are otherwise in complete stasis, a machine removes each individual atom in your entire body, including your brain, one at a time (albeit very rapidly, in order to finish in time) and reassembles those same atoms with the same molecular bonds in the exact same relative positions, in another stasis chamber ten meters away.

6. Same as 5, but the reassembler doesn't keep your original atoms. It takes an atom of the same element from a stockpile and substitutes it for each of your original atoms, while the disassembler just tosses the old ones into a random heap to be sorted out later.

7. Like 5 and 6 combined. The original atoms are reassembled in one chamber, while a materially identical duplicate made of feedstock atoms is assembled in another one.

7A. When revived, you're told you were reassembled from your original atoms, and the duplicate has been sent off to a slave colony. Thanks, here's your 10,000 credit fee.

7B. When revived, you're told you're the duplicate, and the original has been killed attempting a dangerous mission. Anticipating that likelihood, you were kept in stasis as a back-up. (Don't worry, you don't have to now attempt the mission yourself. Though, if you'd like to volunteer, you can take the precaution of having another duplicate made first...)

7C. When revived, you're told you're the duplicate. You were kept in stasis while the original lived out their entire remaining lifetime. Now that they're deceased, you're the heir. Of course it might be a bit tough catching up on the decades you missed. If you'd like, you can have the memories of that previous lifetime implanted... but that's handled by a different branch of the Philosophy Department.
 
always me.

If there is no functional difference between copy and original, then for all purposes they are all copies.
 
always me.

If there is no functional difference between copy and original, then for all purposes they are all copies.

So if we made a perfect copy of you in a factory in Houston, handed it a briefcase full of a million dollars and incinerated the current you in your sleep, you would have no objection?
 
So if we made a perfect copy of you in a factory in Houston, handed it a briefcase full of a million dollars and incinerated the current you in your sleep, you would have no objection?

Of course I would - they would have just killed my identical twin!
What a waste! Why make a copy if you don't want to have two instead of one?

But the point is that the copy would be me, just like the original.
 
always me.

If there is no functional difference between copy and original, then for all purposes they are all copies.

I see it differently.

A perfect clone of me would not be me. I'm not looking out of two sets of eyes.

Rather, we have two individual human beings, sharing the same state (prior to divergence, at least), but otherwise independent of each other. More like twins - my twin isn't me, even though at one point we shared the same state, and even emerged from the same cluster of cells.

If I show up at my job and get paid, my clone can't do the same. When I eat, the food goes in my belly, not my clone's belly. My clone can't survive by virtue of being identical to me. They still have to go out and get their own job, provide their own food and shelter, make their own way in the world.

And the best part about all of this is that if my clone is a truly faithful reproduction of my mental state, then he would agree that even though we have the same memories, and have the same sense of self-identity, we are actually not the same person, but rather two people with separate burdens of survival to carry, and separate identities in the world.
 
I see it differently.

A perfect clone of me would not be me. I'm not looking out of two sets of eyes.

Rather, we have two individual human beings, sharing the same state (prior to divergence, at least), but otherwise independent of each other. More like twins - my twin isn't me, even though at one point we shared the same state, and even emerged from the same cluster of cells.

If I show up at my job and get paid, my clone can't do the same. When I eat, the food goes in my belly, not my clone's belly. My clone can't survive by virtue of being identical to me. They still have to go out and get their own job, provide their own food and shelter, make their own way in the world.

And the best part about all of this is that if my clone is a truly faithful reproduction of my mental state, then he would agree that even though we have the same memories, and have the same sense of self-identity, we are actually not the same person, but rather two people with separate burdens of survival to carry, and separate identities in the world.

You are almost right: it's not a question of you or the clone; it is a question of one or the other.

Your boss and family wouldn't know the difference; how the environment reacts would be the only possible difference after the duplication.

Thinking that there is something special about the original when there are identical copies smells a lot like talking about souls to me.
 
Of course I would - they would have just killed my identical twin!
What a waste! Why make a copy if you don't want to have two instead of one?

But the point is that the copy would be me, just like the original.

Can we try another one?

Congratulations! You've already been cloned! Since your clone was just created, he has a different legal name and social security number. Otherwise, he is absolutely identical at the time of the copy, with all memories etc.

Because your clone was just created, it has no possessions, no money, nothing.

I'm not sure if you personally have money or debt, but for the sake of the experiment, imagine you have a modest amount of money in the bank. A life savings.

Now, if there is no difference between the you you are experiencing and the cloned you that you have never seen, no meaning to distinguishing between them, then transferring your life savings to your clone would be a lateral move, right? Right now there are two you's, one with money, one without. Switching which instance of you has money makes no difference, so you should have no objection to sending all of your money to the other copy, right?

Imagine, for the sake of the experiment, that you can never meet this copy, share resources or money in other ways, etc.

Do you have zero preference between keeping all of your money or giving it to the other you?
 
