
Artificial Intelligence thinks mushroom is a pretzel

I don't think anyone claimed that two beings are one.

But you still don't fully realize the implication there. BOTH are equally copies of the original Riker that was transported. (Who in turn was a copy of a copy of a copy of a copy.) Yet nobody has any problem treating one of them as the "real" Riker.

The other Riker isn't UNREAL. That's the point. Killing the copy is still murder even though you have "a" Riker left afterward.
 
Except from the POV of the Riker you just killed. Once you concede they are two separate beings, you have granted them both personhood, and it matters very much to a person whether it exists or not. One being's similarity to another doesn't render it a nonbeing.

Of course, but if you don't kill one of them (or leave him stranded on a planet for 8 years to give the script a half-arsed reason to regard them as different), then how do you divide up their stuff, and who gets to be Picard's Number One?
 
Of course, but if you don't kill one of them (or leave him stranded on a planet for 8 years to give the script a half-arsed reason to regard them as different), then how do you divide up their stuff, and who gets to be Picard's Number One?

That wasn't an issue, even for the Rikers themselves, as the incident that created a duplicate happened long before one of them wound up on the Enterprise. Tom only had claim to possessions and entitlements dating from before the split. He had never even met Picard before.
 
The other Riker isn't UNREAL. That's the point. Killing the copy is still murder even though you have "a" Riker left afterward.

Hmm? I don't think I was saying he's unreal.

And honestly, you seem to be the only one obsessed with killing clones at this point. Is there something you're trying to tell us? :p
 
Hmm? I don't think I was saying he's unreal.

And honestly, you seem to be the only one obsessed with killing clones at this point. Is there something you're trying to tell us? :p

If I had a duplicate we'd get along very well. So well it would disturb other people very much.
 
If I had a duplicate we'd get along very well. So well it would disturb other people very much.


:jaw-dropp



What a disturbing coincidence. Mrs Cheetah said almost the same thing about a week ago. She said that if TragicMonkey had a duplicate they'd get along very well. So well it would disturb other people very much.
 
If I had a duplicate we'd get along very well. So well it would disturb other people very much.

I think most of us would. After all, you have the same interests, topics, etc., as yourself. Humour probably wouldn't work well, though, considering you already know all the jokes you'd tell yourself :p

Regardless, I think the topic of whether you'd kill a clone is entirely orthogonal to the topic of whether having the same synapse configuration means the same identity.
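For what it's worth, programming has a ready-made version of that distinction: equality of configuration versus identity of object. A loose analogy only (the Python below and its data are invented for illustration), but it shows the two questions coming apart:

```python
# Two "beings" with exactly the same configuration...
original = {"synapses": [0.1, 0.9, 0.4], "memories": ["enterprise"]}
duplicate = {"synapses": [0.1, 0.9, 0.4], "memories": ["enterprise"]}

print(original == duplicate)  # True:  identical configuration
print(original is duplicate)  # False: still two distinct objects

# ...and once they exist separately, they diverge:
duplicate["memories"].append("nervala iv")  # eight years on a planet
print(original == duplicate)  # False
```

Same synapse configuration answers the `==` question; it says nothing about the `is` question.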
 
I think most of us would. After all, you have the same interests, topics, etc., as yourself. Humour probably wouldn't work well, though, considering you already know all the jokes you'd tell yourself :p

Regardless, I think the topic of whether you'd kill a clone is entirely orthogonal to the topic of whether having the same synapse configuration means the same identity.

I'd bang my clone. I thought that was obvious. I'm super hot.

And not really: the personhood of a duplicate proves its separateness of being. Unless you're dealing with a Trinity situation where you have one being "in three persons" or whatever insanity people have been arguing about for millennia now. Don't send Jesus through the transporter; you'll end up with between one and six beings in two bodies.
 
Right. So we're now down to whether there's a difference between the original and an _identical_ copy. Fine by me, actually.

Well, we're back to where you draw the line. Let's start simple. Are you the same person as you were 30 years ago? Are any of your relatives / friends / loved ones / whatever the same person they were 30 years ago?

Well, according to Stanford University, it takes about 10 years for almost every cell in your body to have died and been replaced with a new one. Some parts are refreshed much faster, but 10 years is what it takes to be reasonably sure that there's buggerall left of the original. So after 30 years, not only ARE you a copy, you're actually a copy of a copy of a copy of the original person. Ditto for your wife or whatever.

Does that qualify as a different person?

If no, then what difference does it make? I mean, what functional difference anyway?
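As a rough sanity check on that turnover figure, here's a minimal sketch (in Python; it assumes a uniform exponential replacement rate, which real tissues don't have, and reads the ~10-year figure as "about 1% of the original cells left", both assumptions mine, not Stanford's):

```python
import math

# Toy model: cells are replaced independently at a constant rate, so the
# surviving fraction of original cells decays exponentially. The rate is
# calibrated so that ~1% of the original cells remain at 10 years (an
# illustrative assumption, not measured biology).
RATE = math.log(100) / 10  # per-year turnover constant

def original_fraction(years: float) -> float:
    """Fraction of the original cells still present after `years`."""
    return math.exp(-RATE * years)

for t in (10, 20, 30):
    print(f"after {t} years: {original_fraction(t):.2e} of the original remains")
```

Under those assumptions, 30 years leaves about one cell in a million from the original body, which is the "copy of a copy of a copy" point in numbers.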
Your brain cells are not replaced if they die or are damaged.
Since the thread is about AI brains etc., then yes, you are the same person, as your brain cells have not been replaced at any point in your life.
 
Your brain cells are not replaced if they die or are damaged.
Since the thread is about AI brains etc., then yes, you are the same person, as your brain cells have not been replaced at any point in your life.

The cells may not be replaced with new cells, but the molecules that make up the cells must be?
 
Unless you have a holodeck, a simulation is still just a simulation. A simulated intelligence would simulate thought, but not actually think.

I'm still curious about this. What is a simulated thought? It seems to me that thought is simulation. A simulation of a simulation is still just a simulation.
 
Are your brain cells real, or simulated?


Your thoughts are the information propagating across the network of your brain cells. Whether that network is made of meat neurons or simulated neurons or electronic neurons does not affect the propagation of information across the network. All that is needed is that the network be functionally equivalent. On what it runs is of no consequence.
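A toy sketch of that functional-equivalence claim (the network and weights below are invented for illustration): the same threshold-neuron wiring computes XOR whether the arithmetic underneath is floating-point or integer. Change the substrate, keep the wiring, and the input-output behaviour is identical.

```python
def step(total, threshold):
    """A McCulloch-Pitts style neuron: fire iff input reaches threshold."""
    return 1 if total >= threshold else 0

def xor_net_float(x1, x2):
    """Substrate A: floating-point arithmetic."""
    h_and = step(0.6 * x1 + 0.6 * x2, 1.0)  # fires only on (1, 1)
    h_or = step(0.6 * x1 + 0.6 * x2, 0.5)   # fires if either input is set
    return step(1.0 * h_or - 1.0 * h_and, 0.5)

def xor_net_fixed(x1, x2):
    """Substrate B: the same wiring, scaled to integer arithmetic."""
    h_and = step(6 * x1 + 6 * x2, 10)
    h_or = step(6 * x1 + 6 * x2, 5)
    return step(10 * h_or - 10 * h_and, 5)

for x1 in (0, 1):
    for x2 in (0, 1):
        assert xor_net_float(x1, x2) == xor_net_fixed(x1, x2)
        print(f"({x1}, {x2}) -> {xor_net_float(x1, x2)}")
```

Nothing about the network's behaviour depends on which arithmetic carries the signals; only the connectivity and thresholds matter.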
 
Your thoughts are the information propagating across the network of your brain cells. Whether that network is made of meat neurons or simulated neurons or electronic neurons does not affect the propagation of information across the network. All that is needed is that the network be functionally equivalent. On what it runs is of no consequence.

This.
 
Your thoughts are the information propagating across the network of your brain cells. Whether that network is made of meat neurons or simulated neurons or electronic neurons does not affect the propagation of information across the network. All that is needed is that the network be functionally equivalent. On what it runs is of no consequence.

In which case you could prove it by constructing a thinking non-brain. Do you believe this has been done?
 
