Artificial Intelligence thinks mushroom is a pretzel

In principle it should be possible, I agree. But I don't see it being done with current computer technology, or even by hypothetical vastly more sophisticated versions of current computer technology.

Yes, well, I didn't say it was possible right now. Just that as long as there's no real argument for why it can't be done even in theory, there's also inherently none against computers being able to think.

Would you then be okay with replacing your loved ones with exact android duplicates?

Think positively. If mom needed a power cord, I could just sit out of range to avoid her riding my shiny metal ass ;)

Ok, joke aside, why not? Unless you want to believe in some kind of dualism, where some soul is making all the decisions, everything you say or do is just a function of the configuration of synapses in your brain. A probabilistic function, but still, a function.

If someone took your brain and put it in another body, it would be you in a new body. If someone made a map of your synapse configuration and copied it exactly to someone else's synapses, it would still be you in a new body. If someone took that synapse configuration and put it into a simulated brain on a super-computer, well, it would be you in a computer.

So let's also think of it from the other side. Would I mind it if mom got a mechanical heart instead of the real one? Nope. Not to mention that I'd assume that if it was needed, it's at the very least better than the alternative. Would I mind it if mom got a robot leg or arm? Nope. What about both? Still nope. And so on. At what point would it become a deal breaker then? Why would a whole robot body be way over the line?
 
Ok, joke aside, why not? Unless you want to believe in some kind of dualism, where some soul is making all the decisions, everything you say or do is just a function of the configuration of synapses in your brain. A probabilistic function, but still, a function.

I disagree. A person is more than a sum of their parts.

If someone took your brain and put it in another body, it would be you in a new body.

I agree.

If someone made a map of your synapse configuration and copied it exactly to someone else's synapses, it would still be you in a new body.

No, it wouldn't. A copy is not the original.

If someone took that synapse configuration and put it into a simulated brain on a super-computer, well, it would be you in a computer.

Nope. Still a copy.

So let's also think of it from the other side. Would I mind it if mom got a mechanical heart instead of the real one? Nope. Not to mention that I'd assume that if it was needed, it's at the very least better than the alternative. Would I mind it if mom got a robot leg or arm? Nope. What about both? Still nope. And so on. At what point would it become a deal breaker then? Why would a whole robot body be way over the line?

The body, fine. The brain, nope. You don't have to believe in a soul to believe a person ceases to be when they lose their brain, or that a copy of a brain is not the same individual as the original.

You're basically arguing that if I had a clever enough Xerox machine you'd be okay with me murdering you, so long as another individual existed afterward who was sufficiently similar to you. The copy wouldn't be you. You'd be a separate person, who is dead.
 
You're basically arguing that if I had a clever enough Xerox machine you'd be okay with me murdering you, so long as another individual existed afterward who was sufficiently similar to you. The copy wouldn't be you. You'd be a separate person, who is dead.

No he's not, he's arguing that his wife wouldn't mind you killing him if a completely undetectable copy of him went home that evening.
 
No he's not, he's arguing that his wife wouldn't mind you killing him if a completely undetectable copy of him went home that evening.

Undetectable to whom?

I'll concede that a copy with the same memories, and no idea that it's a copy, is probably the same person, or close enough.

A copy that knows its true provenance, though, probably isn't.
 
Undetectable to whom?

I'll concede that a copy with the same memories, and no idea that it's a copy, is probably the same person, or close enough.

A copy that knows its true provenance, though, probably isn't.

I submit that a copy who replies to the question 'Are you the real Guybrush Threepwood?' with 'No, I'm a copy' fails to fulfil the condition of being undetectable by quite a long way. :)
 
In principle it should be possible, I agree. But I don't see it being done with current computer technology, or even by hypothetical vastly more sophisticated versions of current computer technology.
If you can do one, you can do a hundred; then a million, a billion, and eventually 100 billion.
The Blue Brain project is currently simulating 30 000 neurons and 8 million connections (down to the molecular level!!!) to understand the principles of exactly how neurons and the brain function.*
Once those principles are understood, it would be much simpler to model large numbers of neurons with all the needed connections and functions.
We have also only recently started designing and building hardware specifically to emulate neural networks. Probably lots of room for improvement there.

Blue Brain already has a model to simulate the complete mouse cortex and is ready to start running EEG experiments on it, but the model is too 'heavy' for the supercomputers they use.
Modelling a human brain might not be as far in the future as some think.
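
(For the curious: this is nothing like the actual Blue Brain code, which resolves molecular detail and runs on dedicated supercomputers, but a toy point-neuron simulation is small enough to post. Here's a leaky integrate-and-fire sketch in Python/NumPy using the same neuron and synapse counts as their mouse-column model; every parameter value is invented purely for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 30_000       # same order as the Blue Brain mouse-column model
n_synapses = 8_000_000   # ditto for connections
dt, tau = 0.1, 20.0      # time step and membrane time constant (ms)
v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0  # potentials (mV)

# Random sparse connectivity: (pre, post) index pairs plus a weight each.
pre = rng.integers(0, n_neurons, n_synapses)
post = rng.integers(0, n_neurons, n_synapses)
w = rng.normal(0.0, 0.5, n_synapses)

v = np.full(n_neurons, v_rest)            # membrane potentials
i_ext = rng.uniform(0.0, 1.5, n_neurons)  # constant external drive

for step in range(100):                   # 10 ms of simulated time
    spiked = v >= v_thresh
    v[spiked] = v_reset
    # Deliver input from every synapse whose presynaptic neuron fired.
    i_syn = np.zeros(n_neurons)
    active = spiked[pre]
    np.add.at(i_syn, post[active], w[active])
    # Leak toward rest, then integrate synaptic and external input.
    v += dt / tau * (v_rest - v) + i_syn + i_ext * dt
```

The point is that each time step is just a few array operations over the neurons and synapses, so scaling from 30 000 neurons upward is 'only' a question of memory and compute, which is exactly the 'if you can do one you can do a hundred' argument.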

*If you are interested, I think it's really cool:

In all mammals the arrangement of neurons in the cortex is very similar. It seems the neuron is not the functional element of data processing in the brain, but rather something called a cortical column.
A CC consists of six layers and contains multiple different types of neurons that are functionally and structurally distinct.
A mouse CC contains about 30 000 neurons; a human one about 100 000.
A mouse CC has now been modeled down to the molecular level and the human one is next.
Once we understand how all the different neurons in a human CC function together and how a CC works as a whole, it would be possible to model a CC at the neural rather than the molecular level. This would immediately increase the size of brain we could model accurately by orders of magnitude.
Or, even better, it might be possible to model a brain at the CC rather than the neuron level, which would be more efficient still. Humans only have around one to two million cortical columns.
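
To put rough numbers on that "orders of magnitude" claim: the neuron and column counts below come from this post, but the state-variables-per-unit figures are placeholders I invented just to illustrate the shape of the saving, nothing more.

```python
# Back-of-envelope model sizes at three levels of abstraction.
neurons_total = 100e9    # ~100 billion neurons in a human brain
cc_total = 2e6           # upper end of the 1-2 million column estimate

# Hypothetical state variables per simulated unit (invented numbers):
per_neuron_molecular = 1e6  # channels, proteins, etc.
per_neuron_point = 10       # a point model like leaky integrate-and-fire
per_cc = 1_000              # one aggregate unit per cortical column

levels = {
    "molecular, whole brain": neurons_total * per_neuron_molecular,
    "point neurons, whole brain": neurons_total * per_neuron_point,
    "cortical columns, whole brain": cc_total * per_cc,
}
for name, size in levels.items():
    print(f"{name}: ~{size:.0e} state variables")
```

Whatever the real per-unit numbers turn out to be, each step up in abstraction cuts the model size by several orders of magnitude, which is why getting the CC-level description right matters so much.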
 
Right. So we're now down to whether there's a difference between the original and an _identical_ copy. Fine by me, actually.

Well, we're back to where you draw the line. Let's start simple. Are you the same person as you were 30 years ago? Are any of your relatives / friends / loved ones / whatever the same person they were 30 years ago?

Well, according to Stanford University, it takes about 10 years for almost every cell in your body to have died and been replaced with a new one. Some parts are refreshed much faster, but 10 years is what it takes to be reasonably sure that there's buggerall left of the original. So after 30 years, not only ARE you a copy, you're actually a copy of a copy of a copy of the original person. Ditto for your wife or whatever.

Does that qualify as a different person?

If no, then what difference does it make? I mean, what functional difference anyway?
 
Right. So we're now down to whether there's a difference between the original and an _identical_ copy. Fine by me, actually.

Well, we're back to where you draw the line. Let's start simple. Are you the same person as you were 30 years ago? Are any of your relatives / friends / loved ones / whatever the same person they were 30 years ago?

Well, according to Stanford University, it takes about 10 years for almost every cell in your body to have died and been replaced with a new one. Some parts are refreshed much faster, but 10 years is what it takes to be reasonably sure that there's buggerall left of the original. So after 30 years, not only ARE you a copy, you're actually a copy of a copy of a copy of the original person. Ditto for your wife or whatever.

Does that qualify as a different person?

If no, then what difference does it make? I mean, what functional difference anyway?

What is the being you consider to be "you"? Is your concept of identity so relaxed you don't believe in your own existence? Somebody else sufficiently like you is you? That's not just bad science and worse philosophy, that's literally insane. People who don't recognize themselves as distinct entities are crazy.
 
Star Trek Transporter Paradox.

That is all.

Even Tom Riker didn't imagine he was a commander just because Will was. They didn't see through each other's eyes, or pay each other's credit card bills, or even date the same person. They knew they were two separate people, whatever their origins. And if one killed the other he'd be a murderer.

Ironic, seeing how Will did kill a clone of himself in an earlier episode, but somehow that wasn't a problem! (Nobody minded when Pulaski killed her clone.)
 
What is the being you consider to be "you"? Is your concept of identity so relaxed you don't believe in your own existence? Somebody else sufficiently like you is you? That's not just bad science and worse philosophy, that's literally insane. People who don't recognize themselves as distinct entities are crazy.

Considering that the scenario you have been proposing is a literal case of Capgras Syndrome, I would advise against going down the route of decreeing who's crazy if they disagree with you :p

Even Tom Riker didn't imagine he was a commander just because Will was. They didn't see through each other's eyes, or pay each other's credit card bills, or even date the same person. They knew they were two separate people, whatever their origins. And if one killed the other he'd be a murderer.

The point you're missing is that at the moment of cloning both thought that they ARE Will Riker. Both had the same identity.

And that if the one whose beam had made it to the Enterprise had been killed in a transporter accident (one of the most common hazards in Starfleet :p) -- say, his beam decohered or whatever technobabble, and they couldn't materialize him -- and they had found the other one still on the planet, everyone would have had no problem thinking he IS Will Riker. They wouldn't even think, "oh, we still have a copy on the planet." They would have just thought that the transporter didn't work at all, and Will is still on the planet.

Later one of them took another name to avoid confusion, but that's no different than if I started to call myself Max instead of Hans to avoid confusion with another Hans at the office.
 
Finally, you don't seem to understand what he meant by the transporter paradox in the first place.

The problem is that once you've gone through the transporter, you ARE a copy. Even if Will hadn't split in that particular incident, he would have just been copied once more, on top of the hundreds of times it had already happened, and the original destroyed.

But nobody, not even the Rikers themselves, thinks "oh, that's totally not Riker" when they first use a transporter.
 
Even Tom Riker didn't imagine he was a commander just because Will was. They didn't see through each other's eyes, or pay each other's credit card bills, or even date the same person. They knew they were two separate people, whatever their origins. And if one killed the other he'd be a murderer.

Ironic, seeing how Will did kill a clone of himself in an earlier episode, but somehow that wasn't a problem! (Nobody minded when Pulaski killed her clone.)

I wasn't familiar with that episode, so I looked it up.

Memory Alpha said:

The transporter must have driven them crazy!;)
 
Finally, you don't seem to understand what he meant by the transporter paradox in the first place.

The problem is that once you've gone through the transporter, you ARE a copy. Even if Will hadn't split in that particular incident, he would have just been copied once more, on top of the hundreds of times it had already happened, and the original destroyed.

But nobody, not even the Rikers themselves, thinks "oh, that's totally not Riker" when they first use a transporter.

But neither Riker ever thinks they are one person. They agree, as does everyone else, that there are now two separate Rikers. That's my point here: being identical in every way does not make two beings one.
 
But neither Riker ever thinks they are one person. They agree, as does everyone else, that there are now two separate Rikers. That's my point here: being identical in every way does not make two beings one.

Yes, we all agree on that, it was never in dispute as far as I'm aware. The point is, at the moment of the transporter malfunction there are two real Rikers (or two fake ones depending on your point of view) and it doesn't matter which one you kill, you are left with the real Will Riker.
 
But neither Riker ever thinks they are one person. They agree, as does everyone else, that there are now two separate Rikers. That's my point here: being identical in every way does not make two beings one.

I don't think anyone claimed that two beings are one.

But you still don't fully realize the implication there. BOTH are equally copies of the original Riker that was transported. (Who in turn was a copy of a copy of a copy of a copy.) Yet nobody has any problem treating one of them as the "real" Riker.
 
Yes, we all agree on that, it was never in dispute as far as I'm aware. The point is, at the moment of the transporter malfunction there are two real Rikers (or two fake ones depending on your point of view) and it doesn't matter which one you kill, you are left with the real Will Riker.

Except from the POV of the Riker you just killed. Once you concede they are two separate beings you have granted them both personhood, and it matters very much to a person whether it exists or not. One being's similarity to another doesn't render it a nonbeing.
 
