• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Charles Stross on the Singularity.....

You answer your question within your question! :) Two people can't share the same position.
Exactly. And just from being in different locations with different perceptions of the universe, even if they're tiny, they'll develop differently.
Actually that reminds me of the 'tachyon duplicate' scene from Spock Must Die!.
The characters spend far too much of that book musing over souls too.
ETA:
"With two Spocks on this ship, I must say, there ought to be no logical problem we can't lick."

"Unless," Spock One said, "we think exactly alike, in which case the replicate is simply a superfluity."

"Quite obviously you don't think exactly alike," Kirk said, "or both of you would have offered that remark simultaneously and in the same words."

"True but not relevant, Captain, if I may so observe," said Spock Two. "Even if we thought exactly alike at the moment of creation of the replicate, from then on our experiences differ slightly -- beginning, of course, with the simple difference that we occupy different positions in space-time. This will create a divergence in our thinking which will inevitably widen as time goes on."

"The difference, however, may remain trivial for some significant time to come," said Spock One.

"We are already disagreeing, are we not?" Spock Two said coldly. "That is already a nontrivial difference."
 
I am more interested in the meta analysis of Stross's article.

So much of what people call science fiction is really just fiction with lasers. SF authors assume the world and the motivations and fashions of the people inside it will remain pretty much as they are in the present, but with whatever SFey stuff they want to include tacked on. Thus we have AIs which are pretty much just smart humans, and moon bubbles surrounding faux-wood houses with white picket fences.

Imagine the shock one must feel when the realization sets in that the future will fundamentally change. That, y'know what, we really don't know how AIs will act. That there's almost no way to predict all the ripples of disruptive technology, or which will turn out to be tidal waves.

My favorite SF authors are those who know this, and give it a shot anyway. HG Wells was probably the best at predicting the future and coming up with something fairly believable. His "Anticipations" could have been used as a model for tech development and economic policy in the twentieth century. It's more than a century old, but it only reads like it was half a century old!

This same selective blindness is often present in people who call themselves "futurists." It generally means someone who won't shut the hell up about some niche idea or other that he swears will be totally revolutionary, and who uses meaningless buzzwords like they're paying him by the syllable.
 
Does anybody else read the OP as "Charlie SHEEN on the Singularity..."?

Single Malt he knows, Singularity, not so much. ;)
 
I never saw any sense in the singularity concept.
As I first encountered it, there seemed to be a semi-mystical notion of explaining the apparent absence of "post-technological civilisations". I can think of many far more likely explanations for that.

As for the notion of convergence of technology and biology, well, artificial legs are improving, but we seem to have a long way to go before we upload (download?) ourselves into the cloud. But I expect it will be possible in Windows 8, so long as we have a touch screen...and a robot finger to touch it.
 
Somebody believing in a soul-like entity would assume duplication is not possible, or that both bodies share the same soul. Where the heck do you see that in what is said?

The "duplication not possible"-part, here:

(...) all we have evidence for is that our "self", our consciousness, is the emergent property of OUR brain function. Duplicating those functions into a black box does not make us "move" magically there; it just makes a copy which can then evolve/think separately from the original.

I read that as you claiming there is a something ("us") that cannot be duplicated or "magically moved". Like, we have a perfect copy of the brain and its consciousness, but something is missing...

There is no soul, and the above does not add a soul entity. On the contrary, the people stating that both duplicated persons are the same are the ones supposing there is an entity travelling from one to the other.

I'm not suggesting they are the same, rather that they are... interchangeable? Again, you are the one who brings up an "entity" that is somehow resistant to duplication. If it's a perfect copy, why wouldn't it include the entity too?

For those supposing the brain is the site of the self, it is VERY obvious that both persons are similar but not the same, and indeed start diverging immediately after you remove your "pause". Nanoseconds afterward their isotopic composition and other physical processes may already diverge; minutes afterward the cell repair and apoptosis activity will almost certainly diverge, and the neuron mapping, function, and state will diverge.

I agree with all this. But back to my two identical copies and what makes one unique - the one you actually "continue in". If continuity is the key, what makes my instant duplication process not continuous enough? I ask because I'm genuinely interested in understanding what you mean.

You answer your question within your question! :) Two people can't share the same position.

:) To my defense, I fell out of bed last night and woke up on the floor, so I'm not quite myself today.

And if position really matters, I'll simply recalibrate my duplicator so that it in the process also swaps the positions of the original and the copy.
 
[edit] ffffffffffffffffffffff, wrong sort of singularity
 
In my opinion, the Singularity is not when wonderful things happen; it's when we can no longer grasp what is happening, and we get chaos.
I don't know that we'll get chaos (that's part of the point, we don't KNOW what we'll get), but it seems some people can't grasp what's happening now.

And in a real way, that's the point of much modern technology - you don't need to learn FORTRAN programming to use an iPod.
Yeah, that's much closer to the original meaning. The fundamental concept of the Singularity is that you can't predict the outcome. It's a technological (and by extension, social, economic, and political) discontinuity.

Stross's argument doesn't appear to be that the Singularity can't happen, but rather that we're not going to do it because of the risks involved.

This seems a weak argument to me.
Yeah, that's like saying we're not going to build the nuclear bomb because of the risks involved.
I never understood the cult of the singularity. I mean, even if mind uploading happens, you do not really "upload" your mind; you just put a copy of its memory, interactions, and personality in a computer system, so that virtually there would be no way to differentiate you from it. Just like the teleportation paradox, it suffers from the fact that you are just creating a copy, and if the original dies, "you" died. The copy, as a separate entity, might live and go on eternally as long as the electricity is paid, but the original Aepervius the human would have died.

Therefore I would not see that as immortality, more as a way to produce an immortal identical twin offspring from me.

Mildly interesting, but not that much.
As others imply, it looks like you believe in an uncopyable soul. A lot of this ground was covered philosophically in (surely among many other places) the book "The Mind's I."

If all your friends and associates interact with this offspring, see that it has all your memories, personality, talents, quirks and stuff, and they think it is you, it says it feels like you, and it CLAIMS to be you. In what way is it not you?
And "the singularity" isn't simply about mind uploading.
Uploading isn't even a necessary component of it. I'd say it's more of a predicted byproduct.
...
He also seems to be suggesting that the only way to develop AI that's capable of improving itself is to develop AI that works the same way as humans. That's just silly too: we build airplanes that fly better than birds without building artificial birds.
It's not like we're not building artificial birds either. Google:
Hummingbird NAV Flies Successfully

(getting around the "no URLs under 15 posts" thing)
To me the idea of creating an AI that is the same as a human seems a tad redundant, since we already have 6 billion (and counting) of those! Also, the environment that an AI is "born" into is not the same as the one nearly every human experiences, so again I can't see how any AI we develop will be human in the same way we are.
It may well NOT be "human in the same way we are." That doesn't mean it won't be able to do many of the things humans do. Deep Blue didn't learn chess the way humans learn it. Watson didn't learn to play Jeopardy the way Ken Jennings and Brad Rutter did - though after seeing the Nova episode on Watson, it looks as much like a really neat trick as a real advance.

Even having seen the Nova episode, which effectively gives a peek under Watson's hood, I find the Watson performance compelling. But then I've been a bit of a "believer" in the transhumanism movement for about ten years.
 
In my opinion, the Singularity is not when wonderful things happen; it's when we can no longer grasp what is happening, and we get chaos.

Who do you mean by "we", kemosabe?

But seriously. We don't get chaos. We get whatever we get. Maybe we get machines of loving grace. Maybe we get pure order, too beautiful and cruel for human expression to describe. Maybe we get a transhuman spacefaring civilization that makes the makers of the Internet look like pyramid builders, and makes pyramid builders look like gods. Maybe we get God Itself, in all Its Final Fantasy N+1 glory. Maybe we get the Unknown.

But unknown != chaos.
 
I read that as you claiming there is a something ("us") that cannot be duplicated or "magically moved". Like, we have a perfect copy of the brain and its consciousness, but something is missing...



The problem here is, if we had a "perfect copy of the brain and its consciousness", it's not that the second version would be missing "something", it's that we'd have two "somethings" (assuming that "something" actually means anything). From the perspective of the "something" that's stuck in the meat version, he still grows old and dies.

That the other version goes on to live an immortal life in some electronic version doesn't necessarily make the meat version feel better about that.


It's a bit like an even more personal version of this:

 
I never understood the cult of the singularity. I mean, even if mind uploading happens, you do not really "upload" your mind; you just put a copy of its memory, interactions, and personality in a computer system, so that virtually there would be no way to differentiate you from it. Just like the teleportation paradox, it suffers from the fact that you are just creating a copy, and if the original dies, "you" died. The copy, as a separate entity, might live and go on eternally as long as the electricity is paid, but the original Aepervius the human would have died.

Therefore I would not see that as immortality, more as a way to produce an immortal identical twin offspring from me.

Mildly interesting, but not that much.

I was re-reading your post, and it struck me how much what you describe is how many religious folks view their afterlife self today. Given that many people already believe they will continue in such a way with no evidence to back that up, I don't think it will take much of a change, once there actually is evidence, for us to consider this to be personal immortality. I could even see us enacting laws that make sure the meat version is destroyed after the upload is completed, to keep everything nice and neat!
 
Once your brain is in the bottle, they will be feeding you Matrix-like pap anyway.
You won't know the difference.
Meanwhile, we real-world post-sublimation survivors will be spending your money.
It's like the rapture, but those left behind have all the fun.
 
I don't know that we'll get chaos (that's part of the point, we don't KNOW what we'll get), but it seems some people can't grasp what's happening now.

And in a real way, that's the point of much modern technology - you don't need to learn FORTRAN programming to use an iPod.

Yeah, that's like saying we're not going to build the nuclear bomb because of the risks involved.

As others imply, it looks like you believe in an uncopyable soul. A lot of this ground was covered philosophically in (surely among many other places) the book "The Mind's I."

If all your friends and associates interact with this offspring, see that it has all your memories, personality, talents, quirks and stuff, and they think it is you, it says it feels like you, and it CLAIMS to be you. In what way is it not you?

Uploading isn't even a necessary component of it. I'd say it's more of a predicted byproduct.

It's not like we're not building artificial birds either. Google:
Hummingbird NAV Flies Successfully

(getting around the "no URLs under 15 posts" thing)

It may well NOT be "human in the same way we are." That doesn't mean it won't be able to do many of the things humans do. Deep Blue didn't learn chess the way humans learn it. Watson didn't learn to play Jeopardy the way Ken Jennings and Brad Rutter did - though after seeing the Nova episode on Watson, it looks as much like a really neat trick as a real advance.

Even having seen the Nova episode, which effectively gives a peek under Watson's hood, I find the Watson performance compelling. But then I've been a bit of a "believer" in the transhumanism movement for about ten years.

If you uploaded your brain into a computer, you have created an identical twin. Do you think one of a pair of identical twins wouldn't mind dying, since he will live on in his twin?
 
The problem here is, if we had a "perfect copy of the brain and its consciousness", it's not that the second version would be missing "something", it's that we'd have two "somethings" (assuming that "something" actually means anything).

I agree with all this. There are two somethings, their only difference being that one sits in the same position as it did before, and the other one doesn't. But I fail to see the connection between that and our notion of "self" and continuity, or any reason why the copy is not as valid as the original.

Concerning the "something", I would assume it is the current state of the brain. What do you assume it is?

From the perspective of the "something" that's stuck in the meat version, he still grows old and dies.

This is the part I don't understand. I see a statement that something is stuck in the meat version, but no basis for that claim. In my perfect copy, I would assume I also have a perfect copy of that "something" (whatever it is) since it's a perfect copy. And then I can claim that nothing is stuck in the meat version.

That the other version goes on to live an immortal life in some electronic version doesn't necessarily make the meat version feel better about that.

Agreed, and that's why common sense tells me never to set foot in a teleporter. But when I try to get to the root of the problem I keep failing.
 
This is the part I don't understand. I see a statement that something is stuck in the meat version, but no basis for that claim. In my perfect copy, I would assume I also have a perfect copy of that "something" (whatever it is) since it's a perfect copy. And then I can claim that nothing is stuck in the meat version.


There isn't anything "stuck" in the meat version in the sense that it is missing from the copy; it is just that they are not the same entity.

Say that someone has already made a perfect copy of you without your knowledge. You are unaware that there is a copy. You are still you. There is now a second "you" (which may also believe it is "you"), but that doesn't mean it is actually you. Having two identical entities does not make them the same entity.
 
I agree with all this. There are two somethings, their only difference being that one sits in the same position as it did before, and the other one doesn't. But I fail to see the connection between that and our notion of "self" and continuity, or any reason why the copy is not as valid as the original.

...snip...

Not too sure what you mean by "as valid"? If you duplicated me, there would now be two people, DaratA and DaratB. But I would still be DaratA, and there would be someone else who is DaratB and would say that they are DaratB. We may share the exact same memories and so on, but to me, DaratA is not interchangeable with DaratB, and vice versa, since we are different people. If DaratA was killed after the duplication and DaratB continued to live the "Darat" life, then there would be no difference that folks, or indeed DaratB himself, could detect. But to DaratA there is a bloody big difference; granted, I also couldn't detect it, but that is because I would be dead!
 
Not too sure what you mean by "as valid"? If you duplicated me, there would now be two people, DaratA and DaratB. But I would still be DaratA, and there would be someone else who is DaratB and would say that they are DaratB. We may share the exact same memories and so on, but to me, DaratA is not interchangeable with DaratB, and vice versa, since we are different people. If DaratA was killed after the duplication and DaratB continued to live the "Darat" life, then there would be no difference that folks, or indeed DaratB himself, could detect. But to DaratA there is a bloody big difference; granted, I also couldn't detect it, but that is because I would be dead!

I think he is suggesting that "you" might be DaratB and not DaratA.
 
I think he is suggesting that "you" might be DaratB and not DaratA.

In that case I think I would quite easily accept that "I" was continuing, but I'm sure (if anyone should know, it would be me ;) ) DaratA would be quite narked if I tried to use his credit cards to pay for something!
 
There is now a second "you" (which may also believe it is "you"), but that doesn't mean it is actually you. Having two identical entities does not make them the same entity.



Here's the fundamental issue: it's not that the copy won't think he's me - it's that the original will still think that as well, right up until he dies.

Regardless of what it is that you think "I" am (a soul/spirit, an emergent property of a biological system, or just some arrangement of matter or information), a "perfect" copy will have the same thing (else, it's not perfect). But the fact that there's a second "I" out there doesn't eliminate the first "I".

Now, there may be good reasons for me to do this anyway, but the current "me" is still going to get old and die regardless.
 
If you uploaded your brain into a computer, you have created an identical twin. Do you think one of a pair of identical twins wouldn't mind dying, since he will live on in his twin?
The "meat" original me will still fasten my seat belt, attempt the CRON diet, and look forward to "escape velocity", when the average lifespan increases by a year or more every year, and stuff. That there's another "me" in a machine that may well outlive me, well, I have no problem with that.
 
The notion of The Singularity is nonsense, and the crowd of hangers-on, who lust for some form of immortality with all the intellectual integrity of a Fundie, aren't worth spending 64 bits on.
 
