
Charles Stross on the Singularity.....

catsmate

.....and why he doesn't believe in it. Also the theological implications of 'uploading' and why he isn't a libertarian. Here.
Very active comments section.
 

I quote for emphasis from his notes:

I'm going to take it as read that you've read Vernor Vinge's essay on the coming technological singularity (1993), are familiar with Hans Moravec's concept of mind uploading, and know about Nick Bostrom's Simulation argument. If not, stop right now and read them before you continue with this piece. Otherwise you're missing out on the fertilizer in which the whole field of singularitarian SF, not to mention posthuman thought, is rooted. It's probably a good idea to also be familiar with Extropianism and to have read the posthumanism FAQ, because if you haven't you'll have missed out on the salient social point that posthumanism has a posse.

And where are the Matrioshka brains?

In my opinion, the Singularity is not when wonderful things happen; it's when we can no longer grasp what is happening, and we get chaos.
The future in which we cannot comprehend what is happening is a subset of all the time periods in which we cannot grasp what is happening, which of course includes the past, the present, and the future.

We simply cope by ignoring those aspects of reality.

So there isn't anything to "worry about" regarding a technological singularity.
 
In my opinion, the Singularity is not when wonderful things happen; it's when we can no longer grasp what is happening, and we get chaos.
Yeah, that's much closer to the original meaning. The fundamental concept of the Singularity is that you can't predict the outcome. It's a technological (and by extension, social, economic, and political) discontinuity.

Stross's argument doesn't appear to be that the Singularity can't happen, but rather that we're not going to do it because of the risks involved.

This seems a weak argument to me.
 
I never understood the cult of the Singularity. I mean, even if mind uploading happens, you do not really "upload" your mind; you just put a copy of its memories, interactions, and personality into a computer system, so that there would be virtually no way to differentiate you from it. Just like the teleportation paradox, it suffers from the fact that you are only creating a copy, and if the original dies, "you" died. The copy, as a separate entity, might live on eternally as long as the electricity is paid, but the original Aepervius the human would have died.

Therefore I would not see that as immortality, more as a way to produce an immortal identical-twin offspring of me.

Mildly interesting, but not that much.
 
I never understood the cult of the Singularity. I mean, even if mind uploading happens, you do not really "upload" your mind; you just put a copy of its memories, interactions, and personality into a computer system, so that there would be virtually no way to differentiate you from it. Just like the teleportation paradox, it suffers from the fact that you are only creating a copy, and if the original dies, "you" died. The copy, as a separate entity, might live on eternally as long as the electricity is paid, but the original Aepervius the human would have died.
But in a similar way the you who existed ten years ago is also dead. I don't know that I see anything particularly appealing about "mind uploading", but I can see ways in which it could be made to at least seem to work, particularly if done one step at a time.

And "the singularity" isn't simply about mind uploading.
 
This part seems silly to me:
First: super-intelligent AI is unlikely because, if you pursue Vernor's program, you get there incrementally by way of human-equivalent AI, and human-equivalent AI is unlikely. The reason it's unlikely is that human intelligence is an emergent phenomenon of human physiology, and it only survived the filtering effect of evolution by enhancing human survival fitness in some way. Enhancements to primate evolutionary fitness are not much use to a machine, or to people who want to extract useful payback (in the shape of work) from a machine they spent lots of time and effort developing. We may want machines that can recognize and respond to our motivations and needs, but we're likely to leave out the annoying bits, like needing to sleep for roughly 30% of the time, being lazy or emotionally unstable, and having motivations of its own.

Basically he seems to be saying that to develop AI capable of developing more intelligent AI (a useful thing) you need a machine that is as intelligent as a human, but, because humans have aspects that aren't useful (like needing to sleep), we'll never bother developing that sort of AI.

That just seems silly to me. Either sleeping is a necessary component of intelligent entities (including AI) in which case we'd include it, or it's not, in which case we wouldn't bother. We're not just going to say, "Oh, to build intelligent AI I need to include the need for sleep, but sleep sucks, so let's just forget about it."

He also seems to be suggesting that the only way to develop AI that's capable of improving itself is to develop AI that works the same way as humans. That's just silly too: we build airplanes that fly better than birds without building artificial birds.
 
But in a similar way the you who existed ten years ago is also dead. I don't know that I see anything particularly appealing about "mind uploading", but I can see ways in which it could be made to at least seem to work, particularly if done one step at a time.

And "the singularity" isn't simply about mind uploading.

There is a continuity of body between the me of 10 years ago and the me of today, and at EACH step the neural network was kept alive, even as bits and pieces of it were replaced and repaired.

There is no such continuity when you upload a copy, even if you immediately kill the original in the process.

Thinking that you really "upload" the person means you think there is some magical soul-like entity which is removed from the body/brain and put in the box. But there is no evidence for this; all we have evidence for is that our "self", our consciousness, is an emergent property of OUR brain function. Duplicating those functions into a black box does not make us magically "move" there; it just makes a copy which can thereafter evolve/think separately from the original. Worse, duplicate the signal (*) and you can probably create more than one copy of the uploaded person. That makes what I mean even clearer.

(*) yeah I know I am speculating on fiction.

Anyway, if you want to continue the discussion there is a thread on the first page on why "materialists are wrong" from DOC.


ETA: the way I would see this working is that you replace, over a long period of time, every neuron bit by bit with a silicon simulacrum. By the end you would have a consciousness running on simulacra, but that raises the question of whether it would be the same person, just as brain-damaged people are "changed" (they are the same entity but with a changed or warped personality).

A difficult one; I don't think it would be easy to answer without performing the experiment first.
 
Anyway, if you want to continue the discussion there is a thread on the first page on why "materialists are wrong" from DOC.

Not really interested in that discussion, but I simply wanted to make it clear that there is a discussion to be had about whether or not such "mind uploading" is a viable way to extend a personal experience of life.
 
I never understood the cult of the Singularity. I mean, even if mind uploading happens, you do not really "upload" your mind; you just put a copy of its memories, interactions, and personality into a computer system, so that there would be virtually no way to differentiate you from it. Just like the teleportation paradox, it suffers from the fact that you are only creating a copy, and if the original dies, "you" died. The copy, as a separate entity, might live on eternally as long as the electricity is paid, but the original Aepervius the human would have died.

Therefore I would not see that as immortality, more as a way to produce an immortal identical-twin offspring of me.

Mildly interesting, but not that much.

I suggest that view is a product of our upbringing and/or our social and cultural mores (I share it, by the way). If we were brought up with the idea that we change substrate at some point in our life, then our views would follow the syntax of this sentence!
 
To me the idea of creating an AI that is the same as a human seems a tad redundant, since we already have 6 billion (and counting) of those! Also, the environment that an AI is "born" into is not the same as the one nearly every human experiences, so again I can't see how any AI we develop would be human in the same way we are.
 
...snip...

ETA: the way I would see this working is that you replace, over a long period of time, every neuron bit by bit with a silicon simulacrum. By the end you would have a consciousness running on simulacra, but that raises the question of whether it would be the same person, just as brain-damaged people are "changed" (they are the same entity but with a changed or warped personality).

A difficult one; I don't think it would be easy to answer without performing the experiment first.

Quite a few authors have explored those ideas; they make interesting reading. ETA: There is an entertaining but thought-provoking sort-of-series of short stories by Greg Egan in which a "jewel" is implanted into someone's head and mimics the development of the human brain (shares the input, matches the changes the human brain undergoes), and then at some arbitrary point folks decide to have the human brain removed and they continue (or do they...) running on the immortal jewel.
 
Quite a few authors have explored those ideas; they make interesting reading. ETA: There is an entertaining but thought-provoking sort-of-series of short stories by Greg Egan in which a "jewel" is implanted into someone's head and mimics the development of the human brain (shares the input, matches the changes the human brain undergoes), and then at some arbitrary point folks decide to have the human brain removed and they continue (or do they...) running on the immortal jewel.

That reminds me of Battle Angel Alita, where the people of the city in the sky have their brains removed and replaced with a chip at some arbitrary point in their teenage years, but without their knowledge (it is called a rite of passage into adulthood). I do not exactly remember the justification for why they do this, but it is clear that once they discover they are in reality only a chip, they simply go insane or despair.
 
There is a continuity of body between the me of 10 years ago and the me of today, and at EACH step the neural network was kept alive, even as bits and pieces of it were replaced and repaired.

There is no such continuity when you upload a copy, even if you immediately kill the original in the process.

Thinking that you really "upload" the person means you think there is some magical soul-like entity which is removed from the body/brain and put in the box. But there is no evidence for this; all we have evidence for is that our "self", our consciousness, is an emergent property of OUR brain function. Duplicating those functions into a black box does not make us magically "move" there; it just makes a copy which can thereafter evolve/think separately from the original. Worse, duplicate the signal (*) and you can probably create more than one copy of the uploaded person. That makes what I mean even clearer.

This continuity you speak of, what is it exactly?

Imagine that I make a copy of you and pause time at the exact moment of (instant) duplication. Now I have two identical units with no hair. What sets the original apart?

From my position, you are the one who believes in a soul-like entity.
 
This continuity you speak of, what is it exactly?

Imagine that I make a copy of you and pause time at the exact moment of (instant) duplication. Now I have two identical units with no hair. What sets the original apart?

From my position, you are the one who believes in a soul-like entity.

Somebody believing in a soul-like entity would assume duplication is not possible, or that both bodies share the same soul. Where the heck do you see that in what is said?

Nope. Once the brain and the hardware are destroyed, it is over for us. Even if a duplicate was created beforehand, it is NOT the same person. It is simply, by accident, different hardware which is indistinguishable in memory and action.

If I took a die and reproduced it layer by layer to make a die so similar as to be indistinguishable, would you say it is the SAME die? No, you would not. Similarly, a duplicated brain would be SIMILAR, and indistinguishable, but it would not be the same person.

There is no soul, and the above does not add a soul entity. On the contrary, the people stating that both duplicated persons are the same are the ones supposing there is an entity travelling from one to the other. For those supposing the brain is the site of the self, it is VERY obvious that both persons are similar but not the same, and indeed they start diverging immediately after you remove your "pause". Nanoseconds afterward their isotopic composition and other physical processes may already diverge; minutes afterward the cell repair and apoptosis states will almost certainly diverge, and the neuron mapping, function, and state will diverge.

Stating that after duplication both are similar but not the same is not, and has never been, stating that a soul exists; that is silly in the extreme.
 
This continuity you speak of, what is it exactly?

Imagine that I make a copy of you and pause time at the exact moment of (instant) duplication. Now I have two identical units with no hair. What sets the original apart?

From my position, you are the one who believes in a soul-like entity.

You answer your own question within your question! :) Two people can't share the same position.
 
