
Ethical implications of future technology

Zelenius

What are the ethical implications of future technology? We already have some of these "future" technologies, but they are still in their infancy. I'm talking about three areas in particular: nanotechnology, biotechnology, and quantum computing, and the various ways these can be used together in the future to:

1) Create weapons of mass destruction so powerful that they may vastly overshadow even the most destructive nuclear weapons currently in existence.

2) Create robot armies that reduce the need for human soldiers (or perhaps use biotechnology to breed people to be ideal, obedient soldiers or citizens).

3) Supercomputers using quantum computing, artificial intelligence, and self-learning software to make nearly flawless predictions and trades in the financial markets. This could easily lead to markets being manipulated by supercomputers to serve the interests of those who control them. This is already happening, but it is still in its early stages.

4) Biotechnology so powerful that there will be cures for almost all diseases, and the human lifespan is doubled or tripled. Eternal youth could eventually be achieved through genetic engineering for those who can afford it. Super-intelligence could similarly be achieved through genetic and neuro-engineering, in a manner similar to eugenics.

5) Governments using supercomputers, various forms of advanced surveillance technology, and micro-robots to spy on their citizens. It's possible that robots the size of mosquitoes or smaller could be used to spy on people the government claims are a "threat". Beyond this, and perhaps in combination with #2, government leaders could use advanced mind control to make sure no one opposes them.

6) Similar to #2, corporations using biotechnology and/or nanotechnology to create (or alter) ideal, robotic, always obedient employees who never take vacations and always do what they are told.

7) Science and mathematics becoming revolutionized. Will all this quantum computing and artificial intelligence make doing science so much faster and "easier" that new, paradigm-shifting scientific discoveries will come about that greatly alter our way of looking at the world and how we deal with it? Will these discoveries hurt or help religion and culture, as well as secular humanism? Will these discoveries ultimately help humankind? Ultimately, this becomes a positive feedback loop with technology - the more science advances, the more technology advances, and the more technology advances, the more science advances, etc.

8) (Remotely possible) Time travel, long distance space travel, teleportation, human brain uploading into computers.
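The positive feedback loop described in point 7 can be sketched as a toy model: let each of "science" and "technology" grow at a rate proportional to the other's current level. The coupling constants below are purely illustrative assumptions, not measurements; the point is only that mutual reinforcement compounds, so later gains dwarf earlier ones.

```python
# Toy model of the science <-> technology feedback loop from point 7.
# Each level's growth is proportional to the *other* level, so the two
# quantities reinforce each other and growth compounds exponentially.
# A and B are made-up coupling constants chosen only for illustration.

A = 0.05  # how strongly the technology level accelerates science
B = 0.05  # how strongly the science level accelerates technology

def simulate(steps, science=1.0, tech=1.0):
    """Euler-step the coupled growth model; return final (science, tech)."""
    for _ in range(steps):
        # Right-hand side uses the *old* values of both quantities.
        science, tech = science + A * tech, tech + B * science
    return science, tech

s10, t10 = simulate(10)
s50, t50 = simulate(50)

# The gain between steps 10 and 50 is far larger than the gain in the
# first 10 steps - the compounding that makes the loop "positive".
print(s10, t10)
print(s50, t50)
```

With symmetric starting values and symmetric coupling, both quantities grow by the same factor each step, which is the simplest way the "more science means more technology means more science" dynamic shows up numerically.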

Are these really possible, and what are the ethical implications of them? Especially with regard to #5, would privacy laws have to be changed to prevent this kind of intrusion? Could this undermine democracy?

I don't think it is necessary for me to explain "how" scientists and engineers would achieve these various feats, like making people smarter for instance, or the insect robots.
 
I don't think it is necessary for me to explain "how" scientists and engineers would achieve these various feats, like making people smarter for instance, or the insect robots.


No... of course you don't.

Are you INRM's sock, or just on the payroll?
 
Jim MDP

So because I question the path technological progress takes sometimes, anybody who questions the path that technological progress takes must be me, or paid by me?
 
This seems like half a dozen discussions all at once. Each of the examples you gave raises a different set of ethical implications. I don't see how we can have a coherent discussion of so many topics simultaneously, unless we can find a common denominator.

I can think of several ways to do it, but I don't want to hijack the topic. Zelenius, why don't you go first? What do you think are the ethical implications of future technology?
 
The base implication is that, in the next hundred years, the human race is going to develop technology that could ultimately destroy the human race in *SO* many ways. Therefore, the base question is whether the human race will develop the ethics to know when these technologies can be used safely.

As an example, Ray Kurzweil seems to think that if he lives another 25 to 50 years, he'll be able to live forever. This is seemingly the utopian ideal -- that we could eradicate disease and all live seemingly forever. The downside, of course, is that, if we all live forever, then the world will become a very crowded place. It is human nature that, when we become too crowded, friction develops which leads to wars. Therefore, this technology could lead to the Mother of All Wars in a quite literal sense.

So the base question here is, do we know when to use new technology and do we know when *NOT* to use it??
 
The base implication is that, in the next hundred years, the human race is going to develop technology that could ultimately destroy the human race in *SO* many ways. Therefore, the base question is whether the human race will develop the ethics to know when these technologies can be used safely.

As an example, Ray Kurzweil seems to think that if he lives another 25 to 50 years, he'll be able to live forever. This is seemingly the utopian ideal -- that we could eradicate disease and all live seemingly forever. The downside, of course, is that, if we all live forever, then the world will become a very crowded place. It is human nature that, when we become too crowded, friction develops which leads to wars. Therefore, this technology could lead to the Mother of All Wars in a quite literal sense.

So the base question here is, do we know when to use new technology and do we know when *NOT* to use it??

You raise some interesting scenarios. It is pretty obvious that if humans achieve near immortality through technology, the resulting extreme overpopulation won't make life worth living anyway, at least in the beginning.

Obviously there will be big wars and famine to reduce this excessive growth, along with possibly mandatory birth control for just about everyone, since having children would be superfluous in a world where people could live almost forever. We may also experience a revival of eugenics. Obviously, ethics will change; they have always been subject to change based on circumstance.

It seems to me that the "value" of children is in how they allow us to "live on" after we die, or in how they take care of us when we're old. Personally, I never want any children - I don't see the point - but these seem to be the "reasons" many people have children. The value of children will become severely diminished, in my opinion, in a world where humans can live almost forever and there is massive overpopulation and resource depletion. This may only apply to the developed world, where family sizes have shrunk considerably over the past several decades, since economically, children are more expensive than ever and western, educated women usually have better things to do than just get pregnant all the time.

We may also look at suicide differently, and may even encourage it for certain types of people, if overpopulation becomes increasingly severe. Ultimately, overpopulation just cheapens life.

As for your question that is in a way a simplification of my OP, I don't think we will have the wisdom to use this future technology properly or ethically, and could very well bring about the extinction of humanity through this technology. We are just apes after all.
 
We already have overpopulation and wars and famine. I don't think that "near immortality" will significantly worsen it. Just because a technology is available doesn't mean everyone will have access to it. Life-extending technologies will be the privilege of the rich, while most others will continue to grow old and die as usual. It certainly would give the term "right to life" a whole new meaning, however.

It's true that many people will opt not to have children in the future. We already see that in declining birth rates for modern democracies and the rich in general, but that has more to do with the availability of contraception than any concerns about overpopulation. However, scarcity of something usually increases its value. In a society where children are a rare phenomenon, I expect that people would become obsessed with them. I'm not sure what form that would take. Probably children would grow up in a "Truman Show"-like atmosphere, with the entire community as an audience.

I can definitely see limits on the right to have children (already beginning in China), and frankly it's about time. IMHO just because you have the biological ability to make a baby doesn't mean you are qualified to care for it, or that an overcrowded world should have to accommodate it.
 
Jim MDP

So because I question the path technological progress takes sometimes, anybody who questions the path that technological progress takes must be me, or paid by me?

Not necessarily. They could be clones produced from your stolen DNA. ;)
 
As for your question that is in a way a simplification of my OP, I don't think we will have the wisdom to use this future technology properly or ethically, and could very well bring about the extinction of humanity through this technology. We are just apes after all.

We may squirt populations of ourselves off planet before we "end game" here on Earth, if technological advancement proceeds fast enough. We have a celestial body with water right next to us: the Moon, where we can squirt a number of us. Technological development is at different levels in different parts of this planet as well. Some places are near pre-industrial and others are post-industrial economies, all of us living together unpeacefully.:rolleyes:
 
Technology is a human invention with all our strengths and weaknesses built in.

The question is how do we work on our weaknesses before we build powerful technology with them built in?

Singularity advocates forget that it's all very well having the technology to live forever; it's another matter to actually survive.
 
The "value" of children is in the evolution of the species and the new ways of solving problems we get through the recombining of genes. Stop having children and the species stagnates and eventually dies when it can't cope with the ever-changing forces of nature. So, the idea of immortality may be highly alluring to us individually, but is it really of value to the species?
 
I'm pretty sure it's hubris to talk about the ethical implications of future technology when we still haven't figured out the ethical implications of the Internet.

Those that fail to learn from the past are doomed to repeat it...?
 
We may squirt populations of ourselves off planet before we "end game" here on Earth, if technological advancement proceeds fast enough. We have a celestial body with water right next to us: the Moon, where we can squirt a number of us. Technological development is at different levels in different parts of this planet as well. Some places are near pre-industrial and others are post-industrial economies, all of us living together unpeacefully.:rolleyes:

Question is can we "squirt" enough off planet fast enough to prevent the population bomb from detonating?
 
The "value" of children is in the evolution of the species and the new ways of solving problems we get through the recombining of genes. Stop having children and the species stagnates and eventually dies when it can't cope with the ever-changing forces of nature. So, the idea of immortality may be highly alluring to us individually, but is it really of value to the species?

I think we've already moved beyond biological evolution -- not that we won't continue to evolve biologically, but that other adaptation techniques have completely overtaken biology. It's now memes, not genes. Humans are now living in environments (arctic, desert, urban) where any animal with our biology but not our intelligence and advanced social structures wouldn't last a month. And if we ever expand beyond this planet, you can be sure it will be via technological adaptation, not biology.

But either way, we both agree that children will always have value.
 
I think we've already moved beyond biological evolution -- not that we won't continue to evolve biologically, but that other adaptation techniques have completely overtaken biology. It's now memes, not genes. Humans are now living in environments (arctic, desert, urban) where any animal with our biology but not our intelligence and advanced social structures wouldn't last a month. And if we ever expand beyond this planet, you can be sure it will be via technological adaptation, not biology.

But either way, we both agree that children will always have value.

Sure, the first time we picked up a piece of red ochre and scratched on a rock, it was all "technology" from there on.

Before the "abstract representations," we were apes subject to biology.

Now we are apes subjecting biology.
 
I think we've already moved beyond biological evolution -- not that we won't continue to evolve biologically, but that other adaptation techniques have completely overtaken biology. It's now memes, not genes. Humans are now living in environments (arctic, desert, urban) where any animal with our biology but not our intelligence and advanced social structures wouldn't last a month. And if we ever expand beyond this planet, you can be sure it will be via technological adaptation, not biology.

But either way, we both agree that children will always have value.

Of course, when you start talking about "memes, not genes", the meaning of children changes completely. Currently, due to the recombination of genes, children are something new. In the future, due to complete control via computers, children are more likely to be clones of us. This could lead to stagnation in the species as we use computers to take evolution the way we want rather than by what is dictated from the environment.
 
Of course, when you start talking about "memes, not genes", the meaning of children changes completely. Currently, due to the recombination of genes, children are something new. In the future, due to complete control via computers, children are more likely to be clones of us. This could lead to stagnation in the species as we use computers to take evolution the way we want rather than by what is dictated from the environment.

I'm not following you. If we "take evolution the way we want", then how is that stagnation? And if we have "complete control" of the genetic characteristics of our offspring, why would that imply identical clones?

Personally, I've never understood the appeal of human cloning. Making babies the old-fashioned way is much more fun. ;)
 
