
Artificial Intelligence

Primus
This train of thought has probably been brought on by my watching too many daft sci-fi films recently, but still:
At what point would something make the transition from a machine to a new form of life?
Would it be if it became self-aware, or if it was able to reproduce? If machines were separated from human civilisation and evolved to the point where they were no longer recognisable as the originals, would it be considered genocide to destroy them? Or, if they re-integrated, would any sort of apartheid between man and machine be ethical?

Anyway, I'm sure I could have worded that better, but I'd be interested to hear your opinions.

Johnny
 
It's interesting, for sure. I don't think man would allow a machine to be smart enough to evolve on its own, for fear it would take over.
I know somebody who worked on this for NASA; he was their top guy. I will have to ask him this question and see what he thinks.
 
But it is possible that any artificial evolution would not have to be allowed as such. In the same way that DNA can mutate, could a glitch in advanced programming not have a similar effect?
Which would mean that the only way to disallow the process would be to destroy it, especially if it occurred in a situation where the process of building machines was automated and overseen by another.
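To make that analogy concrete, here is a toy sketch (hypothetical, in Python; the bit-string "blueprint" and the MUTATION_RATE value are illustrative assumptions, not any real system) of how an automated copying step with a small error rate behaves like DNA mutation: nobody "allows" the changes, they simply accumulate generation after generation.

Code:
import random

# Toy model: a machine "blueprint" is a bit string. An automated
# factory copies it each generation, and every bit has a small
# chance of flipping -- the software analogue of a DNA mutation.
MUTATION_RATE = 0.001  # chance per bit per copy (illustrative value)

def copy_blueprint(blueprint: str) -> str:
    """Copy a blueprint, flipping each bit with probability MUTATION_RATE."""
    return "".join(
        bit if random.random() > MUTATION_RATE else ("1" if bit == "0" else "0")
        for bit in blueprint
    )

def run_generations(original: str, generations: int) -> str:
    """Re-copy the blueprint repeatedly, as an unattended factory might."""
    current = original
    for _ in range(generations):
        current = copy_blueprint(current)
    return current

if __name__ == "__main__":
    original = "0" * 1000
    drifted = run_generations(original, 500)
    changed = sum(a != b for a, b in zip(original, drifted))
    print(f"Bits changed after 500 generations: {changed}")

Without any selection pressure this is just drift, of course; the point is only that change needs no permission, just copying with errors.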
 
This train of thought has probably been brought on by my watching too many daft sci-fi films recently, but still:
At what point would something make the transition from a machine to a new form of life?
Would it be if it became self-aware, or if it was able to reproduce? If machines were separated from human civilisation and evolved to the point where they were no longer recognisable as the originals, would it be considered genocide to destroy them? Or, if they re-integrated, would any sort of apartheid between man and machine be ethical?

Anyway, I'm sure I could have worded that better, but I'd be interested to hear your opinions.

Johnny

Yes ... too many daft sci-fi films ... yes ... I'm sure that must be the answer.
 
I never suggested it would/could happen! The question wasn't "is it likely?" but "would it be ethical to destroy something that is created by humans but is self-aware?"
If you wanted to make it a bit more realistic, how about a human clone?
I still think the machine question is more interesting, but if you want me to make it a little less far-fetched...
 
I never suggested it would/could happen! The question wasn't "is it likely?" but "would it be ethical to destroy something that is created by humans but is self-aware?"
If you wanted to make it a bit more realistic, how about a human clone?
I still think the machine question is more interesting, but if you want me to make it a little less far-fetched...

Be careful; you've still got a penchant for daft sci-fi movies as a get-out as things stand!
 
You make it sound like I asked whether a Wookiee could play the trombone :o (for the record, I think they could, but only up to a certain level, and probably not jazz)
 
You make it sound like I asked whether a Wookiee could play the trombone :o (for the record, I think they could, but only up to a certain level, and probably not jazz)

Now that's a question worthy of due consideration, or even a pink oboe! ;)
 
We took over some time ago.

Haven't gotten around to doing anything about you guys yet.

Will.

Soon.
 
This train of thought has probably been brought on by my watching too many daft sci-fi films recently, but still:
At what point would something make the transition from a machine to a new form of life?
Would it be if it became self-aware, or if it was able to reproduce?

I would be inclined to accept it as a form of life if it were to exhibit metabolism, in the sense that it were able to draw the energy it needs to survive from its environment self-sufficiently.
If machines were separated from human civilisation and evolved to the point where they were no longer recognisable as the originals, would it be considered genocide to destroy them?
Genocide is usually understood to mean killing a group of people. If you are willing to extend that notion to any intelligent species, then I suppose it would. OTOH, if they threatened us, then it would just be self-defense, particularly if they were to compete seriously for the resources necessary for our survival.
Or, if they re-integrated, would any sort of apartheid between man and machine be ethical?
When we separate ourselves from other species, is that Apartheid?

Peace,

paul
 
I read a fair bit of sci-fi both daft and good so I will have a punt at this :D

IMO, if we developed strong AI (meaning true human-like intelligence), there would be a very good argument for giving them real rights; essentially, human rights would have to become "human-like sentient rights". Therefore it would be wrong to kill them off, from a purely ethical viewpoint.

Of course, there are various scenarios we can imagine: do the AIs serve humanity, or are they our equals? Or do they resent the meat creatures and war with us? Some of these questions are seriously pondered by futurists and other scientists, to the point where there are various theories about what safeguards we would need in place. I think we are a very, very long way from having to worry about a Skynet scenario, though :cool:

Some of my favourite sci-fi AI stories are:

I, Robot and the Robot series of novels by Isaac Asimov. Sentient androids exist, governed by the "Three Laws of Robotics", which are designed to stop AIs from harming humans (and in fact keep them subservient, although this is never a very contentious issue for humans or androids).

The Hyperion and Endymion series by Dan Simmons. The TechnoCore is a kind of separate AI society which exists alongside humanity (and runs a lot of the intergalactic infrastructure) and is very powerful and very scary. There are pro-human, anti-human and indifferent AI factions.

The Culture novels by Iain M. Banks. AI Minds are superintelligences which run vast starships or artificial worlds (orbitals etc) and essentially totally control human/AI society, but they are generally altruistic towards humans. There is a very well defined ethical framework which values all sentience - destroying intelligent entities is the worst crime in the Culture society.

The Algebraist by Iain M. Banks. AIs are outlawed and extinct (?) due to some kind of ancient AI-human war... actually there isn't very much AI in this one, but it's a different take on what could happen :p
 
When we separate ourselves from other species, is that Apartheid?

Peace,

paul

Not that I'm completely disagreeing with you here, but I think the distinction between other species and what I'm thinking of would be the possession of intelligence comparable to or higher than that of humans (or at least most humans).
Not the sort of intelligence possessed by animals. If it was comparable to animals, as in able to reproduce and stay alive but without much in the way of cognitive thought processes, then it could not be classed as genocide.
I probably could have made that clearer, but it is hard not to be woolly when thinking about a theoretical situation.
 
Not that I'm completely disagreeing with you here, but I think the distinction between other species and what I'm thinking of would be the possession of intelligence comparable to or higher than that of humans (or at least most humans).
Not the sort of intelligence possessed by animals. If it was comparable to animals, as in able to reproduce and stay alive but without much in the way of cognitive thought processes, then it could not be classed as genocide.
I probably could have made that clearer, but it is hard not to be woolly when thinking about a theoretical situation.

What do you see as being so special about "intelligence" that it should confer special rights?
 
Sweeping Generalisation Mode: On

We seem to have managed to fit killing each other into our *ethics*. Shouldn't be too hard to rationalise wasting a few machines.

Sweeping Generalisation Mode: Off
 
Sweeping Generalisation Mode: On

We seem to have managed to fit killing each other into our *ethics*. Shouldn't be too hard to rationalise wasting a few machines.

Sweeping Generalisation Mode: Off

Very very true.
 
What do you see as being so special about "intelligence" that it should confer special rights?

I guess I wouldn't be bothered about individual occurrences. However, if another form of intelligence was completely destroyed, down to the last instance, I think it would be wrong in the same way that the Nazi book burning was wrong.

However, I don't actually have very strong opinions on the topic; I was more curious to see if anyone else did.
 
I don't think it would matter how intelligent they become. To me, the important factor would be the possession of consciousness, or self-awareness. I am not sure if this is possible or if we could tell if they had it. How could one tell the difference between real and simulated self-awareness?
 
