
Turing Test Questioned

RVM45

Scholar
Joined
Jun 24, 2008
Messages
96
The thread about "Would you marry a robot?" got me wondering about this again.

According to Turing, if the machine that you're conversing with via keyboard is absolutely indistinguishable from a human in the same situation, then it is intelligent enough to qualify as sentient.

This is the Gedanken machine that I call a "Semantisizer." Fundamentally it is all talk. It can:


A.} Recognize any familiar word.

B.} Define any familiar word in terms of other familiar words--although if pursued persistently enough, the definitions become circular--as indeed will happen any time you play the Definition Game long enough.

Well, so far it's on par with a dictionary.

C.} It will ask for definitions of new words. Failing to get a reasonable definition, it is capable of hypothesizing a reasonable one, refining it over time, and eventually accepting the new word into its working vocabulary.

D.} We do a far better job of codifying the rules of grammar, syntax, semantics, speech, and conversation than has ever been done heretofore.

E.} We add in logic and game-playing theory.

F.} We manage to make the thing self-programming for conversational strategies, with strategies that cause people to converse with it longer being the preferred strategies.

Presumably, as it perfects its conversational strategies, it will become more pleasant to interact with--leading to longer conversations. That might require tweaking eventually.

We give it plenty of time to talk to very many people over a very long period, to allow it to fully evolve its emergent conversational strategies.
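Just to make point F concrete, here is a crude sketch of the sort of loop I have in mind--the class name, the strategy names, and the "reward whatever keeps people typing longer" rule are purely my own illustration, nothing canonical:

import random
from collections import defaultdict

class Semantisizer:
    """Toy sketch of point F: the machine's only feedback signal is
    how long people keep conversing with it."""

    def __init__(self, strategies):
        self.strategies = strategies        # e.g. "ask_question", "define_term"
        self.history = defaultdict(list)    # strategy -> list of conversation lengths

    def pick_strategy(self):
        # Try anything untried first; otherwise favor the strategy
        # that has produced the longest conversations on average.
        untried = [s for s in self.strategies if not self.history[s]]
        if untried:
            return random.choice(untried)
        return max(self.strategies,
                   key=lambda s: sum(self.history[s]) / len(self.history[s]))

    def record_conversation(self, strategy, turns):
        # Longer conversations make a strategy more likely to be reused.
        self.history[strategy].append(turns)

# After each conversation, feed back how many turns it lasted.
bot = Semantisizer(["ask_question", "define_term", "tell_anecdote"])
for turns in (4, 12, 7):
    strategy = bot.pick_strategy()
    bot.record_conversation(strategy, turns)
print(bot.pick_strategy())    # prefers whichever strategy kept the talk going longest

The point being: all it ever optimizes is talk about talk.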

It can definitely pass the Turing Test--but would it truly be sentient?

I kinda doubt it. The only thing it understands is words.

If you could somehow contrive to pour direct experience into it, it couldn't distinguish between the taste of a strawberry and the sensation of being burned alive.

It could define "Orgasm" for you; learnedly discuss the biology, ethics, metaphysics, and literary treatment of the Orgasm--but if you poured an Orgasm into it, it would have no idea whether it had just experienced Orgasm, Ice Cream, a Drill Sergeant, or Ennui...

Stipulated: if you kept pouring sensory material into it, it might very well learn. Never mind. It is not terribly sentient NOW.

So can a sufficiently sophisticated word processor--one that has no other sensory or data-processing abilities--be classified as SENTIENT?

.....RVM45 :cool:
 
"If you could somehow contrive to pour direct experience into it--it couldn't distinguish between the taste of a stawberry or the sensation of being burned alive."

You are asking if it has qualia, which most philosophers hold are required for consciousness. Whether a sufficiently powerful computer would have consciousness is a hot topic. It isn't known how consciousness is achieved in a brain.

Leon
 
RVM45, you sure like the word Gedanken!

Your ultimate question in the post is quite interesting, and one that is not answerable at this point. I'm not sure we'll ever know the answer. If a machine is programmed to have a particular response to certain stimuli, it can imitate what humans do. If a machine eventually tells its operator that it is aware of itself, how do we know that it is actually sentient, rather than just giving the response it was programmed to give?

I must take issue with your OP heading. Your question has nothing to do with the Turing Test. The Turing Test is about a machine meeting some arbitrary level of conversational intelligence. The arbitrary part is "can it fool an actual person sitting at a terminal into believing they are talking to another person." This scenario does not involve self-awareness. So saying that the Turing Test is "questioned" is misleading.

Perhaps what you want is a Voight-Kampff test. ;)

~ggep~
 
Leon Heller--You did a masterful job of restating my question more concisely.

I've heard (read) the term "Qualia," but hadn't encountered it lately--it was way down in the back of my inactive vocabulary.

However, I think I was asking: Can Qualia be achieved solely through the ability to manipulate words, with no other senses or abilities necessary?

Which is unanswerable at this state of the Art, but fascinating nonetheless.

Indeed, how do we know that the consciousness of some people isn't faked--or rather simulated by tropisms? Presumably you know you are self-aware--being able to question it in your own mind confirms it to you.

It seems unlikely that you are the only one with this property. But if the combination of heredity and environment only succeeds in creating self-awareness in some of us--while others merely perfect their best simulation...

How would you ever know?

.....RVM45 :cool:
 
So can a sufficiently sophisticated word processor--one that has no other sensory or data-processing abilities--be classified as SENTIENT?

Presumably as much as anybody can. Even if you met me face-to-face, you wouldn't know if I was sentient. You only know that I appear to be.

Additionally, I agree with goodguyseatpie that the OP title is a bit misleading. The Turing test is not intended to confirm sentience - much less a whole simulated person. It's only intended to recognize successful artificial human intelligence.

As a counterexample, I think we can accept that aliens can be sentient and intelligent even if they don't understand many human experiences, or even entire sensory modalities (maybe they're deaf or blind or whatever).
 
RVM45, you are conflating several different questions – all of them fascinating, seriously difficult, and of fundamental importance to our understanding of intelligence (artificial and otherwise):
  • Does intelligence imply thought?
  • Does thought imply awareness?
  • Will a sufficiently complex information-processing machine exhibit human-like 'thinking'?
  • Does such a sufficiently complex machine need to be biological?
  • Is there a meaningful difference between 'apparent' intelligence (or thought) and the real thing?
  • Would an artificial (non-biological) brain 'feel' like a real (biological) one?
Turing didn't mention 'sentience' when he proposed his test – in fact, that's very specifically what his claims were not about. He was asserting that:
  • We don't know what we mean by 'thought' and 'intelligence', or even how to investigate the question. (Could hardly have been more right.)
  • Only an operational definition can be meaningful. (Much too pessimistic, imo.)
Read his 1950 paper Computing Machinery and Intelligence.

I propose to consider the question, "Can machines think?" This should begin with definitions of the meaning of the terms "machine" and "think."
...
 
I kinda doubt it. The only thing it understands is words.
You may have started with a machine that "only understands words" but you added quite a bit of stuff after that and wound up with a machine capable of conversing. It would have to have an understanding that goes beyond mere "words" to do that.
If you could somehow contrive to pour direct experience into it, it couldn't distinguish between the taste of a strawberry and the sensation of being burned alive.
It would have a direct experience of conversing. Why wouldn't that be enough? To do that at the level you described it would presumably need to see and/or hear.

If you lose your ability to taste a strawberry will you cease to be sentient?
 
Indeed, how do we know that the consciousness of some people isn't faked--or rather simulated by tropisms? Presumably you know you are self-aware--being able to question it in your own mind confirms it to you.

It seems unlikely that you are the only one with this property. But if the combination of heredity and environment only succeeds in creating self-awareness in some of us--while others merely perfect their best simulation...

How would you ever know?

.....RVM45 :cool:

My view is that I only really know about myself. I believe that I am conscious, but I don't have any way of knowing that anyone else is. They behave as if they are, and tell me they are, but that doesn't really prove anything.

Leon
 
