
On Consciousness

Is consciousness physical or metaphysical?


  • Total voters: 94 (poll closed)
Status: Not open for further replies.

Mr. Scott
The thread in Religion and Philosophy, "Explain consciousness to the layman," has degraded into a bore, so I'm starting this thread, hopefully to launch effective dialogue on the nature and computability of consciousness. Please try and avoid derailing it!

From David Gamez's THE DEVELOPMENT AND ANALYSIS OF CONSCIOUS MACHINES:

[quoted passage posted as an image; not reproduced here]
 
I think there may be another option:

"Consciousness is a kind of data processing and the brain is a machine that can be in principle replicated in other substrates, but general purpose computers are just not made of the right stuff."

The idea that we could make a computer conscious may be as fanciful as thinking we can make trees conscious or that we can make lobsters achieve human-level consciousness.

I think that sometimes people conflate the idea that there is nothing non-physical about consciousness with the idea that consciousness can be easily replicated out of any old junk. But that might not be true.
 
I think there may be another option:

"Consciousness is a kind of data processing and the brain is a machine that can be in principle replicated in other substrates, but general purpose computers are just not made of the right stuff."

That's another option. I don't agree with it, though.

As someone with a four-decade interest in AI who continues to work on it from time to time, my position is as follows: Consciousness could be duplicated in a general purpose computer, maybe with some unusual and specialized hardware added, but nobody would ever actually do it.

See, my pretty well-educated guess is that artificial intelligence is impossible, perhaps even theoretically so, without artificial stupidity. Except as a business decision (cf. Microsoft), few people would want to make computers less reliable than they are. Consciousness, I guess, comes with a huge risk of getting stuff wrong that people would really rather not have happen.

I think that others have noticed this. The character of HAL 9000, for example, at least in part seems to be an exploration of the idea that with consciousness comes the possibility of psychosis. The Marathon series of video games has AIs going "rampant" as a plot element. The idea of the nutso computer that goes haywire and kills everybody is a staple of media and has persisted even as people have gained more experience with computers. The fear of the conscious other in humans may be so strong that nobody will ever be able to force themselves to make it happen.
 
That's another option. I don't agree with it, though.

I think the term for being able to generate consciousness in materials other than our own meat computers (brains) is "functionally isomorphic" (although this isn't restricted to AI).

I don't necessarily disagree with it myself, but when I studied some AI for my philosophy degree I remember my tutor having to explain that I was mixing up a lot of different concepts. For one thing, AI =/= artificial consciousness, which is something that, for some reason, I had simply assumed.

But I also think it shouldn't be assumed that, because computers are repositories of large amounts of knowledge (I think the technical word for knowledge and intentional objects is "content"), creating consciousness in them is merely an engineering problem. Human beings are probably repositories of large amounts of knowledge that we are not conscious of on a day-to-day basis.
 
I think that sometimes people conflate the idea that there is nothing non-physical about consciousness with the idea that consciousness can be easily replicated out of any old junk. But that might not be true.

I've never heard anyone suggest that c. could be replicated easily, or out of old junk.

But, why not a general purpose computer with some very sophisticated programming?
 
The character of HAL 9000, for example, at least in part seems to be an exploration of the idea that with consciousness comes the possibility of psychosis.

Doesn't this come down to what emotions are programmed in?

I think our emotions are hard-wired. It looks like the emotional system of rats is still inside us, not much changed, and drives our missions in life.

If we were to set up conscious computer systems to control, say, spacecraft like HAL 9000, we would have to hard-wire emotions that cared as much about the people as about the mission, and that would feel guilty if the system were to take over. HAL's programmers were stupid to have made the well-being of the humans on board subordinate to the science mission.

Of course, there's an assumption here that needs to be examined. Are emotions a required element for consciousness?
 
I've never heard anyone suggest that c. could be replicated easily, or out of old junk.

But, why not a general purpose computer with some very sophisticated programming?

Well, I've answered that already, I think.

It is because it might not be possible to make a general purpose computer out of the right stuff.
 
If consciousness is an emergent system, there is no reason to assume it couldn't be constructed from any materials, as long as they can sustain such a system in some way. It could be software in a conventional computer system. Or, it could be some new, exotic form of petroleum molecules connected or partly connected, in just the right way.

It might be possible that some causal or semi-causal separation takes place in certain substances, so that the illusion of a mind disconnected from a body can be maintained. But there is no need to call upon mysterious substances unknown to science. Conventional substances might do the trick, as long as they are connected, or semi-connected, or intermittently connected, or whatever, in just the right way.
 
If it just requires the right programming to make HAL have goodwill towards other conscious beings, then the whole history of mankind is rubbish.
 
If it just requires the right programming to make HAL have goodwill towards other conscious beings, then the whole history of mankind is rubbish.
What do you mean? If you meant that literally: I think the history of mankind wouldn't change.

If you meant that as some sort of value statement, that the value of our species would somehow be diminished if we achieved strong AI capable of altruistic behavior: I think you're being very naive.
 
I don't think consciousness has been adequately defined so I prefer to answer thus...

Humans, including brains, are purely physical processes. However, we may not be able to fully replicate them via intelligent design.
 
I've got a degree in computer science, and I think we're just machines. Build the right computer and program it right, and it will be every bit as conscious as you and me.
 
What do you mean? If you meant that literally: I think the history of mankind wouldn't change.

If you meant that as some sort of value statement, that the value of our species would somehow be diminished if we achieved strong AI capable of altruistic behavior: I think you're being very naive.

We have not achieved 100% altruistic behavior as a conscious species ourselves, despite huge amounts of "programming" through research, education, communication and culture. What makes you think there is any possibility of programming 100% altruistic behavior into a conscious machine? Are you suggesting ethics is an objective science which can be mathematically proven? Because without 100% certainty of a conscious machine being altruistic, which conscious human wants to put their life at risk to a conscious machine with superior brute force?
 
Consciousness might be a background energy field that we tap into, like radio receivers. Perhaps it is carried by the Higgs boson or the graviton.
Maybe it precedes matter altogether.

I suspect that we know almost nothing on the subject.
It wasn't all that long ago that various ruling classes considered darker skinned people to be lacking in consciousness and other traits of being a human being.

When I was circumcised, it was widely believed that babies didn't feel pain.
My college biology professor tried to convince me that frogs couldn't feel pain.

There still exists an amazing propensity for humans to assume that consciousness resides in their realm only. I doubt we'll ever see straight until we overcome our anthropomorphic chauvinism. Having the crown of creation title is heady stuff, and it pumps up some serious confirmation bias.

Fortunately, other animals have been getting smarter over the years. I've heard that even crows have been solving some problems. They didn't use to solve problems, back in the days when we were focused solely on how to kill them.

Imho, quarks are conscious. The whole shebang is. Philosophy, yes.

How would we go about proving that atoms are not conscious?
 
The poll seems to ignore the theory propounded by Roger Penrose (a gifted mathematician and knowledgeable physicist, but not, I hasten to add, a neurobiologist) that consciousness depends on, erm, quantum.

Now I know you're all groaning at the appearance of that word, but Penrose is not your normal woo. He actually knows what the word means. Probably better than most of us.

I don't personally subscribe to his theory of mind, but I can't dismiss it out of hand (since I'm neither a physicist nor a neurobiologist). His premise is that consciousness can't be mapped onto a Turing machine (general purpose computer, for those of you who aren't in my field) because of, um, well, quantum. I tried to see what he was getting at--I skimmed his best-selling book--but all I could see was an assumption made without any real underlying evidence. But I could have been missing something.

Anyway, my point, I guess, is that any followers of his theory don't really have an option to vote for here. He doesn't really require any new physics or anything (well, aside from the fact that he wants to bring quantum gravity into it for reasons I don't understand), so he doesn't fit into category two, but he definitely doesn't fit into categories one or three either.
 
Also the poll assumes that all physical processes are computable.
The empirical evidence suggests otherwise.
 
"Consciousness is a kind of data processing and the brain is a machine that can be in principle replicated in other substrates, but general purpose computers are just not made of the right stuff."

The idea that we could make a computer conscious may be as fanciful as thinking we can make trees conscious or that we can make lobsters achieve human-level consciousness.

That would require consciousness to be based on physical processes that cannot be simulated by a computer. But in theory, all physical processes can be simulated by a computer, even quantum processes.

(Except possibly true randomness, but that can be achieved by plugging a true-random number generator card into the machine.)
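
Just to illustrate that distinction with a toy Python sketch (purely the randomness point, nothing to do with brains): a seeded pseudo-random generator is an ordinary computable algorithm that spits out the same sequence every run, while os.urandom pulls bytes from the operating system's entropy pool, which is where a hardware random-number card would feed in.

[code]
# Toy illustration only: computable pseudo-randomness vs. OS/hardware entropy.
import os
import random

# A seeded PRNG is a deterministic algorithm: same seed, same "random" sequence,
# so any simulation can reproduce it exactly.
prng = random.Random(42)
print([prng.randint(0, 9) for _ in range(5)])   # identical output on every run

# os.urandom reads the operating system's entropy pool (which a hardware RNG
# card can feed); these bytes are not reproducible from any seed.
print(os.urandom(5).hex())
[/code]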

Hypothetically, an ordinary desktop computer, if given enough external memory and the right software, could simulate a human brain right down to the quantum level. (If you don't mind decades or even centuries passing in the real world for every second that passes in the simulation.)

But consciousness isn't the same thing as intelligence. A mouse would probably be conscious, so there would be no need to simulate a human brain to achieve consciousness.

Putting grandiose ideas of simulating a physical brain down to the quantum level to one side for the moment, I suspect that it would be possible to create a conscious program small enough to be stored on a DVD and capable of running on an ordinary PC. Of course, it'd be difficult to tell if you'd actually succeeded in generating consciousness without incorporating sufficient AI into the program to communicate coherently with it, and that might not be possible with a program of that size.

My concept of consciousness is basically that of an awareness feedback loop. An aware system (i.e., a system capable of observing and analyzing sensory input to develop an understanding of the nature of the source of that input) that possesses an awareness of its own awareness. (I'm not sure if this description is fully coherent. Let me know if it makes sense to you.)
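
In case it helps, here's a toy Python sketch of the kind of loop I have in mind (the names AwareSystem, observe, etc. are just made up for illustration, not any real architecture): the system builds a model of its sensory input, and it also keeps a record of its own modelling activity.

[code]
# Purely illustrative sketch of an "awareness feedback loop"; all names are
# hypothetical and this is not a claim about how real consciousness works.

class AwareSystem:
    def __init__(self):
        self.world_model = []   # beliefs about the sources of sensory input
        self.self_model = []    # observations about its own observing

    def observe(self, sensory_input):
        # First-order awareness: analyze the input and update the world model.
        belief = f"something out there produced: {sensory_input}"
        self.world_model.append(belief)
        # Second-order awareness: the system notes the act of observing itself.
        self.self_model.append(f"I just formed the belief: {belief}")


system = AwareSystem()
system.observe("a red circle")
print(system.world_model)
print(system.self_model)
[/code]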

Also the poll assumes that all physical processes are computable.
The empirical evidence suggests otherwise.


Can you provide us with examples of non-computable physical processes?
 
My concept of consciousness is basically that of an awareness feedback loop. An aware system (i.e., a system capable of observing and analyzing sensory input to develop an understanding of the nature of the source of that input) that possesses an awareness of its own awareness. (I'm not sure if this description is fully coherent. Let me know if it makes sense to you.)

I think I know what you mean, but I think the important bit of consciousness is awareness itself. I agree that there is another kind of consciousness where we are conscious of being conscious (or aware of being aware). This form may not apply to mice, cats, dogs and other Dumb Chums, and may be a distinctly human thing (or at least an intelligent-mammal or advanced-extraterrestrial thing).

I think Sartre (sorry, I don't like to cite him often but he did seem to give a useful illustration here) called the distinction reflective and irreflective (or maybe pre-reflective) consciousness in Transcendence of the Ego.


Having said that...
 
Consciousness might be a background energy field that we tap into, like radio receivers. Perhaps it is carried by the Higgs boson or the graviton.
Maybe it precedes matter altogether.

I suspect that we know almost nothing on the subject.
It wasn't all that long ago that various ruling classes considered darker skinned people to be lacking in consciousness and other traits of being a human being.

When I was circumcised, it was widely believed that babies didn't feel pain.
My college biology professor tried to convince me that frogs couldn't feel pain.

There still exists an amazing propensity for humans to assume that consciousness resides in their realm only. I doubt we'll ever see straight until we overcome our anthropomorphic chauvinism. Having the crown of creation title is heady stuff, and it pumps up some serious confirmation bias.

Fortunately, other animals have been getting smarter over the years. I've heard that even crows have been solving some problems. They didn't use to solve problems, back in the days when we were focused solely on how to kill them.

Imho, quarks are conscious. The whole shebang is. Philosophy, yes.

How would we go about proving that atoms are not conscious?

Isn't this just emptying out any useful meaning of consciousness?

If we start believing atoms and rocks are conscious, isn't it still worth asking whether this is the same thing that we humans mean when we usually talk about consciousness?

I think one of the important points about consciousness is that it usually presupposes consciousness of something, an intentional object or some knowledge. It seems to me that some people make the leap from content being necessary for consciousness to content being not far off consciousness. And I think that many people consider computers to be possible or likely candidates for consciousness because they are receptacles of knowledge and can be programmed to do things that seem somewhat human to us.

But I am not sure if that is sufficient.
 