Wednesday, February 3, 2010

Why behavioural and computational theories are not enough - Searle's paper continued

He talks about other aspects of consciousness, such as intentionality (we can think of things that refer to the world), familiarity (we have preconceived notions of things and use them to categorize them) and mood. I think these are aspects of the whole human conscious mind, but they tie into consciousness rather than being a key part of it.

The key part of it, I think, is the subjectivity. But this aspect puzzles us, and our current mode of science has no notion of it.

Some Common Mistakes about Consciousness

I would like to think that everything I have said so far is just a form of common sense. However, I have to report, from the battlefronts as it were, that the approach I am advocating to the study of consciousness is by no means universally accepted in cognitive science nor even neurobiology. Indeed, until quite recently many workers in cognitive science and neurobiology regarded the study of consciousness as somehow out of bounds for their disciplines. They thought that it was beyond the reach of science to explain why warm things feel warm to us or why red things look red to us. I think, on the contrary, that it is precisely the task of neurobiology to explain these and other questions about consciousness. Why would anyone think otherwise? Well, there are complex historical reasons, going back at least to the seventeenth century, why people thought that consciousness was not part of the material world. A kind of residual dualism prevented people from treating consciousness as a biological phenomenon like any other. However, I am not now going to attempt to trace this history. 

Instead I am going to point out some common mistakes that occur when people refuse to address consciousness on its own terms.  The characteristic mistake in the study of consciousness is to ignore its essential subjectivity and to try to treat it as if it were an objective third person phenomenon. Instead of recognizing that consciousness is essentially a subjective, qualitative phenomenon ... 

The two most common mistakes about consciousness are to suppose that it can be analysed behavioristically or computationally. The Turing test disposes us to make precisely these two mistakes, the mistake of behaviorism and the mistake of computationalism. It leads us to suppose that for a system to be conscious, it is both necessary and sufficient that it has the right computer program or set of programs with the right inputs and outputs. ... A traditional objection to behaviorism was that behaviorism could not be right because a system could behave as if it were conscious without actually being conscious.  There is no logical connection, no necessary connection between inner, subjective, qualitative mental states and external, publicly observable behavior. Of course, in actual fact, conscious states characteristically cause behavior. But the behavior that they cause has to be distinguished from the states themselves.

The same mistake is repeated by computational accounts of consciousness. Just as behavior by itself is not sufficient for consciousness, so computational models of consciousness are not sufficient by themselves for consciousness.  There is a simple demonstration that the computational model of consciousness is not sufficient for consciousness. I have given it many times before so I will not dwell on it here. Its point is simply this: Computation is defined syntactically. It is defined in terms of the manipulation of symbols. But the syntax by itself can never be sufficient for the sort of contents that characteristically go with conscious thoughts. Just having zeros and ones by themselves is insufficient to guarantee mental content, conscious or unconscious.  This argument is sometimes called 'the Chinese room argument' because I originally illustrated the point with the example of the person who goes through the computational steps for answering questions in Chinese but does not thereby acquire any understanding of Chinese.[1]

The point of the parable is clear but it is usually neglected. Syntax by itself is not sufficient for semantic content. In all of the attacks on the Chinese room argument, I have never seen anyone come out baldly and say they think that syntax is sufficient for semantic content. 
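To make the "syntax without semantics" point concrete to myself, here is a toy sketch (my own illustration, not from Searle's paper; the rule book is a hypothetical stand-in for his instructions for manipulating Chinese symbols). The program produces sensible-looking Chinese answers purely by symbol lookup; nothing in it represents meaning.

```python
# A toy "Chinese room": answers are produced by pure symbol
# manipulation (string lookup), with no notion of meaning anywhere.
# The rule book is a hypothetical stand-in for the room's instructions.

RULE_BOOK = {
    "你好吗?": "我很好。",          # "How are you?" -> "I am fine."
    "天空是什么颜色?": "蓝色。",    # "What colour is the sky?" -> "Blue."
}

def room(question: str) -> str:
    """Follow the rules syntactically; fall back to a stock symbol."""
    return RULE_BOOK.get(question, "我不明白。")  # "I don't understand."

print(room("你好吗?"))  # behaves as if it understands; it doesn't
```

The point of the sketch: the lookup would behave the same whatever the strings "meant", which is exactly why behavior and computation alone seem insufficient.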



I'm not sure about the syntax/semantics distinction there, but maybe it has some bearing.

We don't even know how to describe subjectivity in science, do we?

Mind phenomena as "higher level features of the brain"; Is awareness sufficient for self-awareness?

http://myweb.lsbu.ac.uk/~teamcxx/hkbs/probcons.pdf

He says it is separate from attention, self-consciousness and knowledge.

Can it really be separate from self-consciousness, though? I can't imagine being conscious without being self-conscious. Maybe something like "ow, there is pain", but without thinking that you are in pain. Maybe like sometimes when I'm really tired and agitated, I'm more aware of the agitation than of my relation to it.

But I think that is really what I was thinking: this "subjective experience", or subjective aspect of feeling, that co-occurs with the sensing of something. Still, it is hard to dissociate it from a sense of self. It's like the self is the observer, and enables the sensations to be possible.

If self-awareness is required for subjective experience, then by that argument animals which don't have self-awareness wouldn't feel pain. Do insects feel pain? Every creature seems to avoid it, but some creatures cry out in pain like we do.

Or maybe self-awareness is more prevalent among animals than we think. However, some animals cannot recognise themselves in a mirror. Do these animals cry out in pain? (Assuming that crying out in pain means you have subjective experience.)

From the way Searle describes it below, it sounds like a self is involved:
However, though consciousness is a biological phenomenon, it has some important features that other biological phenomena do not have. The most important of these is what I have called its 'subjectivity'. There is a sense in which each person's consciousness is private to that person, a sense in which he is related to his pains, tickles, itches, thoughts and feelings in a way that is quite unlike the way that others are related to those pains, tickles, itches, thoughts and feelings.

I also think it has something to do with attention or awareness: we can't feel something if we are not aware of it.

More pointedly, does the claim that there is a causal relation between brain and consciousness commit us to a dualism of 'physical' things and 'mental' things? The answer is a definite no. Brain processes cause consciousness but the consciousness they cause is not some extra substance or entity. It is just a higher level feature of the whole system.

These are almost exactly my thoughts. These mental things (thoughts, ideas, beliefs, statements) can be physical things: an arrangement of physical things over time, e.g. state machines. Maybe even the engendering of a subjective experience is one of these physical things. What kind of physical things are they, though? They seem to be the kind of physical entities that are processes, except they are not just to achieve something; they are something. What other kind of physical thing is like a thought?
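A minimal sketch of what I mean by a state machine (the state names and events are my own hypothetical illustration): the "state" here is nothing over and above an arrangement of states and transitions evolving over time.

```python
# A minimal finite-state machine: its "state" is nothing but an
# arrangement of states plus transitions applied over time.
# State names and events are hypothetical illustrations.

TRANSITIONS = {
    ("calm", "sting"): "pain",
    ("pain", "soothe"): "calm",
    ("pain", "sting"): "pain",
    ("calm", "soothe"): "calm",
}

def step(state: str, event: str) -> str:
    """Advance the machine one tick; unknown events leave state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "calm"
for event in ["sting", "sting", "soothe"]:
    state = step(state, event)

print(state)  # calm -> pain -> pain -> calm
```

Whether anything like subjective experience could be one more "higher level feature" of such a process is exactly the open question.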

If we were programmed to think pain felt bad ... feelings are thoughts ... how is it possible for physical processes to give rise to a self that feels and thinks? It's not obvious from my knowledge of computers and programming.

Searle compares consciousness as a higher-level feature of the brain to liquidity as a higher-level feature of water ... but there is a big difference there. What is the difference, exactly?