Saturday, July 10, 2010
Notes from Computing The Mind by Shimon Edelman
Things to look into:
- Understand neurons as statistical computations - "Statistical Computational Neuroscience"
- Carew 2001, "Behavioural Neurobiology: The Cellular Organization of Natural Behavior"
- Barlow 1994, "What is the computational goal of the neocortex?", in "Large-scale neuronal theories of the brain"
- Cooper, Intrator, Blais, and Shouval 2004, "Theory of Cortical Plasticity"
- Marr and Poggio 1977, "From understanding computation to understanding neural circuitry"
- Barn owl hearing, crab vision and movement, mussels
Sunday, June 27, 2010
People to track down
Reviews
"This is a surprising book. On page 1, I was told that it contains a map of the London Underground; on page 275, I discovered why, and the pages accidentally viewed between had me trapped. Almost every one has an illustration, and the text tells its tale in lucid, well-focused paragraphs. You would learn a lot just from flipping its pages, and I can guarantee that you would also be entertained. If you have the slightest interest in the mind and how it works, you should not let this book slip past you." --Horace Barlow, FRS, Retired Royal Society Research Professor of Physiology, University of Cambridge
"...a unique blend of cognitive psychology, artificial intelligence, neuroscience, and philosophy. It will be well received in a survey course in cognitive psychology focusing on artificial intelligence, or the reverse."--PsycCritiques
"Edelman develops the thesis that the mind is computation, grounds it in neuroscience, supports it with empirical findings from diverse literatures, and brings it to bear on classical philosophical issues. Exemplifying the future of the science, the treatment is so inter-disciplinary as to not show favoritism towards a particular discipline. The writing is playful enough for a lay audience, accessible enough for advanced undergraduates, and informative enough for seasoned scientists."--Lawrence W. Barsalou, Samuel Candler Dobbs Professor of Psychology, Emory University
"A well-written and provocative attempt to integrate the detailed findings of psychology and neuroscience within a computational paradigm--where computation boils down not to computers but to organization." --Margaret A. Boden, Research Professor of Cognitive Science, University of Sussex
"This is an awe-inspiring book in multiple senses, but most importantly because it inspires its reader to be awed by the mysteries of the mind, and even more awed by what is known. Dr. Edelman forges a unified and coherent theory that nonetheless encompasses topics as diverse as computation, consciousness, perception, attention, memory, judgment, reasoning, creativity, language, ethics, truth, and beauty. The book offers a rigorous, computational account of cognition that is simultaneously witty, cultivated, and connected to the world."--Robert Goldstone, Chancellors Professor of Psychological and Brain Sciences, and Director of the Cognitive Science Program, Indiana University
"Shimon Edelman has written a fresh book about one of the main mysteries in science today: how the mind works and, in fact, what is the mind? It is easy and fun to read but at the same time almost every page reveals a new intriguing connection between computation, psychology and philosophy. I find it to be deep, challenging, and provocative--wonderful for any curious mind." --Tomaso Poggio, Eugene McDermott Professor, McGovern Institute, Computer Science and Artificial Intelligence Lab, and Brain Sciences Department, Massachusetts Institute of Technology
"As I have long held, philosophers and scientists studying the mind should be required to have first-hand experience of designing, implementing, testing, debugging, documenting, and analyzing working<$> cognitive systems. Simply reading about it is no substitute, though I think reading this book comes pretty close, and, in any case, should accompany doing it."-- Aaron Sloman, Honorary Professor of Artificial Intelligence and Cognitive Science, University of Birmingham
Friday, March 5, 2010
Pinker's definition of consciousness
p. 134 onwards talks about what consciousness is. He says the main features of consciousness are sensory awareness, focal attention, emotional colouring, and the will.
What I want to understand is how I work. Consciousness is probably just one aspect. Maybe cows are conscious, but they don't exhibit the same behaviour as me.
The behaviours I want explained are things like: how is a human able to
- think about what thought is
- decide where to go for lunch
- express their thoughts in language
- think and judge what they are doing
- have a sense of self
- process visual information so they have this virtual reality
What adds complication is that these descriptions include implicit definitions of cognitive abilities. Maybe I can come up with a set of use cases which capture what the mind is. Perhaps what's missing from the use cases above is the idea of sentience. We are not just machines that process stuff 'in the dark'. The idea of sentience, that we see and feel and hear, that more is going on than just the mechanics of information processing... that is what's really interesting.
What is the sentience stuff? The sensation and perceiving? The qualia? It's not just input. It's as if there is an observer set up, and the observer observes things in the form of sensations. But more than that I don't know. That's the core of our reality, though. These sensations and the virtual world created for us from our eyes.
We might be able to produce intelligent machines that can behave outwardly like us, but we probably wouldn't be satisfied with saying they were equivalent to humans until they have that 'light on inside'.
It's not enough for them to be programmed to say they feel pain. How do we actually make them feel pain? How do we actually make them think about themselves? Worry? Want to dance?
We can't observe this stuff because it's the internal working of the mind. Similarly, from the outside we can't see that a program is running on the Java virtual machine. Maybe sentience is itself some kind of machine, a platform through which the other stuff flows.
Imagine you were trying to find out how a Java application worked by observing the electrical activity of a computer. Understanding how the mind works is harder.
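The levels-of-description point can be made concrete with a toy interpreter: the trace of low-level operations (the analogue of "electrical activity") looks nothing like the meaning of the program that produced it. A minimal sketch, with all names and the instruction set invented for illustration:

```python
# A toy stack machine: the "hardware-level" trace of operations
# reveals almost nothing about what the program means.

def run(program):
    """Execute a tiny stack-machine program, logging every low-level step."""
    stack, trace = [], []
    for op, arg in program:
        trace.append((op, arg))          # the observable "electrical activity"
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1], trace

# At the application level this program computes 3 * (4 + 5) = 27;
# at the trace level it is just an undifferentiated stream of ops.
result, trace = run([("PUSH", 3), ("PUSH", 4), ("PUSH", 5),
                     ("ADD", None), ("MUL", None)])
```

Reading the trace tells you which operations fired, but not that the program was "multiplying a price by a quantity", any more than voltages tell you what a Java application is for.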
p132
What good is consciousness? That is, what does the raw sensation of redness add to the train of billiard-ball events taking place in our neural computers? Any effect of perceiving something as red - noticing it against a sea of green, saying out loud "that's red", reminiscing about Santa Claus and fire engines, becoming agitated - could all be accomplished by pure information processing triggered by a sensor for long-wavelength light. Is consciousness an impotent side effect hovering over the symbols, like the lights flashing on a computer or the thunder that accompanies lightning? And if consciousness is useless - if a creature without it could negotiate the world as well as a creature with it - why would natural selection have favoured the conscious one?
Wednesday, February 3, 2010
Why behavioural and computational theories are not enough - Searle's paper continued
He talks about other aspects of consciousness, such as intentionality (we can think of things that refer to the world), familiarity (we have preconceived notions of things and use them to categorize things), and mood. I think these are perhaps aspects of the whole human conscious mind, but they tie into consciousness rather than being a key part of it.
The key part of it, I think, is the subjectivity. But this aspect puzzles us, and our current mode of science has no such notion.
Some Common Mistakes about Consciousness
I would like to think that everything I have said so far is just a form of common sense. However, I have to report, from the battlefronts as it were, that the approach I am advocating to the study of consciousness is by no means universally accepted in cognitive science nor even neurobiology. Indeed, until quite recently many workers in cognitive science and neurobiology regarded the study of consciousness as somehow out of bounds for their disciplines. They thought that it was beyond the reach of science to explain why warm things feel warm to us or why red things look red to us. I think, on the contrary, that it is precisely the task of neurobiology to explain these and other questions about consciousness. Why would anyone think otherwise? Well, there are complex historical reasons, going back at least to the seventeenth century, why people thought that consciousness was not part of the material world. A kind of residual dualism prevented people from treating consciousness as a biological phenomenon like any other. However, I am not now going to attempt to trace this history.
Instead I am going to point out some common mistakes that occur when people refuse to address consciousness on its own terms. The characteristic mistake in the study of consciousness is to ignore its essential subjectivity and to try to treat it as if it were an objective third person phenomenon. Instead of recognizing that consciousness is essentially a subjective, qualitative phenomenon ...
The two most common mistakes about consciousness are to suppose that it can be analysed behavioristically or computationally. The Turing test disposes us to make precisely these two mistakes, the mistake of behaviorism and the mistake of computationalism. It leads us to suppose that for a system to be conscious, it is both necessary and sufficient that it has the right computer program or set of programs with the right inputs and outputs. ... A traditional objection to behaviorism was that behaviorism could not be right because a system could behave as if it were conscious without actually being conscious. There is no logical connection, no necessary connection between inner, subjective, qualitative mental states and external, publicly observable behavior. Of course, in actual fact, conscious states characteristically cause behavior. But the behavior that they cause has to be distinguished from the states themselves.
The same mistake is repeated by computational accounts of consciousness. Just as behavior by itself is not sufficient for consciousness, so computational models of consciousness are not sufficient by themselves for consciousness. There is a simple demonstration that the computational model of consciousness is not sufficient for consciousness. I have given it many times before so I will not dwell on it here. Its point is simply this: Computation is defined syntactically. It is defined in terms of the manipulation of symbols. But the syntax by itself can never be sufficient for the sort of contents that characteristically go with conscious thoughts. Just having zeros and ones by themselves is insufficient to guarantee mental content, conscious or unconscious. This argument is sometimes called `the Chinese room argument' because I originally illustrated the point with the example of the person who goes through the computational steps for answering questions in Chinese but does not thereby acquire any understanding of Chinese.[1]
The point of the parable is clear but it is usually neglected. Syntax by itself is not sufficient for semantic content. In all of the attacks on the Chinese room argument, I have never seen anyone come out baldly and say they think that syntax is sufficient for semantic content.
Not sure about the syntax/semantics distinction there, but maybe it has some bearing.
We don't even know how to describe subjectivity in science, do we?
Mind phenomena as "higher level features of the brain"; Is awareness sufficient for self-awareness?
http://myweb.lsbu.ac.uk/~teamcxx/hkbs/probcons.pdf
He says it is separate from attention, self-consciousness, and knowledge.
Can it really be separate from self-consciousness, though? I can't imagine being conscious without being self-conscious. Maybe something like "ow, there is pain" but not thinking that you are in pain. Maybe like sometimes when I'm really tired and agitated, I'm more aware of the agitation than of my relation to it.
But I think that is really what I was thinking: this "subjective experience", or subjective aspect of feeling, that co-occurs with the sensing of something. Still, it is hard to dissociate it from a sense of self. It's like the self is the observer, and enables the sensations to be possible.
If self-awareness is required for subjective experience, then by that argument animals which don't have self-awareness wouldn't feel pain. Do insects feel pain? Every creature seems to avoid it. But some creatures cry out in pain like we do.
Or maybe self-awareness is more prevalent amongst animals than we think - however, some animals cannot recognise themselves in a mirror. Do these animals cry out in pain? Assuming crying out in pain means you have subjective experience.
From the way Searle describes it below, it sounds like a self is involved:
However, though consciousness is a biological phenomenon, it has some important features that other biological phenomena do not have. The most important of these is what I have called its `subjectivity'. There is a sense in which each person's consciousness is private to that person, a sense in which he is related to his pains, tickles, itches, thoughts and feelings in a way that is quite unlike the way that others are related to those pains, tickles, itches, thoughts and feelings.
I also think it has something to do with attention or awareness - we can't feel something if we are not aware of it.
More pointedly, does the claim that there is a causal relation between brain and consciousness commit us to a dualism of `physical' things and `mental' things? The answer is a definite no. Brain processes cause consciousness but the consciousness they cause is not some extra substance or entity. It is just a higher level feature of the whole system.
These are almost exactly my thoughts: these mental things - thoughts, ideas, beliefs, statements - can be physical things, an arrangement of physical things over time, e.g. state machines. Maybe even the engendering of a subjective experience is one of these physical things. What kind of physical things are they, though? They seem to be the kind of physical entities that are processes. Except they are not just to achieve something; they are something. What other kind of physical thing is like a thought?
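The state-machine idea can be sketched in code: a "belief" as nothing over and above a physical arrangement whose state changes lawfully over time. This is a toy illustration of the concept, not a claim about how real minds implement beliefs; the states and evidence strings are invented:

```python
# A "belief" modelled as a state machine: just states plus a
# transition rule, realized over time by whatever substrate runs it.

class Belief:
    """Tracks whether it is raining, based on a stream of observations."""

    def __init__(self):
        self.state = "unknown"

    def observe(self, evidence):
        # The transition rule *is* the belief's entire physical story.
        if evidence == "wet pavement":
            self.state = "raining"
        elif evidence == "dry pavement":
            self.state = "not raining"
        return self.state

b = Belief()
b.observe("wet pavement")   # the machine now "believes" it is raining
```

The interesting question in the passage above survives the sketch: the state machine achieves something (tracking the weather), but nothing about it obviously is something in the felt sense.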
If we were programmed to think pain felt bad... feelings are thoughts... how is it possible for physical processes to give rise to a self that feels and thinks? It's not obvious from my knowledge of computers and programming.
Searle compares consciousness as a higher-level feature of the brain to liquidity as a higher-level feature of water... but there is a big difference there. What is the difference, exactly?
Friday, January 8, 2010
How are we different from machines which are purely mechanical devices?
What is the internal behaviour? What is the "light on inside"?
How did consciousness evolve? How conscious are other animals?
How are we different from machines which are purely mechanical devices?
How is it possible for me to claim that I have awareness?
Voluntary movement and thought? How is that possible?
How could a machine recognise itself?
Autonomy - we think, and move of our own volition, when there is nothing around. What kind of machine, which is perhaps programmed to constantly process input, heed biological conditions, and pursue its goals... what kind of machine could start questioning why it was doing those things?
Could a machine question things? Could you program something to question things? How would it then start questioning its own behaviour? It would need to be able to observe its own behaviour, and have a way of recognizing itself.
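The minimal mechanical reading of "observing its own behaviour" is easy to build, which sharpens the puzzle: self-monitoring in this thin sense clearly isn't the interesting thing. A sketch with invented class and method names:

```python
# A machine that observes its own behaviour: it keeps a record of its
# actions and can answer questions about that record. This is a purely
# mechanical, thin sense of "self-monitoring".

class SelfMonitoringAgent:
    def __init__(self):
        self.log = []

    def act(self, action):
        self.log.append(action)   # the agent records what it does
        return action

    def why_am_i_doing_this(self):
        # "Questioning itself" here is just inspection of its own record.
        return f"My last {len(self.log)} actions were: {self.log}"

a = SelfMonitoringAgent()
a.act("process input")
a.act("pursue goal")
```

Nothing stops us from adding ever more layers of logs-about-logs; the open question in the notes is why any number of such layers would add up to genuinely questioning one's own behaviour.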
The Chinese room argument - even if you could program a machine to produce the same human behaviour, it doesn't mean it is the same system and therefore has the properties of self-awareness like we do. Chess-playing machines, for example: they can do what we can, but they are built completely differently. We can also do a lot more things.
What if our exact behaviour could be mapped out in terms of inputs, starting state, data and rules? If our behaviour is deterministic, then in theory something could emulate that external behaviour. But things would not be happening the same way, involving the same machinery, so you could not claim the internal behaviour was the same.
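The point that identical external behaviour can hide entirely different internal machinery can be shown with two trivially different "machines" (both functions are made up for illustration):

```python
# Two "machines" with identical external behaviour on a shared domain:
# an explicit rule table vs. an arithmetic mechanism. The input/output
# mapping is the same; the internal machinery is not.

def machine_a(x):
    table = {0: 0, 1: 2, 2: 4, 3: 6}   # explicit stored rules ("lookup")
    return table[x]

def machine_b(x):
    return 2 * x                        # computed on the fly ("mechanism")

# Externally indistinguishable on this domain:
assert all(machine_a(x) == machine_b(x) for x in range(4))
```

An observer restricted to inputs and outputs over this domain cannot tell the two apart, which is exactly why matching external behaviour licenses no conclusion about matching internal behaviour.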
Does 'internal behaviour' make sense? If yes, how do we describe this internal behaviour as part of the physical world?
By 'internal behaviour' I mean sensations and self-awareness. Can you have awareness without self-awareness?
What are these ghostly properties, 'qualia'? Why do we think there is more going on than mechanical computation? What is consciousness?
I can imagine a computer that can process visual input such that it can detect objects. It can be programmed to say that it can see an object when an object has been detected.
When you ask the computer if it can see something, it will say yes, but there is nothing going on other than 'mechanical' computation there.
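The imagined computer that "says it sees" takes only a few lines to write, which is exactly why outward report underdetermines inner experience. A toy sketch; the matching method and the labels are invented stand-ins for real vision:

```python
# A detector that reports "I can see an X" whenever a template matches.
# Nothing but mechanical computation is going on, yet the report is
# exactly what a sighted person would say.

def detect(image, templates):
    """Return a verbal report for each template found in the 'image'."""
    reports = []
    for name, pattern in templates.items():
        if pattern in image:           # crude stand-in for visual matching
            reports.append(f"I can see a {name}")
    return reports

reports = detect("...cup...book...", {"cup": "cup", "lamp": "lamp"})
# reports == ["I can see a cup"]
```

Ask this system whether it can see something and it answers truthfully about its detections; the notes' question is what, if anything, a genuinely seeing system has over and above this.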
We can imagine things we see - form a picture in our mind. Visual representation.
Could you program a computer to recognise its own output?
What is this stuff that we think we experience, that is more than just blind mechanical cause-and-effect chains? I say I can 'see', really see. I say I can see this room and myself and all these colours, and describe what I'm thinking, but what if we are just wired to think and say that? How do I
Sunday, January 3, 2010
What is information?
"Information is a correlation btw two things produced by a lawful process... Causes leave effects
Symbols, their arrangement, somehow preserve the information that existed in the real world phenomena.
Intelligence only makes sense when there is a goal.
"in our daily lives we all predict and explain other ppls behaviour from what we think they know and what we think. They want. "
Information ... There is physical matter. Arrangement of matter, due to causes. Properties of this matter causes the changing of "internal state" .. The arrangement, or the information is retained in another arrangement of matter. This internal arrangement is how we store information.
What are symbols? I think they are different from internal storage of information. The pattern of ones and zeroes representing a number that u feed into pins of a circuit - is this pattern of zeroes and ones a symbol? Perhaps a symbol is a configuration of matter or state. How does symbol relate to information?
Symbols both stand for something and mechanically cause things to happen.
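The dual role of a symbol - it stands for something under an interpretation, and it mechanically causes things through its physical configuration - can be shown with a single bit pattern. A minimal sketch; the "actions" wiring is invented:

```python
# A bit pattern as a symbol: it *stands for* a number (via an
# interpretation scheme) and *mechanically causes* an action
# (via the wiring that the pattern feeds into).

def interpret(bits):
    """What the pattern stands for: an unsigned binary integer."""
    return int(bits, 2)

def actuate(bits, actions):
    """What the pattern causes: it selects and fires an action."""
    return actions[interpret(bits)]()

actions = {5: lambda: "motor on"}   # hypothetical wiring: code 5 -> motor
actuate("101", actions)             # "101" denotes 5 and switches the motor
```

The same physical pattern does both jobs at once, which is Pinker's point: the semantics (what "101" means) and the causation (what "101" does) are kept in step by how the machine is built.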
Do we start off with a set of beliefs and desires which are not learned? If so, how are they encoded?