What are qualia (a sensation, or a property thereof)? They are very real to us. Where do they belong among real-world phenomena? Are they not like a thought? Aren't thoughts the same kind of intangible thing? Like a sentence? How are they different from a sentence? One difference is that a sentence has an observable representation: the sounds we utter.
How do we form an internal representation from sensory input, and how do we trigger it again, so that we can think of something when it is not in front of us?
Saturday, December 26, 2009
Tuesday, December 15, 2009
Iconic transfer, consciousness requires control and ownership - Igor Aleksander's thoughts
What is a 'sensation' or feeling? Can an artificial system have sensations and feelings? Like 'consciousness', the word 'sensation' is one for which we develop a meaning based entirely on experience.
--Igor Aleksander, "A Neurocomputational View of Consciousness"
Maybe I need to read some Wittgenstein and understand the "non-essentialism of language"
Consciousness is in need of a scientific enquiry.
Consciousness is altered by drugs. Why/how?
Igor thinks we should study consciousness with artificial neurons because there aren't good enough measuring techniques to study real neurons.
He thinks machines can have meaningful sensations by something called "iconic transfer".
Building a mental image of what the world is like.
Igor says consciousness requires the organism to have a sense of control and ownership. Are these the things missing from that Chinese room AI?
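Iconic transfer can be toyed with in code. Here is a minimal sketch, using a Hopfield-style associative memory as a stand-in for Aleksander's neural state machines (my simplification, not his actual architecture): an internal state is trained to mirror a sensory pattern, and can later be re-triggered from a partial cue, which is roughly the earlier question of thinking of something when it's not in front of us.

```python
import numpy as np

# Toy "iconic transfer": store a sensory pattern as an internal
# attractor state, then recall it from a partial cue.
# (Hopfield-style associative memory as a stand-in for Aleksander's
# neural state machines -- a deliberate simplification.)

def train(patterns):
    """Hebbian learning: internal weights come to mirror the inputs."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)  # no self-connections
    return w / len(patterns)

def recall(w, cue, steps=10):
    """Re-trigger the stored pattern from a partial/noisy cue."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(w @ s)
        s[s == 0] = 1  # break ties consistently
    return s

# One "sensory" pattern of +1/-1 pixels.
icon = np.array([1, -1, 1, 1, -1, -1, 1, -1])
w = train(icon[None, :])

# Corrupt two pixels, then recall: the internal state settles back
# to the stored icon even though the stimulus itself is absent.
cue = icon.copy()
cue[0] = -cue[0]
cue[3] = -cue[3]
print(np.array_equal(recall(w, cue), icon))  # prints True
```

Of course, nothing here addresses whether the machine *feels* anything about the recalled pattern; it only shows the transfer-and-retrigger mechanics.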
Can we infer consciousness through functional equivalence?
I was chatting to Mark tonight about the consciousness stuff. I ended with the idea that subjective experiences seem impossible to prove, but what if we could show two minds to be equivalent, by way of modelling their execution or function?
Mark said that we can't even do that for two humans, and we especially can't do it between a human mind and an artificial mind. An artificial mind had an intelligent designer, so its function is completely determined, whereas we cannot know whether we have fully captured the functioning of the human mind. There might be some aspect we have missed, and we would never know; perhaps that missed aspect is significant enough to make the two systems inequivalent.
I said, well, what about "for all practical purposes"? We are able to explain things at a molecular level, like ATP production or photosynthesis, and we could just use Occam's razor and assume that the simplest working model is good enough. I know this still doesn't preclude the possibility that there is some process we aren't aware of that is responsible for the subjective experience we are trying to pinpoint.
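The "for all practical purposes" position can at least be made concrete for programs, if not for minds. A hypothetical toy sketch: two differently built implementations are declared behaviourally equivalent when they agree on every probed input, which, as the objection goes, only ever establishes equivalence over the inputs we thought to try.

```python
import random

# Two different "designs" of the same function: are they equivalent?
# (A toy illustration of functional equivalence, not a model of minds.)

def mind_a(xs):
    """Iterative sum."""
    total = 0
    for x in xs:
        total += x
    return total

def mind_b(xs):
    """Recursive sum -- a different mechanism, same function?"""
    return 0 if not xs else xs[0] + mind_b(xs[1:])

def behaviourally_equivalent(f, g, trials=1000, seed=0):
    """Equivalence 'for all practical purposes': agreement on every
    input we happened to probe. It can never rule out a divergence
    on some untried input -- exactly the objection above."""
    rng = random.Random(seed)
    for _ in range(trials):
        xs = [rng.randint(-100, 100) for _ in range(rng.randint(0, 20))]
        if f(xs) != g(xs):
            return False
    return True

print(behaviourally_equivalent(mind_a, mind_b))  # prints True
```

The test passing tells us the two mechanisms behave the same over the probed inputs, not that they are the same in any deeper sense; whether behavioural agreement would license inferring consciousness is the open question.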
But the fact that Mark went so far as to say that you can't assume two human minds are equivalent annoyed me. He also started making some point about how perhaps I think they are equivalent because the alternative, i.e. accusing someone of not being sentient, is undesirable.
I started getting pissed off at him, and didn't want to continue the argument. I felt that he wasn't taking my arguments seriously and at times completely missing the point I was trying to make.
It is not helpful to get emotional when arguing. I need to practice not getting emotional, and instead dealing with the thoughts and ideas that come up, and see where they lead.
One thing I did find interesting in what Mark said: he suggested that we seem to only require a theory when there is some deviation that needs to be explained. E.g. we assume that other people are sentient because there is no reason not to? Something like that.
What are sensations? How can we prove their existence?
Sensation is a subjective experience. Take pain as an example. Pain is a sensation. I am not just avoiding things mechanically, and I am not just mechanically saying I feel pain; I actually feel pain, and I cannot deny it.
Could consciousness be considered a sensation? It is the sensation we have of being alive, of perceiving. How does it relate to awareness and self-awareness?
Monday, December 14, 2009
Understanding has something to do with self-awareness
Maybe one of the problems in defining understanding, and in accepting the idea of an AI understanding something, is our common definition of the word "understanding": it is something that humans do, that requires self-awareness and intelligence.
We are puzzled by awareness and self-awareness. It is odd how self-awareness interacts with the understanding process, i.e. understanding is in part a conscious process.
What is understanding? What would it take for a machine to understand?
What is understanding? What do we mean when we say we understand something but a machine does not, even if it can give the same answers? What would it take for a machine to understand?