A Conversation for John Searle's 'Chinese Room' Argument
Intelligence by lookup table
Gnomon - time to move on Posted Aug 1, 2003
You seem to be highlighting the fact that you are a conscious individual by pointing out that you would object to being replaced by a machine. My point was that a machine might also object to being replaced, so it may be that the machine is also a conscious individual.
All arguments against machines thinking seem to go along the lines of:
"Machines can't think. The idea is ridiculous, absurd! Therefore machines can't think."
Intelligence by lookup table
Joe Otten Posted Aug 1, 2003
OK, but what do you think of the objection? Is it reasonable? Would you object to being killed and replaced by a functionally equivalent machine? Why?
Intelligence by lookup table
Gnomon - time to move on Posted Aug 1, 2003
It's an interesting question, and many people have speculated on it. If I were replaced by a machine with all my memories, I'd probably just feel that I had been zapped into the body of a machine.
Something similar is the Star Trek 'Transporter', which in effect dismantles a person at the transmitting station and constructs a copy of the person at the receiving station, complete with memories. Has the person been moved? Has a murder been committed? What happens if something goes wrong and the original is not dismantled, so that there are two versions of the person? This happened in one episode of Star Trek: The Next Generation.
Roger Penrose, the mathematical physicist, speculated that there is something in the brain which operates at the quantum level to create consciousness. This would then be unreproducible due to Heisenberg's Uncertainty Principle, so the question doesn't arise. That's a cop-out in my opinion.
Intelligence by lookup table
Joe Otten Posted Aug 1, 2003
I'm tempted here to invoke the concept of identity. Its use in software is clear: two objects or records might contain the same data, and therefore behave identically, but have different identities (a different primary key or memory location). From the user's point of view they are functionally identical. The fact that the user never sees the primary key or memory location is analogous to nobody sharing our subjective experience of consciousness.
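The equality-versus-identity distinction described above can be sketched in a few lines of Python. The `Person` class here is purely illustrative (not anything from the original discussion): two instances hold the same data and so compare equal, yet remain distinct objects at different memory locations.

```python
from dataclasses import dataclass, field

@dataclass
class Person:
    # Functional data: two Persons with the same fields compare equal (==)
    name: str
    memories: list = field(default_factory=list)

original = Person("Joe", ["childhood", "yesterday"])
replica = Person("Joe", ["childhood", "yesterday"])

# Functionally identical: same data, same behaviour
print(original == replica)   # True

# Distinct identities: different objects at different memory locations
print(original is replica)   # False
print(id(original) == id(replica))   # False
```

In database terms, `id()` plays the role of the primary key: invisible to anyone who only interacts with the record's contents, yet sufficient to distinguish the original from a perfect copy.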
So, the functional copy of me is not me - it does not have my identity. What I care about is the survival of my identity, not my functionality.
This all seems quite straightforward, but perhaps difficult to put into entirely empirical terms. This identity can't be independently measured, so perhaps it is metaphysical rather than scientific.
Does functionalism, by assuming the adequacy of purely empirical scientific evidence, throw out what is in fact a sound objection to murder-and-replacement?