The Systems Reply
While the person in the room may not understand Chinese, the system as a whole (the person, the rule book, the symbols, and the room itself) does. The room, viewed as a single entity, produces output that demonstrates understanding.
Searle’s Reply: Searle simplifies the list of physical objects: what happens if the man memorises the rules and keeps track of everything in his head? The whole system then consists of just one object, the man himself. If the man does not understand Chinese, Searle argues, then neither does the system, because “the system” and “the man” now describe exactly the same object. Either way, nothing more is going on than the kind of rule-following sketched below.
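To make the debate concrete, here is a minimal sketch (an illustration assumed here, not drawn from Searle) of the “system” as purely formal symbol manipulation: a lookup table mapping input symbol strings to output symbol strings. The table entries and the chinese_room function name are invented placeholders.

```python
# A lookup table standing in for the rule book: input symbol strings map to
# output symbol strings. The entries are invented placeholders.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",          # "How are you?" -> "I'm fine, thanks."
    "今天天气怎么样？": "今天天气很好。",  # "How is the weather today?" -> "It is fine today."
}

def chinese_room(input_symbols: str) -> str:
    """Return whatever the rule book dictates for the given symbols.

    The symbols are never interpreted; they are only matched by shape
    (here, plain string equality) against the table.
    """
    return RULE_BOOK.get(input_symbols, "对不起，我不明白。")  # "Sorry, I do not understand."

print(chinese_room("你好吗？"))  # a fluent-looking reply, produced with zero understanding
```

However fluent the replies look, nothing in the program attaches meaning to the symbols, and that is exactly the property the Systems Reply and Searle disagree about.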
The Robot Reply
A computer confined to the room, as described, cannot understand Chinese. But if the program were placed in a robot equipped with sensors to perceive the world and motors to act on it, the robot would thereby gain genuine understanding (and so understand Chinese).
Searle’s Reply: Adding a robotic body with sensory inputs and motor outputs does not give the system understanding. It does not produce genuine mental states; it only yields more complex symbol manipulation. In other words, the physical embodiment simply adds another layer of formal symbol processing, as the sketch below illustrates.
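A minimal sketch of that point, under the assumption that perception and action can be modelled as symbol conversion wrapped around the same rule-following core (the symbols, threshold, and function names here are hypothetical):

```python
# Hypothetical symbols and threshold: perception becomes a symbol, the core
# consults a rule table, and the result is handed to a motor layer.
RULES = {"OBSTACLE_AHEAD": "TURN_LEFT", "CLEAR_PATH": "MOVE_FORWARD"}

def sense(distance_cm: float) -> str:
    """Sensor layer: a rangefinder reading rendered as a symbol string."""
    return "OBSTACLE_AHEAD" if distance_cm < 50 else "CLEAR_PATH"

def act(command: str) -> None:
    """Motor layer: another symbol consumed, no understanding required."""
    print(f"motor command issued: {command}")

# Perception -> symbol -> rule lookup -> symbol -> actuation.
act(RULES[sense(30.0)])   # motor command issued: TURN_LEFT
act(RULES[sense(120.0)])  # motor command issued: MOVE_FORWARD
```

The sensor and motor layers only translate between physical quantities and symbols; the decision in the middle is still a table lookup.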
The Brain Simulator Reply
If the computer’s program simulated the actual neural firings of a native Chinese speaker’s brain, then the computer would understand Chinese, just as the native speaker does.
Searle’s Reply: Such a simulation does not reproduce the important features of the brain, namely its causal powers and intentional states. Searle is adamant that “human mental phenomena [are] dependent on actual physical–chemical properties of actual human brains.” Simulating the firing pattern, as sketched below, is still just formal manipulation of numbers.
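As an illustration of what “simulating neural firings” amounts to computationally, here is a minimal sketch (assumed here, not taken from Searle or his critics) of a leaky integrate-and-fire neuron; the threshold, leak factor, and input values are arbitrary.

```python
# Arbitrary constants for illustration only.
THRESHOLD = 1.0  # potential at which the model "neuron" fires
LEAK = 0.9       # fraction of potential retained each time step

def simulate(input_currents: list[float]) -> list[int]:
    """Leaky integrate-and-fire neuron: emit 1 for a spike, 0 otherwise."""
    potential = 0.0
    spikes = []
    for current in input_currents:
        potential = potential * LEAK + current
        if potential >= THRESHOLD:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

print(simulate([0.3, 0.4, 0.5, 0.1, 0.9]))  # [0, 0, 1, 0, 0]
```

Whatever its biological fidelity, the simulation manipulates numbers that stand for potentials and spikes; on Searle’s view, that is precisely why it does not reproduce the brain’s causal powers.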
The Other Minds Reply
We attribute understanding to other people solely on the basis of behavioural evidence, since we cannot inspect their minds directly. If a computer satisfies the same behavioural criteria, we are equally justified in attributing intelligence to it.
Searle’s Reply: Searle responds that such a mind is, at best, a simulation, writing: “No one supposes that a computer simulation of a rainstorm will leave us all drenched.” Others counter: “When we call up the pocket calculator function on a desktop computer, we don’t complain that ‘it isn’t really a calculator’, because the physical attributes of the device do not matter.” So is the human mind like the pocket calculator, essentially composed of information? Or is it like the rainstorm, something other than a computer, and not fully realisable by a computer simulation?