Searle: Is the Brain’s Mind a Computer Program?
 

We can distinguish different ways of ascribing intentional states to things:

We sometimes ascribe intentional states to computers and other man-made devices, e.g., when we say that our chess-playing computer wants to take the bishop. But our chess-playing computer probably does not have any original intentionality because it's too simple a device; at most, it has only a derivative intentionality depending on that of its programmers.
The functionalist, however, believes that if we have a computer running a sophisticated enough program, then the computer will have its own original intentional states just in virtue of running that program. This is the view Searle wants to argue against.

1. Two types of AI:

  1. Strong AI: A machine can think just in virtue of implementing a computer program because the program itself is constitutive of thinking. In particular, a program passing the Turing test is a mind.

     NOTE: this is a view to which functionalism is committed.

  2. Weak AI: computer models of minds are useful tools, but just as a model of the weather is not the weather, so a model of the mind is not the mind. Simulation is not duplication.


2. Attack on strong AI

A. The Chinese Room Analogy. Note that in the setup, the manipulation of symbols is purely syntactic.
Some objections and replies to the Chinese Room analogy:

B. The formal argument.
  1. Computer programs are formal (syntactic).
  2. Human minds have mental contents (semantics).
  3. Syntax by itself is neither constitutive of nor sufficient for semantics.

     NOTE: This is taken as a logical truth because the same syntactical system admits of infinitely many semantics.

  4. Hence, programs are neither constitutive of nor sufficient for minds.
  5. So, strong AI is false.

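The formal argument above can be checked as a valid derivation. Here is one possible formalization in Lean (my own sketch, not Searle's notation): premises 1 and 3 do the deductive work, while premise 2 functions as a bridge principle, i.e., anything sufficient for a mind would have to be sufficient for semantics. All predicate names (`Program`, `Syntactic`, `SuffForSemantics`, `SuffForMind`) are illustrative labels I have introduced.

```lean
-- Hypothetical formalization of Searle's formal argument.
-- Read `SuffForSemantics x` as "x suffices for semantics", etc.
variable {System : Type}
variable (Program Syntactic SuffForSemantics SuffForMind : System → Prop)

theorem searle
    -- Premise 1: programs are purely syntactic.
    (p1 : ∀ x, Program x → Syntactic x)
    -- Premise 3: syntax does not suffice for semantics.
    (p3 : ∀ x, Syntactic x → ¬ SuffForSemantics x)
    -- Bridge (the role premise 2 plays): whatever suffices for a
    -- mind must suffice for semantics, since minds have semantics.
    (bridge : ∀ x, SuffForMind x → SuffForSemantics x) :
    -- Conclusion: no program suffices for a mind.
    ∀ x, Program x → ¬ SuffForMind x := by
  intro x hp hm
  exact p3 x (p1 x hp) (bridge x hm)
```

The formalization makes explicit that the argument is valid only given the bridge reading of premise 2; disputes about the argument therefore target the truth of premise 3, not the logic.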


3.  Searle's positive view

Searle is not a substance dualist.  He believes that brains cause minds, but minds are not things different from brains; rather, minds are causally emergent properties of neural activity (no mentality in single neurons).  Hence, any system capable of causing minds must duplicate the specific causal powers of the brain, and this cannot be done merely by running a formal program.  At times he seems to toy with epiphenomenalism, and yet he rejects that view.
 

4. Two connected reasons for resistance to Chinese room:

  1. behaviorist residue in the Turing test: the belief that psychology must restrict itself to external behavior, and that therefore simulation entails duplication.
  2. dualist residue: strong AI allows one to deny that the mind is as biological as digestion.


5. A development: Searle now believes that strong AI is not merely wrong but incoherent. There is no syntax without a subject considering a physical system as a syntactical system. Hence, a computer without a subject considering it a syntactical system is merely a physical system. Just as semantics is not reducible to syntax, so syntax is not reducible to the physical system.