Sunday, March 18, 2012
The heterogeneity of intentionality
The problem of intentionality itself decomposes further into two interrelated but distinguishable problems. The topic of this chapter is the problem of mental representation. Formal systems of representation such as languages have, supposedly, the property of meaning (which I will usually call the semantic property or, interchangeably, the intentional property). Symbols refer to, are about, things other than themselves (the neologism “aboutness” also expresses this property), while physical things (or things described and explained in physical terms) do not have any such property (physical explanations are “closed,” that is, they include only physical terms). A naturalized semantics of psychological predicates would be free of reference to non-physical properties, but even our current neurophysiology textbooks tend to present information-processing models of nervous system function (and the popular conception of the mind is of something full of images, information and language).
The problem of meaning is also addressed in philosophy of language, but language and other symbol systems are conventional (albeit the products of long evolutionary processes): the location of the ur-problem is in philosophy of mind. Consider the chair in which you sit. It (the chair) does not mean anything. Of course you can assign some arbitrary significance to it if you wish (“When I put the chair on the porch the coast is clear”), or infer things from its nature, disposition and so forth (“Who’s been sitting on my chair?”), but that doesn’t affect the point: physical objects in and of themselves don’t mean anything or refer to other things the way symbols do. Now consider your own physical body: it doesn’t “mean” anything any more than the chair or any other physical object does. Nor do its parts: your hand or, more to the point, your brain, or any parts of, or processes occurring in, your brain. Your brain is just neural tissue humming and buzzing and doing its electrochemical thing, and the only properties included in our descriptions and explanations of its workings are physical properties. But when we predicate of a person mental states such as “He believes that Paris is the capital of France” or “She hopes that Margaret is at the party tonight,” these mental states appear to have the property of referring to, of being about, something else: France or Margaret or what have you. It looks, that is, as if the mental state has a property that the physical state utterly lacks: a non-physical property.
The operationalist theories of mind developed by philosophers in the early 20th century are largely a response to the problem of representation, although they reach a variety of conclusions: behaviorism is straightforwardly eliminativist about mental content, limiting the possible criteria for the use of psychological predicates to intersubjectively observable things (granting that there are “strong” and “soft” versions of the theory). Computationalism holds that minds are formal rule-governed symbol-manipulating systems. It aims at radically minimizing the symbol system (as in binary-code machine language, for example) but is committed to symbolic content per se, as a computer is defined, à la Turing, as a formal rule-governed symbol-manipulating device.
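To see what this minimal, formal conception amounts to, here is a small illustrative sketch in Python (a toy example of my own, not anything drawn from the literature): a one-tape machine whose only resources are a finite rule table and a tape of symbols, and which inverts a binary string.

    # A rule maps (state, symbol) to (symbol to write, head move, next state).
    def run(rules, tape, state="start", head=0, blank="_"):
        cells = dict(enumerate(tape))
        while state != "halt":
            symbol = cells.get(head, blank)
            write, move, state = rules[(state, symbol)]
            cells[head] = write
            head += {"R": 1, "L": -1}[move]
        return "".join(cells[i] for i in sorted(cells)).strip(blank)

    # Rule table: flip each bit, halt at the first blank cell.
    rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }

    print(run(rules, "0110"))  # prints 1001

Nothing in the machine’s operation appeals to meaning: the rules are defined over symbol shapes alone, and that is the sense in which computationalism remains committed to a symbol system per se.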
Functionalism proposes a psychology that is described purely in functional terms rather than physical terms. This allows for replacing references to representations with references to functionally equivalent, not-necessarily-representational states, but in its very abstraction functionalism does not commit to eliminating representations (functionalism may be more of a method than a theory). This chapter develops an operationalist semantics of intentional predicates that not only dispenses with any reference to mental representation (as behaviorism and functionalism do) but goes further, offering an eliminativist account that rules out the very possibility of mental content.
The other part of the problem of intentionality is the problem of rationality. Rationality is multiply realizable (a closely related notion is supervenience). For example, a human being, a dolphin, a (theoretically possible) rational artifact and a (probably existing) rational extraterrestrial can all grasp and make use of, say, transitivity (“If X then Y, if Y then Z, therefore if X then Z”). But these beings are made of different substances organized in different ways. There are, apparently, no physical states or properties that are necessary for all rational beings: no physical criteria that fix the extension of the set of all rational beings. There are no psychophysical laws regarding rationality, that is, no generalizations to the effect that any being with such-and-such logical capacity must have such-and-such physical characteristics, or vice versa.
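Spelled out in standard logical notation, the inference pattern in question (transitivity of the conditional, also known as hypothetical syllogism) is:

    \[ (X \to Y),\ (Y \to Z)\ \vdash\ (X \to Z) \]

The schema contains no vocabulary for neurons, silicon or anything else physical; whatever realizes the reasoner, grasping transitivity is grasping this same form.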
The problem of mental representation and the problem of rationality can be distinguished as separate metaphysical problems. We would still be confronted with the problem of rationality even if nobody subscribed to a representational theory of mind. Nonetheless the two sub-problems should be grouped together under the general rubric of the problem of intentionality, because both are problems for the same set of psychological predicates, the intentional predicates: “believes,” “desires,” “hopes,” “fears” and so on. Intentional predicates name states that apparently entail mental content (one believes that X, fears that Y, and so forth), and they also apparently entail rationality. Saying of a person that he left the room “because he was thirsty” is explanatory only if we share the background assumption that if he believes that there is water at the fountain and desires to have water then, all other things being equal, he will form an intention to go to the fountain. That is, a person must show some minimal grasp of the logical relations between intentional states both to be a subject of intentional predication and to make use of it (this is commonly referred to as the rationality assumption).
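That background assumption can be given a schematic rendering. The operators Bel, Des and Int below are informal shorthand introduced here only for illustration (belief, desire and intention), with the ceteris paribus qualification marked on the arrow:

    \[ \mathrm{Bel}(a,\ \text{there is water at the fountain}) \ \wedge\ \mathrm{Des}(a,\ \text{water}) \ \Rightarrow_{cp}\ \mathrm{Int}(a,\ \text{go to the fountain}) \]

The explanation “he was thirsty” works only against the backdrop of schemata like this; a subject who failed to conform to them even roughly would not be a candidate for intentional predication at all.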
I think that I can provide a satisfactory response to the problem of propositions as bearers of logical relations and to the problem of rationality generally, although the result is somewhat surprising in the context of the overall naturalist project of this book. However, the problem of mental representation will be discussed first, because it is important to see that even if we were to reject the representational theory of mind (as I think we should) we would still be confronted with the problem of rationality.
Labels: intentionality, mental representation, rationality