
The Functional Spirit of Computation Abdul Deeb

UCLA, 12/21/2015

Introduction

Mental representation, how it functions, and the metaphysics of these mental states have long

been a preoccupation of philosophers. Theories concerning the metaphysics of mind fall roughly

into two opposing sides. Internalism of mental kinds is a thesis stating that an agent’s mental

states supervene on internal physical states of the agent. In individuating these states there is no

reason to make reference to the individual’s physical or social environments. Psychological

states are narrowly individuated. Externalistic theories claim that the nature of these states

depends, essentially, on their relation to their environment.

Psychology has made significant progress in explaining visual perception through the

use of computational theory. Marr’s theory of vision exemplifies this claim. The problems

of mind become more explicit and the answers less straightforward when examining the mind as

a computational system.

Perception is an information-processing task that allows an organism to represent its

environment. What it is to be a token perceptual state is to have representational content.

Representational content has three explanatory functions. (i) It is a mode of representing. The

particular mode used to represent helps to individuate the psychological state. (ii) It acts as a

basic constituent of psychological and linguistic kinds. (iii) Lastly, it sets the conditions under

which the psychological state can be evaluated for veridicality. If the content of the state is

internalistic, it will not be individuated by reference to the subject’s environment. All that

matters in individuating the state is the internal structure of the state. Externalistic theories of


perception hold that representational content is evaluated in regard to the environment. The

perceptual state, then, will depend on the environment of the organism having said state.

Frances Egan claims that computational states are individuated without essential

reference to their representational contents. They are individuated by their

mathematical structure, which, according to Egan, is narrow. If two computational states have the

same mathematical formalism, they are identical even if they differ in function. I will argue that

there is no reason to favor Egan’s non-semantic individuation of computational states. I also

claim that the function of the state, what the state aims to do, is an individuation condition of that

state. Furthermore, I will argue that the nature of any adaptive biological state, necessarily,

depends on the environment. The reasons why the state was evolutionarily maintained are

essential in identifying the nature of the state. The reasons the state was maintained and adapted

for are straightforwardly externalistic.

I will, first, narrow my discussion of computation to Marr’s theory of vision. I will, then,

provide a general account of Egan’s picture of Marr’s theory of vision. In the following section, I

further specify a thought experiment Egan implements to argue for the narrow individuation of

computational states. I also show that the conclusions Egan gathers from her thought experiment

are unfounded. In section four, I argue for the role of function and adaptation in individuating

biological states.

1. Marr: The Three Levels, the Role of Representation, and the Problems Solved.

Explanations of computation are theory-laden. Attempting to provide a general account is, in my

opinion, a useless enterprise. There exist few properties that are common among the many

descriptions of the concept. To understand the nature of a computational process that some


particular theoretical framework invokes, the explanatory role that framework assigns to

computation must be addressed. This is to say, aspects of computation are theory-specific.

Marr’s theory (1982) has three explanatory levels. These levels are used to explain the

perceptual system, but this methodology has become commonplace in theoretical cognitive

science in explaining other information-processing systems. The first level of theory, the

computational level, is the most abstract level. Here, the system is explained as a mapping from

one kind of information to another. This level sets a goal and specifies an appropriate strategy for

obtaining that goal. The goal of the computation is a particular output. The function of a

computational theory is to explain how the goal has been fulfilled. The strategy of the

computation sets constraints which bear on the total computational theory. The constraints are

facts pertaining to the conditions of the environment and assumptions based on these facts. In

explaining vision, Marr uses these constraints to explain how humans produce accurate

perceptions of the distal environment from only the proximal scene.1 The next level, the

algorithmic level, is a realization of the computational level. This level assigns representations as

inputs and outputs, then formulates a way in which the input is transformed into the output. The

task of the algorithmic level is, then, to formulate a way in which the system reliably generates

accurate representations of distal stimuli from the retinal images. The focus of this paper will be

on the first two levels, the level of computation and the algorithmic level. The last level, the

implementational level, specifies the physiological mechanism that physically carries out the

algorithm. This level deals with how those computational processes are actually physically

implemented. In humans the implementation level makes reference to the visual cortexes, the

1 To say that a perception is accurate is roughly to say that the state (i) is caused by what it is a perception of, (ii) represents certain basic mind-independent features of reality, and (iii) resembles what it represents.


eyes, and the optic nerve pathways. It is feasible, though, that the same computationally

described system be instantiated in a variety of physical systems.

The computational level is a function in extension, defining sets of possible inputs and

outputs. The algorithmic level is a function in intension, explaining how a

particular input is transformed into its paired output.

Computational level: input → output

Algorithmic level: f(input¹) → output¹ (= input²) → output² (= input³) → ... → output

The first level is used to answer ‘what’ and ‘why’ questions about the theory and the second

level is used to answer ‘how’ questions concerning the process described in the first level.
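
To make the division of labor concrete, here is a minimal, purely illustrative Python sketch, not anything drawn from Marr: the toy task, the SPEC table, and the find_edge procedure are my own inventions. The SPEC table plays the part of a computational-level description (a function in extension: which output goes with which input), while find_edge plays the part of one algorithmic-level realization (a function in intension: a particular procedure that yields that pairing).

# Toy computational-level specification, stated extensionally as an input/output table.
# (Hypothetical task: return the position of the largest jump in a row of luminance samples.)
SPEC = {
    (10, 10, 10, 200, 200): 3,   # the jump occurs between index 2 and index 3
    (5, 7, 90, 91, 92): 2,
}

def find_edge(samples):
    """One algorithmic-level realization: scan the first differences and return the argmax."""
    diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    return diffs.index(max(diffs)) + 1

# The particular procedure satisfies the extensional specification on these inputs.
for stimulus, expected in SPEC.items():
    assert find_edge(stimulus) == expected

Any number of other procedures could satisfy the same extensional specification; that point becomes important in section 3.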

The task of a theory of vision is to explain how our perceptual system operates. This

problem is specified in the first level of explanation. More specifically, the aim of Marr’s theory

is to explain (i) under what conditions accurate perceptions are formed, (ii) under what

conditions illusory perceptions are formed, and (iii) how sensory registrations of light arrays are

converted into representations as of distal stimuli.

First, light arrays are reflected off of a distal stimulus. The proximal stimulus is converted

into a two-dimensional image. This conversion is done by the organism’s retina as a sensory

registration. This pattern of registration is the input that will, through computation, become a

representation as of the organism's environment. This maps out a skeleton of a causal theory of

vision.

However, the causal picture becomes less certain when considering that the data

contained in the proximal stimulus cannot uniquely specify a three-dimensional representation.

There are too many three-dimensional scenes with the same two-dimensional projections. For

example, a two-dimensional image of a square could be the projection of a three-dimensional cube with


a width of x, x+1, or x + n, where n is any natural number. Furthermore, the retinal image is

insufficient to uniquely specify the distal stimulus that produced the proximal stimulus. The first

underdetermination problem is concerned with the perceptual system, while the second is

concerned with the laws of optics.
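
As a purely illustrative sketch of the first underdetermination problem, the following Python fragment uses an orthographic projection that simply discards depth; the projection rule and the box dimensions are my own simplifications, not part of Marr's theory. Boxes of very different depths project to the very same square image, so the image alone cannot fix the scene that produced it.

def project(point):
    """Orthographic projection onto the image plane: drop the depth (z) coordinate."""
    x, y, z = point
    return (x, y)

def box_corners(width, height, depth):
    """Corners of an axis-aligned box with one face lying on the image plane."""
    return [(x, y, z) for x in (0, width) for y in (0, height) for z in (0, depth)]

# Boxes of depth 1, 2, and 10 all project to the same square image.
images = [{project(corner) for corner in box_corners(1, 1, d)} for d in (1, 2, 10)]
assert images[0] == images[1] == images[2]   # the image underdetermines the depth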

To explain how an organism has veridical representations of distal stimuli, both

underdetermination problems need to be solved. In dealing with the second problem, the

mapping of retinal image to distal stimulus, Marr relies on environmental regularities. Appealing

to regularities of the visual world limits the varieties of distal stimuli that could produce the

proximal stimulus. There are roughly two kinds of regularities specified by Marr: (i) regularities

concerning the behavior of light and (ii) regularities concerning the behavior of the distal world.

(i) It is an empirical fact that visual properties, like luminance or depth, vary smoothly. Light

reflects off an object in straight lines and these rays of light are governed by laws of reflection

and refraction. So, for example, luminance measured at one cross section of a light array varies

in slight gradation from the measurements of nearby cross sections. The value measured at the

most proximal end of the array is explained by antecedent cross sections of the array. This,

therefore, limits the kind of distal stimuli that could give rise to a particular proximal stimulus.
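
Purely as an illustration of how the smoothness regularity can constrain interpretation, here is a small Python sketch; the tolerance value and the sample arrays are hypothetical numbers of my own choosing, not Marr's. A candidate reading of a light array whose neighboring measurements jump wildly would be ruled out by the assumed regularity.

def varies_smoothly(luminance, tolerance=10):
    """True if no two neighboring samples differ by more than the tolerance."""
    return all(abs(b - a) <= tolerance for a, b in zip(luminance, luminance[1:]))

smooth_array = [100, 104, 109, 112, 115]   # consistent with the assumed regularity
jagged_array = [100, 240, 30, 250, 12]     # ruled out as a candidate reading

assert varies_smoothly(smooth_array)
assert not varies_smoothly(jagged_array)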

Marr also has a set of regularities concerning properties of the distal world. (ii) Objects tend to

be cohesive, convex, and opaque. Most objects in the standard environment maintain their

constitution. Furthermore, when a two-dimensional image has a convex object and a concave

object, the human visual system favors the convex shape as a body and the concave shape as a

background. This betting process tends to be accurate as there aren't many objects that cave

inward. Lastly, most objects in the standard world are not made of transparent substances.


It is unclear what a representation is according to Marr’s theory of vision. Roughly, a

representation is a symbol.2 Representations make “explicit certain entities or types of

information”. (p. 20) Computational theories of cognition treat thought and perception as a kind

of calculation that involve linear manipulations of distinguishable symbols. These symbols can

be individuated by their semantic or syntactic properties.

“The structure of the real world...plays an important role in determining both the nature

of the representations that are used and the nature of the processes that derive and

maintain them. An important part of the theoretical analysis is to make explicit the

physical constraints and assumptions that have been used in the design of the

representations and processes.”(p. 43)

However, what kind of role the environment plays in determining the nature of representations

and computations is unclear. Frances Egan claims that the computational process is an

environmentally independent characterization of a device. Marr’s theory of vision individuates

computational states “formally, and so independently of the environment in which the visual

system is normally deployed, and to which it is adapted” (1999). Tyler Burge and Christopher

Peacocke disagree. Burge (1986) argues that Marr’s theory is an essentially intentional

enterprise. Peacocke (1994) argues that a purely formal characterization of computation does not

allow for psychology to answer the intentional questions it aims to. In the next section I will

explain two viewpoints concerning the nature of mental states.

2. Egan: Computation as ‘Environmentally Neutral’

Frances Egan (1999) claims that computational characterizations of cognitive capacities are

“neutral” with regard to the external environment they operate in. The fact that a device is

2 I understand ‘symbol’, in this sense, as identical to a vehicle. The symbol is not itself a representation in the strict psychological sense: a state with conditions for accuracy. The nature of a vehicle or symbol is not independent from the domain it represents.


adaptive to its environment is a contingent property of the device. Computational

characterizations of cognitive capacities, at their core, are purely formal or syntactic.3 Therefore,

semantics or content is not an essential aspect of a computational theory or a psychological

process. Egan explains a formal computational state as one that does not constitutively depend

on its particular representational content. What is important in individuating computational states

is the syntax of the algorithmic level. In this section I will detail Egan’s argument.

A theory explaining a psychological process, say vision, must make reference to the

perceiving organism’s environment for the computational level of explanation. The first level

explains the problems that the organism in its environment needs to solve. The organism's

environment is the environment which it is adapted to.4 In vision the computational level

explains the underdetermination problem with direct reference to distal stimuli. This is, of

course, uncontroversial. However, Egan goes on to say that the total “computational

characterization is itself ‘environment neutral’.” (p. 181)

The Marrian visual system is assumed to be adapted to its environment by means of

evolutionary pressures. There is an organism O and an environment E, where O’s visual system

represents features in its environment E by virtue of its perceptual system having evolved to

accurately represent E. This is a background assumption of Marr’s theory. I will not be providing

any kind of robust defense of this. Prima facie, there is a pull toward thinking that O representing

a distal property P has something to do with instances of P being in E.

3 In this paper I will be using the words ‘syntactic’ and ‘mathematical’ interchangeably. Egan seems to prefer the term ‘mathematical’; however, I understand her to be referring to non-semantic properties of mathematical structures. 4 By “adapted” I’m not implying an organism adapting to an ecological niche. The environment is on the scale of regularities in the physical world like convex, opaque objects and regular laws of optics. The distinction is not trivial. Organisms of this universe, with our distal stimuli and laws of optics, have adapted to these regularities. In some hypothetical universe, where the regularities differ, one would expect that the organisms of that universe would have adapted to those regularities.


Egan proposes a twin-earth thought experiment. The goal of this experiment is to

distinguish the nature of the device from the environmental constraints and assumptions of

Marr’s theory. In this section I will give a rough outline of the thought experiment as it appears

in Egan’s work, to motivate her claims. In the following section I will provide a full account of

what the experiment entails.

Organism O represents a property P in its natural environment E. There is a

computational characterization of this psychological process with an algorithm specifying the

transformation of a retinal image to a mathematical output. The output of the computational state

is interpreted as an accurate representation of P by means of an interpretation function which

“specifies a mapping between equivalence classes of physical states of the system and elements

of some represented domain.” Suppose that there is an organism O' who is in an environment

that is radically different from the environment O belongs to. Call this environment E'. There

are no instances of P in E’. However, by mutation or spandrel, O' and O have states with the

same algorithm specifying the transformation of a retinal image to a mathematical output.

According to Egan, the semantic characterization of the computational state of O is different

from the characterization of O'. The output of the algorithm belonging to O', though the same as

the output of O, cannot be interpreted by the same interpretation function. This is because E and

E' are different.
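
The following Python sketch is a schematic, invented rendering of the role Egan assigns to an interpretation function, a mapping from equivalence classes of the device's outputs to elements of a represented domain. The algorithm, the output classes, the property labels, and the two environments are all illustrative stand-ins, not anything specified by Egan or Marr.

def shared_algorithm(retinal_image):
    """Stand-in for the algorithmic transformation common to O and O'."""
    return sum(retinal_image) % 3            # some well-defined mathematical output

# Interpretation function for O's environment E: output classes map onto distal properties.
INTERPRET_E = {0: "edge at near depth", 1: "edge at middle depth", 2: "edge at far depth"}

# In E' there are no instances of the relevant property, so the same outputs receive
# no mapping onto distal features of E'; the interpretation function is empty there.
INTERPRET_E_PRIME = {}

output = shared_algorithm((12, 7, 41))                 # identical mathematical output in both worlds
content_in_E = INTERPRET_E.get(output)                 # a distal content for O
content_in_E_prime = INTERPRET_E_PRIME.get(output)     # None: no content assigned for O'

assert content_in_E is not None and content_in_E_prime is None

On Egan's view the shared mathematical output is what individuates the state; the two interpretation mappings differ, but that difference is treated as external to the computational characterization.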

The perceptual system of O makes reference to a particular attribute, P, in its

environment on the basis of causally antecedent retinal images. The system is representing

accurately. The system of O', however, is not representing P. There are no instances of P in E'. It

is unclear what could be interpreted from the computation of O'. Egan concludes that it is certain

that “the device would still compute a well-defined mathematical function, specified by the


computational theory.” (p. 180) As a result of O and O’ having the same algorithmically described

transformations, the computational characterizations of their devices are the same. Thus,

computational characterizations are invariant across possible worlds.

This thought experiment intends to show that a computational device tracking visible

distal properties is nonessential to the characterization of the computation. The fact that the

semantic content of the counterfactual organism differs from the original organism’s content has no

bearing on the computational characterization. Therefore, computation is a non-semantic

process. It’s not the case that computational states lack content. Rather, their content is not

essential in the individuation of the computational state. Therefore computational states are

narrowly individuated. However, the contents of intentional states are externalistic.5 For

example, the attribution of distance to an object in an organism's visual field relies on that

organism's environment. These wide contents are attributed to the computational process by means

of an interpretation function. Egan claims that semantic contents do not individuate the

computational state. What individuates a computational state is the algorithmic level. The

computational state could be the same in a world where the output of the state does not map to

some distal stimulus.

One concern that arises from these conclusions is that the computational level of Marr’s

theory is explicitly formulated in non-internalistic terms. The aim of this level is to show how the

problem of arriving at certain veridical representations is solved. It looks as if represented

content plays an individuating role in Marr’s theory of computation. Egan does not ignore this.

She claims that the purpose of non-internalistic characterization of computational processes is

explanatory. It is useful in understanding the algorithmic level, in that it makes obvious what is

5 ‘Intentional’ states are about or refer to things. They have a directedness in that they point toward some target. These states have semantic properties, and some can be evaluated in regards to veridicality.


going on at this level. The computational level specifies the mathematical function computed by

the device, and what matters in explaining the device is this finite mathematical procedure. Just

because part of the theory is explained in semantic terms does not mean that the process explained by

the theory is essentially intentional. It could be the case that a device has a well-defined

algorithmic function yet does not produce a representational state.

3. Thought Experiment Revisited

In this section I will further examine Egan’s thought experiment. Egan’s thought experiment

involves a counterfactual world where the Marrian regularities do not hold. The experiment

never specifies which of Marr’s assumed environmental regularities fail to hold, or whether all of them

do. I will detail the experiment twice. First, in regard to regularities concerning the laws of

optics. Then, I will run the experiment where the counterfactual world has objects that are

irregular.

Imagine that the laws of optics do not hold true. Transitions between different cross

sections of a given light array do not vary smoothly. The measurements at one cross section of a

light array do not vary in slight gradation from the measurements of nearby cross sections.

However, the proximal stimulus happens to be the same in both the actual world and the

counterfactual. The distal stimulus is, however, different in the two worlds. A further stipulation is

that organism O' and O are chemically identical. Therefore, the process of sensory registration of

the proximal stimulus is the same in both O' and O. Lastly, the computational state of O' lacks

an evolutionary history. The state of the organism in the counterfactual environment arose

through mutation. By stipulating these conditions, Egan severs the relation between the distal

stimuli and the computational output. All we have left to examine is the computational state

attributed to the organism. “Two mechanisms that compute the same mathematical function,


using the same algorithm, are, from a computational point of view, the same mechanism, even

though they may be deployed in different environments.” (p. 180) Egan, then, claims this is

grounds to think that computational states are individuated non-semantically. Furthermore, the

states are individuated narrowly.

As a consequence of this experiment, the environment has no bearing on the computational

state of O'. This is true only if the three stipulations hold true. I will grant the positing of non-

Keplerian optics.6 However, also claiming that the organisms are physically identical is possibly

problematic. It is not logically impossible that the two organisms be type-identical. It is,

however, implausible that the creatures remain type-identical while the laws of optics differ.7

Granting that the proximal stimulus is the same in both worlds, it’s hard to imagine that the laws

of optics would not affect the physical makeup, and thus the implementation level, of O'.8 Visible

light is a kind of electromagnetic radiation. The laws that govern light also govern other

electromagnetic interactions. One such phenomenon is electric charge. Charge is carried by

electrons and protons which are subject to mechanical laws. The laws of optics are intertwined

with the laws that govern charge. Every organism's nervous system operates through the

transference of electrical inputs. The pattern of registration produced by an organism's retina

from proximal stimulations is mediated by laws of optics. Furthermore, all other brain states of

the organism depend on these laws. If the laws of optics are different one would expect that

either O' compensates for this difference or O' has radically different or no brain states. Whether

6 Though it is possible there may be issues with this stipulation alone, I will not address my concerns in this paper. 7 I have in mind Fermat’s principle as the law that does not hold in Egan’s counterfactual. His principle of least time states that out of all possible paths that light might use to get from point A to point B, light takes the path which requires the shortest time. The speed of light is constant. In this twin-universe we are imagining that Fermat’s law does not hold. Light arrays take weird paths and take more time to get from one point to another. To my knowledge she never specified which law of optics she wanted to change. However, I don’t think it matters. It is likely that the changing of any one of these laws would have dramatic effects on atomic structures. 8 It is stated by Marr that the same computational state can be realized in different hardware. However, it is not clear that O' could even have a physical system capable of computation.


or not the same type of processes could be carried out by a system under a non-rectilinear light

propagation scheme is not a simple question to answer. If it turns out that the organisms are not

type-identical, then no conclusions can be drawn from this experiment. I have serious doubts

about the validity of the conjunction of these two stipulations, and I believe that the thought

experiment, so construed, is a non-starter.9 It might be more useful to formulate the thought

experiment in regard to the second kind of environmental regularity.

There is a world where the majority of objects are concave. The surfaces of objects curve

inward rather than bulge outward. The system of O' is computationally identical to the system of

O. The two organisms are also molecularly identical. The laws of optics are the same in both

worlds. In the counterfactual environment light reflected off of a particular concave distal object

is the causal antecedent of a particular retinal image belonging to O'. Both O and O' will compute

the type-identical retinal image in the same way. The outputs of the computational states are the

same. The content attributed to the state of O by means of an interpretation function will be

veridical. If the same interpretation function is utilized on the output of the state of O', the

content attributed to the state of O' will be false, since the distal objects in E' are not convex. The two

organisms have the same computationally characterized states because they employ the same

algorithm for their transformations.

Concerning O', Egan says “we cannot say what they would represent in the counterfactual

world; perhaps they would represent only features of the retinal image.” (p. 180)10 This then

prompts the question: what is gained from this thought experiment? We have this organism O'

that has a state that is computationally identical to a state that organism O has. According to

9 It may also be the case, generally, that laws of nature should not be used as variables in twin-earth thought experiments. 10 It is unclear what Egan means by “represent” here. By ‘representation’, I mean a state that has conditions for veridicality as a constitutive aspect.


Egan, because the two states are computationally identical and the state of O' lacks content, we

can infer that the computational state of O has its content contingently.

This is only true if what individuates a computational state is the algorithmic level.

Consider another organism O². O² is a conspecific of O and is in environment E and is

biologically identical to O.11 O² registers the same proximal stimulus as O and O'. I stipulate that

the same pattern of registration is formed by both organisms. O² and O have type-identical

representational states. At the computational level of Marr's theory, O² and O are identical. It

need not be the case that the two organisms have the same algorithm that governs the

transformation from input to output. The computational level underdetermines the algorithmic

level in the following sense. The function in extension specified by the computational level can

be realized in the algorithmic level in myriad ways. Therefore, O and O² can have the type-same

representational state while the token algorithmic descriptions of the states vary (Peacocke 1999, p. 201).
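
As a toy illustration of this one-to-many relationship, again with invented functions rather than anything from Marr's theory, consider two Python procedures that agree in extension on every input while differing in intension, that is, in the step-by-step transformation each performs.

from itertools import product

def running_max_scan(samples):
    """Algorithm A: a single left-to-right scan tracking the largest value seen so far."""
    best = samples[0]
    for s in samples[1:]:
        if s > best:
            best = s
    return best

def running_max_sort(samples):
    """Algorithm B: sort a copy of the input and read off its last element."""
    return sorted(samples)[-1]

# The two procedures differ as procedures but realize the same function in extension,
# and so the same computational-level description, on every input checked here.
for samples in product(range(4), repeat=3):
    assert running_max_scan(list(samples)) == running_max_sort(list(samples))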

The purpose of a computational theory of vision is to explain how we come to have the

visual representational states we have. In one sense O and O² have the same computationally

described state and Egan posits that, in another sense, O and O' do as well. According to Egan,

because O and O' employ the same algorithm but have different mental contents, the content

must be a contingent feature of the computational state.12 With O and O², it is because both organisms

have the same content that they have the same computationally described state.

To claim that a quality is an individuation condition of a phenomenon is to claim that the

quality is necessary for token object x to be of type y. A necessary condition of x being a cat is

that it be a mammal. According to Egan, the nature of a computational state is individuated by a

11 This entails that both creatures have the same evolutionary history. The two should be understood as perfectly identical twins with the same parents. 12 It is also a possibility that rather than the two organisms having different content, O has mental content and O' does not. It is, however, not the interest of this paper to explore this distinction.

mathematical formalism of the state which transforms the initial state of the device to a

mathematically characterized output. As previously stated, the computational function, which

specifies what information is extracted from which antecedent resources, can be realized

algorithmically in many ways. If it is necessary that the algorithm of O and O² match for O² to

have the type-same computational state as O, then, in a case where their algorithms differ, they cannot have the same state. The two

organisms do, however, share the same state. Therefore, a computational state’s mathematical

function cannot serve as an individuation condition for a computational state.

All that can be inferred from Egan’s thought experiment is that the computational state of O

can be studied narrowly. This claim, however, does not need Egan’s thought experiment to be

true. Scientists often study phenomena in isolation. A geneticist might study a specific genetic

codon common to many organisms in isolation from a particular organism’s total genetic

structure, internal environment, or what the sequence was adapted for. That a phenomenon can be

studied narrowly is a methodological claim that I will assume is non-controversial.13 This is not a

claim about individuation or the nature of a computational state.

There is a difference between saying that a phenomenon can be studied narrowly and

claiming that a phenomenon is individuated narrowly. There is also, possibly, a difference between the

claim that a phenomenon is individuated purely on syntax and the claim that a phenomenon is

individuated narrowly. I will explore this latter distinction further in the following section.

4. Evolution, Function, and Syntax

13 Of course, whether or not a particular methodology should be employed is grounds for debate. Though, if Egan is correct to characterize computational states as syntactically driven, then a narrow study could help to answer the “how-questions” specified by the computational level of the theory and realized in the algorithmic level. How an organism transforms an input into an output which satisfies the computational level of the theory is explained by the algorithmic level. This level formulates a way in which the input is transformed into the output.


Egan claims that computational states are only type-individuated by their “mathematical

functions.” The visual system operates on a mathematical input by means of a mathematical

formalism, producing a mathematical output. The interpretation of the output, by means of an

interpretation function, ascribes representational content. What these states represent plays no

individuative or causal role. The content ascribed by an interpretation function is explanatorily

useful though contingent qua computational theory. Reducing computational states to

mathematical formalisms is, according to Egan (2003), the only “way to square

computationalism with intentional psychology”. (p. 99)

According to Egan, if two states have the same mathematical characterization, though

different functions and evolutionary histories, they are type-identical states. Following this, all

that is necessary to explain a computational state is the algorithmic level of the computational

theory. Therefore, the only thing of importance in the individuation of a token computational

state is explaining how the state operates. Questions like “why does the perceptual system

accurately represent a property P?” or “why does the perceptual system produce illusory

perceptions under certain conditions?” are of no individuative importance. In fact, Egan

completely ignores the function of the state in her construal of computational states. “To describe

the device as making a mistake is to impose our own interests and expectations on it.” (p. 101) A

functional description of a device is nothing more than a normative characterization of the

device’s behavior. However, I have not seen any strong reasons from Egan to reject function as

an individuation condition. In fact, I see no reason to reject what I take to be the more natural

view that the function of the state is individuative to the state.

The state is responsible for procuring an accurate mind-to-world relation by means of

transforming a non-representational retinal image into an accurate representation of some distal


stimulus. It would seem that the nature of a computational state depends, in part, on what the state

does in the context of the total visual system. An analogy Tyler Burge (2003) provides succinctly

illustrates this point.

“What it is to be a heart depends essentially on what the heart does in the context of the

body. It pumps blood to other parts of the body. A chemically identical object that did not

pump blood would not be a heart. Imagine that it had evolved to carry out an entirely

different function in an organism entirely different from any animals with hearts.

Similarly, parts’ being certain valves and ventricles in the heart depend on their being

parts of a heart, which in turn depends on the heart’s functional and causal relations to the

environment. In this sense what it is to be a heart depends on relations between the heart

and things outside the heart.” (p. 454)

What should be gathered from this analogy is that the causal relation the computational state has

to the system and the total organism is part of the individuation conditions of the state. If a state

has the same syntax as a computational visual state but has a different function or is embedded in

a different performance system, it is not the same type of state.14 If this is the case, it may be a

mistake to say that O' even has a computational state at all. The computational state of O' does

not procure an accurate mind-to-world relation. If the state functions to do anything at all, it

differs from what the state of O functions to do.

4.1. Adaptations and Narrowness

Egan claims that a computational state is individuated on the basis of the state’s syntax and

therefore computational states are narrowly individuated. However, it is not clear that any aspect

of a computational state is narrow. The computational state of O is what allows O to transform a

14 Any one state or system has multiple functions. The functions have a hierarchical order, with a master function and subordinate functions. For example, the perceptual system functions to produce accurate representations of the subject’s world. This is a subordinate function to the system aiming to increase the biological fitness of the organism with that system.


retinal image into an accurate representation. The state is the result of natural selection favoring

this state over some alternative. Ignoring cases like O', where the computational states came

about through random mutation, the perceptual system is the product of a long evolutionary

history.

An aim of Marr’s theory of vision is to explain why we come to have the

representational states we do. Evolution plays an important explanatory role in answering this

question. The perceptual system of O adapted over time due to evolutionary pressures on the

biological ancestors of O. O has the system it has, and many of its computational

states engender veridical perceptual content, because the system positively affects the

biological fitness of O. Not surprisingly, the capacity to accurately represent the distal

environment will allow the organism to survive and reproduce, and thus pass on its visual

system to its offspring. The fact that the perceptual system is adaptive is clearly not an

internalistic fact about the device. This is a relational fact.15

The biological sciences treat analogous traits as distinct from traits that have identical

histories. Though fruit bats and skylarks both have wings, their traits were not inherited from

some common ancestor of the two. If two states have the same function though different

evolutionary histories, they are only superficially the same state. According to Egan (1999) “a

computational theory, in providing a formal characterization of a device, abstracts away from the

device’s historical properties.” (p. 181) I agree with Egan. The same computational system can

15 Egan agrees with this, though she states that a device being adaptive to its environment is a contingent property of the device. The reason Egan claims that adaptation is contingent is that evolutionary pressures are not responsible for the device of O'. The organism’s visual system is a mutation. However, I don’t see how this is an argument for adaptation being non-essential. As a result of this non-adaptive visual system, O' will likely die before passing on its genes. Therefore, the visual system fails in its biological function. Any biological system functions to increase the fitness of the organism. The cardiovascular system pumps blood throughout a body to keep the organism alive and well so that it might pass on its genes and, thus, fulfills its biological function. Constitutive explanations of the nature of biological kinds should fix on the successful realizations of their biological function. It makes little sense to explain the nature of an adaptive state by appealing to a non-adaptive state.


be realized in different hardware and thus in organisms with radically different histories.

However, evolutionary history and function are distinct. Two states can be functionally identical

while being distinct in regard to their history. To say that a state is adaptive is to say that there is

some function that was useful to the survival of the organism and the organism's ancestors. The

importance of an adaptation is not the genealogy of the state but the problems solved. According

to Egan, because the evolutionary history of the device is trivial qua computational theory, the

environment plays no role in the characterization of the device. Adaptive problem solving is

about the environment (including the organism's body) and is impossible to characterize without

making direct reference to the environment.

A constitutive explanation of a computational state belonging to an organism with an

evolutionary history that successfully interacts with its environment must reference why the

organism successfully interacts with its environment. At any level of the theory the organism's

adaptive problem solving can be referenced. At the implementation level it is quite

straightforward. The creature has such physical features as to better avoid predators or to

efficiently catch prey. The reason a hawk has its ocular anatomy and specialized visual cortexes is

that this configuration of physical structures is the one best suited for spotting prey from

long distances away. The same adaptive problem solving is responsible for the abstract mental

aspects of the visual system. The creature’s perceptual system performs certain transformations

on the retinal image so as to accurately represent the organism's environment. The perceptual

success of the organism allows it to better navigate the distal environment and to spot predators and

prey. Therefore, the perceptual function, to represent accurately, and the biological function of

the perceptual system are intimately connected in an organism that is adapted to its environment.

Veridical representations help an organism to navigate its environment and a perceptual


system fulfilling its function increases the organism's biological fitness. The organism

successfully represents its environment as a result of evolutionary pressures on the organism's

ancestors.

For an organism like O, the mathematical formalism is adaptive. The content attributed

to the state of O by means of an interpretation function will be veridical. As a result of this, O

will live long enough to reproduce. For a state to be adaptive the state must be maintained and

evolved by means of natural selection. An adaptive state of an organism relies on the ancestors

of that organism and how that state relates to its ancestors’ environment. If the state did not

increase the biological fitness of the ancestor, the ancestor would have a lower chance of survival

and reproduction. Therefore, the well-defined mathematical formalism of the state is a product of

the organism's environment and is widely individuated.

Conclusion

I close with some remarks aimed at clarifying both my rejection of Egan’s position and the

view of mental states I argued for throughout this paper. I will also connect my functional

constraints on computational states with a different theory of computation.

Egan argues that computational states do not constitutively depend on the output of the

state, representational content. What is important in individuating computational states is the

internal structure of the state. Computational states are only individuated by their mathematical

functions. If two computational states are the same in regards to their mathematical formalism,

they are type-identical states. The evolutionary history, content, and function of a computational

state are non-essential to the state being that state.

In this paper I argued that the computational function, which specifies an input-output

pair, can be realized algorithmically in many ways. The claim that an algorithm employed in a


computation is a constitutive feature of a perceptual process is unreasonable. The same function

in extension can be computed in many different ways, while maintaining the same input-output

pair. Therefore, the well-defined mathematical formalism of a state cannot serve as an

individuation condition of the state.

Furthermore, it would seem that the function of the state, what the state aims to do, is

essential in characterizing the state. Functional descriptions of a device are not normative

characterizations of the device’s behavior. They explain the causal relation the computational

state has to the system and the total organism. The relation the state has to the total system is

individuative to the state. Two states with different functions will have different naturalistic

explanations.

Lastly, I argued for the wide individuation of internal structures of computational states

by way of addressing the importance of adaptation. The goals of psychology include explanation

of accurate representational mental states. The reason why an organism has accurate

representational mental states is straightforwardly externalist. The organism adapted to better

represent its environment in order to survive in it. A well-defined

mathematical formalism in a biologically fit organism is adaptive.

Egan claims that her picture of computation is the only way to reasonably combine

computationalism with intentional psychology. I do not agree. There are alternative theories of

computation which do not betray the aims of psychology. Christopher Peacocke (1994) proposes

a content-involving account of computation. Peacocke attempts to provide a model of

computation more suitable for resolving the problem of how we come to have semantic,

representational states in processes like vision or language apprehension. Roughly, the model is

as follows: (i) the input to the algorithm lacks content but the output has widely individuated


content, and (ii) there is an algorithm which specifies how content-involving properties are

explained by non-content-involving properties. This model, unlike Egan’s, places the importance

of computational theory on the problems solved by the perceptual system. Peacocke’s content-involving

computation is completely compatible with my arguments in this paper. The

importance of function in the characterization of computational states is supported by this model

and my arguments here further support the content-involving account of computation.

References

Burge, T. 1986: Individualism and Psychology. Philosophical Review, 95, 3–45.

Burge, T. 2003: Reflections and Replies: Essays on the Philosophy of Tyler Burge. MIT Press, 451–68.

Burge, T. 2010: Origins of Objectivity. Oxford University Press.

Chalmers, D. 2011: A Computational Foundation for the Study of Cognition. Journal of Cognitive Science, 12, 323–57.

Cummins, R. 1989: Meaning and Mental Representation. Cambridge, MA: MIT Press.

Egan, F. 1992: Individualism, Computation, and Perceptual Content. Mind, 101, 443–59.

Egan, F. 1995: Computation and Content. Philosophical Review, 104, 181–203.

Egan, F. 1999: In Defence of Narrow Mindedness. Mind and Language, 14, 177–94.

Egan, F. 2010: Naturalistic Inquiry: Where Does Mental Representation Fit In? Chomsky and His Critics, Blackwell Publishing, 89–103.

Marr, D. 1982: Vision. New York: Freeman.

Peacocke, C. 1994: Content, Computation, and Externalism. Mind and Language, 9, 303–35.

Peacocke, C. 1999: Computation as Involving Content: A Response to Egan. Mind and Language, 14, 195–202.

Rescorla, M. 2015: The Representational Foundations of Computation.
