The Ontology of Creature Consciousness: A Challenge for Philosophy[1]

 

(Commentary on “Consciousness without a Cerebral Cortex: A Challenge for Neuroscience and Medicine,” by Björn Merker)

 

Gualtiero Piccinini

Department of Philosophy

University of Missouri–St. Louis

St. Louis, MO 63121-4400 USA

piccininig@umsl.edu

www.umsl.edu/~piccininig/

 

This is a preprint of an article whose final and definitive form will be published in Behavioral and Brain Sciences, 30.1; Behavioral and Brain Sciences is available online at: http://www.bbsonline.org/.  Copyright by Cambridge University Press.

 

Abstract:  I appeal to Merker’s theory to motivate a hypothesis about the ontology of consciousness:  creature consciousness is (at least partially) constitutive of phenomenal consciousness.  Rather than elaborating theories of phenomenal consciousness couched solely in terms of state consciousness, as philosophers are fond of doing, a correct approach to phenomenal consciousness should begin with an account of creature consciousness.

 

 

A traditional question about consciousness is whether preverbal children have phenomenal experiences and, if they do, what convinces us that they do.  In this context, congenitally decorticate children are not even worth discussing.  Yet Merker argues that (some) children with hydranencephaly have phenomenal experiences.  He backs up his claim with an elaborate theory supported by a wide range of evidence.  To make sense of his theory, we might need to think about the ontology of consciousness in a new way.

 

When philosophers attempt to spell out what consciousness is, they typically formulate the problem in terms of so-called state consciousness:  What does it take for a mental state to be an experience?  Their most worked-out answers employ two kinds of ingredient:  functional and representational.  Their least worked-out answers appeal to some condition to be discovered empirically by scientists.  For instance, pain might be C-fiber firing, or whatever scientists tell us.  Well, Merker is a scientist, and he is telling us something.

 

Merker tells us that “primary consciousness” has the function of integrating sensory information and motivations to select targets and actions.  He adds that primary consciousness is constituted by the structure of the “analog reality simulator” that fulfills this function.  This may sound like a hybrid functional-representational theory.

 

But Merker’s theory does not say what it takes for a mental state to be conscious.  It is not even formulated in terms of mental states.  Furthermore, Merker attributes consciousness to some congenitally decorticate children.  How plausible is it that such children have experiences as we do?  If we keep framing the question of consciousness in traditional terms—in terms of what it takes for mental states to be phenomenally conscious—we seem to face a dilemma:  either decorticate children have the same kind of conscious states that we have, and hence have phenomenal consciousness, or they don’t, and hence have no phenomenal consciousness.  Either way, Merker has not told us what it takes to have such states.  We can dismiss his theory as misguided and pursue our ontological inquiry as before.

 

Alternatively, we can take Merker’s theory seriously and see where it leads us.  Merker says his subject matter is “the state or condition presupposed by any experience whatsoever,” or the “‘medium’ of any and all possible experience.”  He then gives us a detailed account of such a medium, couched in terms of neural systems, their functions, and their interrelations.

 

Insofar as philosophers talk about anything that sounds like this, it is what they sometimes call creature consciousness.  For present purposes and to a first approximation, creature consciousness is whatever differentiates ordinary people who are either awake or in REM sleep from ordinary people who are in non-REM sleep, in a coma, etc.  This seems to be what Merker is theorizing about.

 

When it comes to understanding phenomenal consciousness, many philosophers would maintain that creature consciousness is mostly irrelevant to the ontology of phenomenal consciousness.  According to the philosophical mainstream, the ontological key to phenomenal consciousness resides in state consciousness. 

 

Merker, though, says his subject matter is consciousness in its most “basic” sense.  Perhaps he is onto something.  Perhaps creature consciousness is at least partially constitutive of phenomenal consciousness.  What would this mean?  Most people agree that creature consciousness is a necessary condition for state consciousness.  Perhaps there is more to creature consciousness than that.

 

From the point of view of neuroscience, creature consciousness is a global state of (part of) the brain—the difference between ordinary people’s brains when they are awake or in REM sleep and their brains when they are in non-REM sleep, in a coma, etc.  My suggestion is that creature consciousness thus understood contains at least part of the ontological basis of phenomenal consciousness.  In other words, a (more or less large) part of what makes a system have experiences is that it is creature-conscious.

 

Under this view, state consciousness may be understood as follows:  a state is state-conscious if and only if it is the state of (a spatiotemporal part of) a creature-conscious brain, or better, an appropriate kind of state of (a spatiotemporal part of) a creature-conscious brain.  There remain, of course, two important questions:  First, what is the difference between those states of creature-conscious beings that are phenomenally conscious and those that are not?  Second, what else is needed (if anything), besides creature consciousness, for full-blown phenomenal consciousness?  An adequate theory of consciousness would have to answer these questions.

 

What kind of global brain state corresponds to creature-consciousness?  Is it physical, functional, representational, or a combination of these?  According to Merker, creature consciousness is the product of an analog reality simulator that integrates sensations and motivations to select targets and actions.  Perhaps his view could be glossed as follows:  when the simulator is operating, the system is creature-conscious; when the simulator is idle (for whatever reason: rest, breakdown, etc.), the system is creature-unconscious.  Integrating sensory information and motivations as well as selecting targets and actions appear to be broadly functional and representational notions.  So Merker appears to be offering a functional/representational account of creature consciousness.

 

There is at least one other option.  Perhaps creature consciousness requires some special physical properties, analogously to the way water’s power to dissolve certain substances and not others requires a certain molecular composition and molecular structure at a certain temperature (cf. Shapiro 2004).  I cannot elaborate further.  Differentiating clearly between physical, functional, and representational accounts of creature consciousness would require an adequate account of the distinction between the physical, the functional, and the representational, and there is no room for that here.

 

The present suggestion has epistemological consequences.  If creature consciousness were at least partially constitutive of phenomenal consciousness, it would be a mistake to develop theories couched solely in terms of state consciousness, without saying anything about creature consciousness—as philosophers are fond of doing.  Rather, a correct approach to phenomenal consciousness should begin with an account of creature consciousness.

 

Before concluding, it may be helpful to distinguish several different claims:  (1) the brainstem is necessary to sustain and regulate creature consciousness (uncontroversial), (2) the brainstem can sustain creature consciousness by itself (Merker’s theory), (3) the brainstem can be the locus of conscious experience (Merker’s theory), and (4) creature consciousness is (at least part of) the ontological basis of conscious experience.

 

Thesis (3) is stronger than (2), and Merker does little to support (3) as opposed to (2).  (Do children with hydranencephaly go into anything resembling REM sleep?  Evidence that they do would support (3).)  Perhaps he intends to make a further claim:  (5) creature consciousness is sufficient for phenomenal consciousness.  Thesis (5) is even stronger than (4).  In light of unconscious cognition, including phenomena such as blindsight, (5) is hard to swallow without at least some qualification.

 

But we don’t need to accept all of Merker’s claims in order to consider (4).  In fact, claim (4) can be motivated on the grounds of (2) or even (1) alone, and (1) is uncontroversial.  If phenomenal consciousness can occur without a cortex, as Merker believes, then the challenge posed by (4) becomes more forceful and more difficult to avoid.  But regardless of the extent to which we agree with Merker’s theory, we should consider the possibility that (4) is correct.

 

Shapiro, L. A. (2004). The Mind Incarnate. Cambridge, MA: MIT Press.



[1] Thanks to Brit Brogaard, Bob Gordon, Pete Mandik, Brendan Ritchie, and Anna-Mari Rusanen for discussion and comments.