
Homing in on consciousness in the nervous system: An action-based synthesis

Published online by Cambridge University Press: 22 June 2015

Ezequiel Morsella
Affiliation:
Department of Psychology, San Francisco State University, San Francisco, CA 94132-4168; Department of Neurology, University of California, San Francisco, San Francisco, CA 94158. [email protected] http://online.sfsu.edu/morsella/people.html
Christine A. Godwin
Affiliation:
School of Psychology, Georgia Institute of Technology, Atlanta, GA 30318. [email protected] http://control.gatech.edu/people/graduate/cgodwin/
Tiffany K. Jantz
Affiliation:
Department of Psychology, University of Michigan, Ann Arbor, MI 48109-1043. [email protected] http://prod.lsa.umich.edu/psych/people/graduate-students/tkjantz.html
Stephen C. Krieger
Affiliation:
Department of Neurology, Mount Sinai Medical Center, New York, NY 10029-6574. [email protected] http://www.mountsinai.org/profiles/stephen-krieger
Adam Gazzaley
Affiliation:
Department of Neurology, Department of Psychiatry, and Department of Physiology, University of California, San Francisco, San Francisco, CA 94158. [email protected] http://gazzaleylab.ucsf.edu/people-profiles/adam-gazzaley/

Abstract

What is the primary function of consciousness in the nervous system? The answer to this question remains enigmatic, not so much because of a lack of relevant data, but because of the lack of a conceptual framework with which to interpret the data. To this end, we have developed Passive Frame Theory, an internally coherent framework that, from an action-based perspective, synthesizes empirically supported hypotheses from diverse fields of investigation. The theory proposes that the primary function of consciousness is well-circumscribed, serving the somatic nervous system. For this system, consciousness serves as a frame that constrains and directs skeletal muscle output, thereby yielding adaptive behavior. The mechanism by which consciousness achieves this is more counterintuitive, passive, and “low level” than the kinds of functions that theorists have previously attributed to consciousness. Passive frame theory begins to illuminate (a) what consciousness contributes to nervous function, (b) how consciousness achieves this function, and (c) the neuroanatomical substrates of conscious processes. Our untraditional, action-based perspective focuses on olfaction instead of on vision and is descriptive (describing the products of nature as they evolved to be) rather than normative (construing processes in terms of how they should function). Passive frame theory begins to isolate the neuroanatomical, cognitive-mechanistic, and representational (e.g., conscious contents) processes associated with consciousness.

Type
Target Article
Copyright
Copyright © Cambridge University Press 2016 

What does consciousness contribute to the functioning of the nervous system? What is the primary role of this elusive phenomenon? The answers to these questions remain enigmatic, not so much because of a lack of relevant data, but because of the lack of a conceptual, internally coherent framework with which to interpret the data (Grossberg 1987). Hence, we developed Passive Frame Theory, a framework that, from an action-based perspective, synthesizes empirically supported hypotheses from diverse fields of investigation. The framework begins to illuminate (a) what consciousness contributes to nervous function, (b) how consciousness achieves this function, and (c) the neuroanatomical substrates of conscious processes. Passive frame theory proposes that the primary function of consciousness is well circumscribed, serving the somatic nervous system. For this system, it serves as a frame that constrains and directs skeletal muscle output, thereby yielding adaptive behavior. The mechanism by which consciousness achieves this is more counterintuitive, passive, and "low level" than the kinds of functions that theorists have attributed to consciousness.

Our unique perspective and conclusions provide a comprehensive approach to the enigma of the primary function of consciousness. To solve this puzzle, an overarching coherent framework is a necessary first step to the development of more concrete advances (e.g., hypotheses for the derivation of experiment-specific predictions). Before discussing the hypotheses that serve as the tenets of passive frame theory, it is necessary to define some terms and describe the nature of our untraditional approach.

1. Purview, terms, and assumptions

1.1. The scientific approach should focus on the most basic form of consciousness

We believe that, to advance the study of consciousness, one should focus not on high forms of consciousness (e.g., "self-consciousness"), but on the most basic forms of consciousness (e.g., the experience of a smell, visual afterimages, tooth pain, or urges to scratch an itch). This form of consciousness has fallen under the rubrics of "sentience" (Pinker 1997), "primary consciousness" (Edelman 1989), "phenomenal consciousness" (Block 1995b), "qualia" (J. A. Gray 2004), "phenomenal states" (Tye 1999), and "subjective experience." In our framework, we refer to a thing of which one is conscious (e.g., an afterimage) as a conscious content (Merker 2007; Seth 2007). All of the contents of which one is conscious at one time can be construed as composing the conscious field (Freeman 2004; Köhler 1947; Searle 2000). The contents of the conscious field change over time.

1.2. The approach should be descriptive, non-normative

We believe that the approach to consciousness should be a descriptive, naturalistically based one (which describes the products of nature as they evolved to be) rather than a normative one (which construes processes in terms of how they should function). Nervous mechanisms have been fashioned by the happenstance and tinkering process of evolution, whose products can be counterintuitive and suboptimal (de Waal 2002; Gould 1977; Lorenz 1963; Marcus 2008; Roe & Simpson 1958; Simpson 1949), quite unlike the kinds of things humans design into machines (Arkin 1998). Hence, the ethologist Konrad Lorenz (1963) cautions, "To the biologist who knows the ways in which selection works and who is also aware of its limitations it is in no way surprising to find, in its constructions, some details which are unnecessary or even detrimental to survival" (p. 260). Similarly, when speaking about the reverse engineering of biological products, the roboticist Ronald Arkin concludes, "Biological systems bring a large amount of evolutionary baggage unnecessary to support intelligent behavior in their silicon based counterparts" (Arkin 1998, p. 32). The difference between the products of evolution and human artifacts is obvious when one considers the stark contrast between human locomotion and artificial locomotion – legs versus wheels (Morsella & Poehlman 2013).

When adopting a descriptive standpoint, even the most cursory examination of the brain reveals a contrast between conscious and unconscious processes (see Bleuler 1924). Thus, in every field of inquiry, there is the de facto distinction between the two kinds of processes, though often without mention of the taboo term "consciousness." For example, in perception research, there exists the distinction between supraliminal and subliminal processing. In memory research, there is the distinction between "declarative" (explicit) processes and "procedural" (implicit) processes (Schacter 1996; Squire 1987). In motor and language research, the conscious aspects of voluntary action or of speech production are contrasted with the unconscious aspects of, say, motor programming (Levelt 1989; Rosenbaum 2002; J. A. Taylor & Ivry 2013). Various fields also contrast "controlled" processing (which tends to be conscious) and "automatic" processing (which is often unconscious; Lieberman 2007). In summary, from a descriptive approach, the contrast between conscious and unconscious processes in the brain is somewhat inevitable (Morsella & Poehlman 2013).

1.3. The approach should be minimalistic, focusing on simple cases

When attempting to unravel a phenomenon as perplexing as consciousness, it is important to adopt a strategy in which scientific inquiry begins with the examination of the most basic, elemental instantiation of the phenomenon of interest (Panksepp 2007). Such a strategy proved fruitful in the development of physics (Einstein & Infeld 1938/1967). Hence, in our approach, we focus on the actions of a hypothetical, simplified, human-like mammal that, though conscious (for a treatment of consciousness in mammals, see J. A. Gray [2004]), is not equipped with many of the complicated abilities/states associated with adult humans (e.g., syntax and music appreciation). Capable of having approach-avoidance conflicts (Lewin 1935; N. E. Miller 1959), this hypothetical organism is occupied only with basic operations (e.g., breathing, locomoting, and avoiding tissue damage) rather than with higher-level phenomena (e.g., mental rotation and sustained, directed thinking). This organism is also incapable of indirect cognitive control (Morsella et al. 2009b), whereby one can, for instance, make oneself hungry or scared by deliberately imagining the kinds of things that would induce these states. Such higher-level phenomena are more likely to be predicated upon (a) extensive learning, (b) cultural influences, (c) intricate interactions among more elemental conscious processes, and (d) adaptations that are less phylogenetically primitive than those of the basic operations of interest (Morsella 2005).

In our "simple case," this hypothetical organism is resting in a warm enclosure (e.g., a cave). It can consciously perceive an opening from which it could exit. For hours, the organism performs no notable locomotive act, neither toward the opening nor toward anything else, but then it perceives a noxious smell (e.g., smoke) from within the enclosure. Because of this new conscious content, it now exits hesitantly through the opening, even though it was inclined to remain within the warm enclosure. To illuminate the nature of consciousness, we will revisit this "creature in the cave" scenario throughout our treatise. We refer to the first events composing the scenario (e.g., the percept of the opening and the warmth) as Stage 1, and the subsequent events (e.g., the smell and the inclination to stay in the cave) as Stage 2.

In contrast to our strategy, descriptive accounts of consciousness have tended to focus on high-level functions, leading to proposals that the function of consciousness pertains to social interaction (Frith 2010; Prinz 2012), language (Banks 1995; Carlson 1994; Macphail 1998), "theory of mind" (Stuss & Anderson 2004), the formation of the self (Greenwald & Pratkanis 1984), semantic processing (Kouider & Dupoux 2004; Mudrik et al. 2014; Thagard & Stewart 2014), the meaningful interpretation of situations (Roser & Gazzaniga 2004), and simulations of behavior and perception (Hesslow 2002). (It is worth noting that, for good reasons, it has also been proposed that, contrary to the present account, consciousness does not contribute to ongoing action; see Hommel 2013; Jackson 1986; Kinsbourne 1996; 2000; Masicampo & Baumeister 2013; Pinker 1997.)

1.4. Overview of present, untraditional approach

Our approach is untraditional in several ways. First, instead of focusing on the relationship between consciousness and perception (which has been the dominant approach; Crick & Koch 2003; Rosenbaum 2005), we focus on the relationship between consciousness and overt action. Second, unlike traditional stimulus-response approaches, we "work backward" from overt action to the underlying processes responsible for it (Sperry 1952). Thus, from our untraditional, action-based approach, we subscribe to an uncommon theoretical position – that the nature of consciousness is best understood by examining the requirements of adaptive (efferent) action control rather than the needs of perceptual analysis. From this unconventional approach to consciousness, one can appreciate that the requirements of adaptive skeletomotor action reveal much about the nature of both the conscious field and the generation of conscious contents. Third, building on Morsella and Bargh (2007), instead of focusing on vision to understand consciousness (which has been the prevalent approach; Crick & Koch 2003), we focus on the (possibly) more tractable system of olfaction, as illustrated in our "creature in the cave" example. The olfactory system possesses several features that render it a fruitful system in which to study consciousness (see Note 1).

To summarize, our approach is elemental, action-based, simple, and evolutionary-based (or, for short, "EASE," meaning "to make something less difficult"). We believe that an EASE perspective provides the most fruitful approach to the perplexing problem of consciousness and the brain. Whenever in our enterprise we encountered an obstacle to theoretical progress (e.g., identifying the neural regions associated with consciousness), it was through our EASE perspective that progress was made. In the next three sections, we discuss from an EASE perspective the empirically supported hypotheses that serve as the tenets of passive frame theory. In the process, we begin to isolate the neuroanatomical, cognitive-mechanistic, and representational (e.g., conscious contents; sect. 3) processes associated with consciousness.

2. The circumscribed role of consciousness in the nervous system

2.1. Tenet: Consciousness is associated with only a subset of nervous function

Based on developments of the past four decades, there is a growing consensus – the subset consensus – that consciousness is associated with only a subset of all of the processes and regions of the nervous system (see Note 2; Aru et al. 2012; Crick & Koch 1995; 2003; Dehaene & Naccache 2001; J. A. Gray 2004; Grossberg 1999; Koch 2004; Koch & Greenfield 2007; Logothetis & Schall 1989; Merker 2007; 2013c; Penfield & Jasper 1954; Weiskrantz 1992; Zeki & Bartels 1999). This subset seems to be qualitatively distinct – in terms of its functioning, physical makeup/organization, or mode of activity – from that of its unconscious counterparts in the brain (Bleuler 1924; Coenen 1998; Edelman & Tononi 2000; Goodale & Milner 2004; J. A. Gray 2004; Llinás et al. 1998; Merker 2007; Ojemann 1986).

Consistent with the subset consensus, many aspects of nervous function are unconscious (see Note 3). Complex processes of an unconscious nature can be found at all stages of processing (Velmans 1991), including low-level perceptual analysis (e.g., motion detection, color detection, auditory analysis; Zeki & Bartels 1999), semantic-conceptual processing (Harley 1993; Lucas 2000), and motor programming (discussed in sect. 3.1). Evidence for the complexity of unconscious processing is found in cases in which the entire stimulus-response arc is mediated unconsciously, as in the case of unconsciously mediated actions (e.g., automatisms). There is a plethora of evidence that action plans can be activated, selected, and even expressed unconsciously (see Note 4). In summary, it seems that much in the nervous system is achieved unconsciously. This insight from the subset consensus leads one to the following question: What does consciousness contribute to nervous function?

2.2. Tenet: The conscious field serves an integrative role

The integration consensus (Baars 1988; 1998; 2002; 2005; Boly et al. 2011; Clark 2002; Damasio 1989; Dehaene & Naccache 2001; Del Cul et al. 2007; Doesburg et al. 2009; Edelman & Tononi 2000; Freeman 1991; Koch 2012; Kriegel 2007; Llinás & Ribary 2001; Merker 2007; Ortinski & Meador 2004; Sergent & Dehaene 2004; Srinivasan et al. 1999; Tallon-Baudry 2012; Tononi 2012; Tononi & Edelman 1998; Uhlhaas et al. 2009; Varela et al. 2001; Zeki & Bartels 1999) proposes that consciousness integrates neural activities and information-processing structures that would otherwise be independent. Most of the hypotheses comprising this consensus speak of conscious information as being available "globally," in some kind of workspace, as in Baars's (1988) influential global workspace theory. For present purposes, we construe the contents occupying such a workspace as composing the conscious field (defined in sect. 1.1 above).

Consistent with the integration consensus, the conscious processing of a percept involves a wider and more diverse network of regions than does the subliminal (unconscious) processing of the same percept (Singer 2011; Uhlhaas et al. 2009). The latter is subjected only to "local" processing. This evidence stemmed initially from research on perception (Del Cul et al. 2007; Uhlhaas et al. 2009), anesthesia (Alkire et al. 2008; Boveroux et al. 2010; Långsjö et al. 2012; Lee et al. 2009; Lewis et al. 2012; Schroter et al. 2012; Schrouff et al. 2011), and unresponsive states (e.g., coma or vegetative state; Laureys 2005). Regarding perception research, it has been proposed that, during binocular rivalry (see Note 5), the neural processing of the conscious percept requires special interactions between both perceptual regions and other, traditionally non-perceptual regions (e.g., frontal cortex; Doesburg et al. 2009). This supports the view that some mode of interaction between widespread brain areas is important for consciousness (Buzsáki 2006; Doesburg et al. 2009; Fries 2005; Hummel & Gerloff 2005).

Evidence for the integration consensus is found also in action-based research. Conscious actions involve more widespread activations in the brain than do similar but unconscious actions (Kern et al. 2001; McKay et al. 2003; Ortinski & Meador 2004). Moreover, when actions are decoupled from consciousness (e.g., in neurological disorders), the actions often appear impulsive or inappropriate, as if they are not adequately influenced by the kinds of information by which they should be influenced (Morsella & Bargh 2011).

2.3. Advances regarding the physiological processes engendering consciousness depend on advances regarding the neuroanatomy of consciousness

The nature of the neuroanatomical network engendering the physiological processes (e.g., neural oscillations) proposed to be associated with consciousness remains controversial (see Note 6). Progress regarding the neurophysiology of consciousness depends on advances regarding the identification of the neuroanatomical substrates of this state (Aru et al. 2012). Regarding neuroanatomy, when attempting to isolate the anatomical underpinnings of consciousness, investigators have followed Crick and Koch's (2003) recommendation and have focused on vision. (See reviews of neural correlates of visual consciousness in Blake and Logothetis [2002], Dehaene [2014], Koch [2004], Lamme and Spekreijse [2000], Metzinger [2000], and Tong [2003].) In vision research, controversy remains regarding whether consciousness depends on higher-order perceptual regions (Crick & Koch 1995; 1998; Panagiotaropoulos et al. 2012; 2013) or lower-order regions (Aru et al. 2012; Damasio 2010; Friedman-Hill et al. 1995; Lamme 2001; Liu et al. 2012; Robertson 2003; Tallon-Baudry 2012; Tong 2003). Moreover, as noted in Note 2, whether cortical matter is necessary for consciousness remains controversial.

Theorists focusing on vision have proposed that, although the cortex may elaborate the contents of consciousness, consciousness is primarily a function of subcortical structures (Merker 2007; Penfield & Jasper 1954; Ward 2011). Penfield and Jasper (1954) based this hypothesis on their studies involving both direct stimulation and ablation of cortical regions. Based on these and other findings (e.g., observations of patients with anencephaly; Merker 2007), it has been proposed that consciousness is associated with subcortical areas (e.g., Merker 2007; 2013c). This has led to the cortical-subcortical controversy (Morsella et al. 2011). While data from studies on patients with profound disorders of consciousness (e.g., vegetative state) suggest that signals from the frontal cortex may be critical for the instantiation of any form of consciousness (Boly et al. 2011; Dehaene & Naccache 2001; Lau 2008; Panagiotaropoulos et al. 2012; Velly et al. 2007), research on the psychophysiology of dream consciousness, which involves prefrontal deactivations (Muzur et al. 2002), suggests that, although the prefrontal lobes are involved in cognitive control, they may not be essential for the generation of basic consciousness (Aru et al. 2012; Merker 2007; Ward 2011). Regarding the necessity of the integrity of the frontal lobes for consciousness, it is important to consider that frontal lobotomy, once a common neurosurgical intervention for the treatment of psychiatric disorders, was never reported to render patients incapable of sustaining consciousness (see also Aleman & Merker 2014).

The role of subcortical structures in the production of consciousness, and the amount of cortex that may be necessary for it, remain to be elucidated (see further discussion in sect. 3.5). Clearly, more investigation is needed regarding the neural correlates of consciousness, because controversy continues to surround not only the neurophysiological processes underlying consciousness, but even the identification of the gross neuroanatomical regions that are responsible for this peculiar form of processing (see treatment in Merker 2007; 2013b).

Faced with this challenge, we propose that, because of the intimate liaison between function and structure in the nervous system (Cohen & Dennett 2011; Merker 2013c), progress can be made regarding the neural underpinnings of consciousness by having a more precise understanding of the role of consciousness in nervous function (Lamme & Spekreijse 2000). With this in mind, one can reason as follows: If the consensus is that consciousness serves an integrative role, then, from an EASE perspective, what is the most basic form of integration that requires consciousness? Addressing this question allows one to better isolate consciousness within the nervous system, which could, in turn, resolve controversies regarding the neural correlates of consciousness.

2.4. Tenet: The conscious field is for a specific kind of integration, involving the skeletal muscle output system

One limitation of the integration consensus is that it fails to specify which kinds of integrations require consciousness and which kinds do not. Consciousness seems unnecessary for various kinds of integrations in the nervous system. For example, integrations across different sensory modalities, as in the case of afference binding (Morsella & Bargh 2011), can occur unconsciously. This form of integration occurs in feature binding (e.g., the binding of shape to color; Zeki & Bartels 1999) and in intersensory binding (Vroomen & de Gelder 2003), as in the ventriloquism and McGurk effects (McGurk & MacDonald 1976). The latter, for instance, involves interactions between visual and auditory processes: An observer views a speaker mouthing "ga" while presented with the sound "ba." Surprisingly, the observer is unaware of any intersensory interaction, perceiving only "da." (See a list of many kinds of unconscious afference binding in Morsella [2005], Appendix A.) Integrations involving smooth muscle effectors (e.g., in peristalsis or in the pupillary reflex), too, can occur unconsciously (Morsella et al. 2009a), as can another form of integration known as efference binding (Haggard et al. 2002).

Efference binding links perceptual processing to action/motor processing. This kind of stimulus-response binding is mediated unconsciously in actions such as reflexive pain withdrawal or reflexive inhalation. In learned behavior, efference binding allows one to press a button when presented with an arbitrary cue. Such a form of binding can be learned quickly (e.g., from a few trials of stimulus-response mapping; Hommel & Elsner 2009) and with little effort (Cohen-Kdoshay & Meiran 2009; Melcher et al. 2008). Learned forms of efference binding can be expressed unconsciously (Fehrer & Biederman 1962; Fehrer & Raab 1962; Hallett 2007; J. L. Taylor & McCloskey 1990; 1996). For example, subjects can select the correct motor response (one of two button presses) when confronted with subliminal stimuli, suggesting that "appropriate programs for two separate movements can be simultaneously held ready for use, and that either one can be executed when triggered by specific stimuli without subjective awareness" (Taylor & McCloskey 1996, p. 62; see review in Hallett 2007). We return to the topic of efference binding when discussing how conscious contents influence action (sect. 3.2).

In contrast to these unconscious forms of integration, people tend to be very much aware of some integrations, as when one holds one's breath while underwater or experiences an approach-avoidance conflict (Lewin 1935; N. E. Miller 1959). In the former case, one experiences the inclinations both to inhale and to not inhale. Similarly, when carrying a hot dish of food, one experiences the inclinations both to drop the dish and to not drop the dish (Morsella 2005). Unlike unconscious integrations, such conscious conflicts (Morsella 2005) reflect a form of integration that is associated not with perceptual processing, but rather with action selection (see Note 7). This form of integration has been distinguished from unconscious integrations/conflicts, such as the McGurk effect and smooth muscle conflicts (e.g., in the pupillary reflex). In short, conflicts at the action-selection stage of processing are experienced consciously, whereas conflicts at perceptual stages of processing are unconscious. It has been proposed that, unlike unconscious integrations, these integrations involve competition for control of the skeletal muscle ("skeletomotor," for short) output system (Morsella 2005). The skeletomotor output system contains the unconscious motor plans that are necessary to enact one skeletomotor act versus another (Bizzi & Mussa-Ivaldi 2004; Rizzolatti et al. 2004; Rosenbaum 2002). It stores, for example, the unconscious articulatory plans that are necessary for speech production (Buchsbaum 2013) and the plans for blinking (Graziano 2008). When these plans are stimulated sufficiently, overt actions arise.

Involving urges and other action-related inclinations, conscious conflicts occur when two streams of efference binding are trying to influence skeletomotor action simultaneously (Morsella & Bargh 2011). For example, conscious conflicts occur when one holds one's breath, suppresses uttering something, suppresses a prepotent response in a response interference paradigm, or voluntarily breathes faster for some reward. (The last example illustrates that not all cases of this kind of integration involve suppression.) These conscious conflicts appear to be triggered into existence by the activation of incompatible skeletomotor plans (see Note 8). In our "creature in the cave" scenario, this form of integration occurs when the organism is inclined to both exit the enclosure (because of the smoke) and remain within it (because of the warmth).

Thus, Morsella (2005) proposes that the primary function of consciousness is to integrate information, but only certain kinds of information – the kinds involving incompatible skeletal muscle intentions for adaptive action (e.g., holding one's breath while underwater; see Note 9). From this standpoint, the conscious field is unnecessary to integrate perceptual-level processes (as in feature binding or intersensory conflicts), smooth muscle processes (e.g., the pupillary reflex; Morsella et al. 2009a), or processes associated with motor control (discussed in sect. 3.1 below). Instead, the conscious field is necessary to integrate what appear to be multiple inclinations toward the skeletomotor output system, as captured by the principle of Parallel Responses into Skeletal Muscle (PRISM; Morsella 2005). From this perspective, and as fleshed out in the next section, it is this third kind of binding that is the most basic form of integration that requires consciousness. PRISM explains why, phenomenologically, a wink is different from a reflexive blink and from the dilation of a pupil.

2.5. Tenet: The conscious field is for adaptive voluntary action

In colloquial terms, one can conclude that consciousness is for adaptive "voluntary" action. Scientifically, consciousness can be construed as the medium that allows action processes to influence skeletomotor action collectively, leading to integrated actions (Morsella & Bargh 2011), such as holding one's breath. Just as a prism combines different colors to yield a single hue, the conscious field permits multiple response tendencies to yield a single, integrated action. Absent consciousness, skeletomotor behavior can be influenced by only one of the efference streams, leading to unintegrated actions (Morsella & Bargh 2011), such as unconsciously inhaling while underwater or reflexively removing one's hand from a hot object. Reflecting a lack of integration, unintegrated actions appear as if they are not influenced by all of the kinds of information by which they should be influenced. If a conscious content is not in the field, then it cannot influence voluntary action. For example, if the knowledge representations necessary for, say, "reality monitoring" are not in the field (e.g., due to fever), then nothing else can assume the functional influence of these contents. (This is evident in action selection in dreams, which are often irrational, and in disorders of awareness, such as sensory neglect and anosognosia.) Therefore, in voluntary action, when the appropriate contents are absent, there is no independent system or repository of knowledge that can step in to fill their role. Thus, the conscious field wholly and exclusively determines what in everyday life is called voluntary behavior. Conversely, for every voluntary action, the organism can report a conscious content responsible for that action, regardless of the veracity of the introspection (Poehlman et al. 2012).

These conclusions also reveal that it is no accident that, historically, skeletal muscle has been described as "voluntary" muscle. Since at least the nineteenth century, it has been known that, though often functioning unconsciously (as in the frequent actions of breathing and blinking), skeletal muscle is the only bodily effector that can be consciously controlled, but why this is so has never been addressed theoretically. PRISM introduces a systematic reinterpretation of this age-old fact (Morsella 2005): Skeletomotor actions are at times "consciously mediated" because they are directed by multiple systems that require consciousness to influence action collectively – what we refer to as collective influence.

Regarding the skeletomotor output system, one must consider that all processes trying to influence skeletomotor behavior must, in a sense, "go through it." Each system giving rise to inclinations has its peculiar operating principles and phylogenetic origins (Allman 2000): One system "protests" an exploratory act while another system reinforces that act (Morsella 2005). Because each skeletomotor effector can usually perform only one act at a time (e.g., one can utter only one word at a time; Lashley 1951; Wundt 1900), there must be a way in which the inclinations from the many heterogeneous systems can be "understood" and processed collectively by the skeletomotor output system. To yield adaptive action, this process must also integrate information about other things (e.g., the physical environment). To a degree greater than that of any other effector system (e.g., smooth muscle), distinct regions/systems of the brain are trying to control the skeletomotor output system in different and often opposing ways. All inclinations toward it, from primitive plans about basic needs to complex plans associated with language, must engage this system. Thus, the skeletomotor output system is the "final common path" for processes capable of influencing skeletomotor function (McFarland & Sibly 1975; Sherrington 1906). Figuratively speaking, the skeletomotor output system is akin to a single steering wheel that is controlled by multiple drivers (Morsella 2005).

3. Conscious contents

If one accepts that consciousness is in the service of voluntary action, then, from an EASE perspective and by working backward from overt action to central processing, one can ask the following question: What kinds of information and knowledge representation (Markman 1999) render voluntary action adaptive? To answer this question, one must examine the nature of conscious contents while appreciating that the varied inputs to the skeletomotor output system must (a) represent information that is essential for adaptive skeletomotor action and (b) be formatted in a manner that is understandable by the unconscious, action-related mechanisms composing the skeletomotor output system (Morsella & Bargh 2010b).

The three tenets presented in sections 3.1 through 3.3 below suggest that our "creature in the cave" is conscious of things such as external objects and the urge to eat or to affiliate (or to do other things that reflect the inclinations of the many "hot" affective/incentive systems; Metcalfe & Mischel 1999), because these things should influence the skeletomotor output system. For this creature, it is clear that additions of conscious content (see Note 10) are usually about the world, the body, or action-related inclinations (Brentano 1874; Chomsky 1988; Fodor 1980; 1998; J. A. Gray 1995; 2004; Hume 1739/1888; Koch 2004; Schopenhauer 1818/1819; Sergent & Dehaene 2004).

3.1. Tenet: Conscious contents must be “perceptual-like” in nature

We propose that the cognitive and neural processes associated with the contents of our "creature in the cave" should be perceptual-like in nature. When making this claim, we acknowledge that conscious contents are neither purely sensorial nor purely motor related; instead, they are well-crafted representations occurring at a stage of processing between sensory analysis and motor programming (Jackendoff 1990; Lashley 1956; Merker 2013c; J. Prinz 2007; W. Prinz 2003b). In everyday life, when speaking about this level of representation of external objects, we use the term percept (J. A. Gray 1995); however, this level of representation is more precisely construed as an intermediate representational format (e.g., the color red or the illusion of "da" in the McGurk effect) that links perception to action (W. Prinz 2003b). (To not introduce more jargon, we will continue to use the term percept to refer to conscious contents about the external world or the body, but we do so mindful that the term, because of its sensory connotation, can be misleading.)

The proposal that contents are perceptual-like is based on the synthesis of conclusions from diverse areas of study. First, according to the age-old sensorium hypothesis (Godwin et al. 2013; Goodale & Milner 2004; J. A. Gray 2004; Grossberg 1999; Harleß 1861; James 1890; Müller 1843; Woodworth 1915), the contents of consciousness are influenced primarily by perceptual-based (and not motor-based) events and processes, because motor processes are largely unconscious. There is substantial phenomenological evidence for this hypothesis. During action, for example, one is unconscious of the efference to the muscles that dictates which fibers should be activated at which time (Rosenbaum 2002). Although one is unconscious of these complex programs (Johnson & Haggard 2005), one is often aware of their proprioceptive and perceptual consequences (e.g., perceiving the hand grasping; Fecteau et al. 2001; Fourneret & Jeannerod 1998; Gottlieb & Mazzoni 2004; J. A. Gray 2004; Heath et al. 2008; Liu et al. 2008; Rossetti 2001). These images tend to be perceptual-like images of action outcomes (Hommel 2009; Jeannerod 2006; Pacherie 2008): "In perfectly simple voluntary acts there is nothing else in the mind but the kinesthetic idea … of what the act is to be" (James 1890, p. 771; see Note 11). It seems that we do not have direct, conscious access to motor programs, to syntax, to aspects of executive control (Crick 1995; Suhler & Churchland 2009; Tallon-Baudry 2012; van Gaal et al. 2008), or to other kinds of "efference generators" (Grossberg 1999; Morsella & Bargh 2010b; Rosenbaum 2002), including those for emotional systems (e.g., the amygdala; Anderson & Phelps 2002; LeDoux 1996; Öhman et al. 2007; Olsson & Phelps 2004). (Unconscious executive control from activated action sets exemplifies what has been historically referred to as "imageless" determining tendencies; Ach 1905/1951.)

In line with the sensorium hypothesis, examination of the liaison between action and consciousness reveals an isomorphism regarding what one is conscious of when one is (a) observing one's own action, (b) anticipating an action effect, (c) dreaming, and (d) observing the behaviors of others (Graziano 2010). In every case, it is the same, perceptual-like dimension of the experience that constitutes what is consciously available (Farrer et al. 2008; Melcher et al. 2013; Morsella & Bargh 2010b; Rizzolatti et al. 2008; Sperry 1952). Speech processing provides a compelling example. Consider Levelt's (1989) argument that, of all of the processes involved in language production, one is conscious of only a subset, whether one is speaking aloud or subvocalizing. (Language reveals that mechanisms in action production can be complex but unconscious, as in the case of syntax.) It is for this reason that, when speaking, one often does not know exactly which words one will utter next until the words are uttered or subvocalized following lexical retrieval (Levelt 1989; Slevc & Ferreira 2006). For instance, in the phonological loop, it is the phonological representation and not, say, the motor-related "articulatory code" (Ford et al. 2005) of which one is conscious during spoken or subvocalized speech (Buchsbaum & D'Esposito 2008; Fodor 1998; Rizzolatti et al. 2008). It is for this reason that Buchsbaum (2013) concluded that, in the phonological loop, the "inner voice" (i.e., the articulatory code) cannot hear itself. Although there has been substantial debate regarding the nature of conscious representations (e.g., whether they are "analogical" or "propositional"; Markman 1999), few would argue about the isomorphism among the conscious contents experienced while acting (e.g., saying "hello"), dreaming (e.g., saying "hello" in a dream), or observing the action of another (e.g., hearing "hello").

3.1.1. Perceptual-like contents as the lingua franca of action systems

Building on the sensorium hypothesis, we encountered a second reason why conscious contents must be perceptual-like. This reason pertains to the nature of the representational format. Regarding collective influence, the format of conscious contents must permit the contents to influence action systems (Freeman 2004) if there are to be perception-to-action translations (Merker 2012; W. Prinz 2003b). With this in mind, one would expect the representations involved in consciousness to be capable of being received and "understood" (i.e., to be access general; Barrett 2005) by multiple action systems in the brain. The perceptual-like representations discussed above in section 3.1 happen to meet this criterion. It has been proposed a priori, and for reasons having nothing to do with the current theorizing, that the representations that are the most "broadcastable" (i.e., received and understood by the most brain systems) happen to be perceptual in nature (Fodor 1983; Morsella & Bargh 2010b; Morsella et al. 2009b; see Note 12). Moreover, one could argue that, if contents are aimed at influencing the complex and unconscious action mechanisms of the skeletomotor output system, it makes sense that the format of these contents would be the format to which the skeletomotor output system evolved to respond (i.e., perceptual stimuli). Accordingly, the phylogenetically old response systems in the skeletomotor output system (e.g., those allowing a spider stimulus to trigger a startle response; Rakison & Derringer 2008) are likely to have evolved to deal with this kind of representation (i.e., one reflecting external objects; Bargh & Morsella 2008; LeDoux 1996). Thus, perceptual-like representations can be construed as a kind of (domain general) lingua franca that can lead to content-driven activations in the skeletomotor output system. In other words, the mechanisms in the skeletomotor output system do not possess access specificity to contents in the conscious field (because they have access to all of the contents in the conscious field), but they do possess processing specificity (as each action mechanism can be activated by only some contents; Barrett 2005).

3.2. Tenet: Conscious contents can directly activate action processes in the skeletal muscle output system

According to ideomotor theory (Greenwald 1970; Harleß 1861; Hommel 2009; Hommel et al. 2001; James 1890; Lotze 1852), the perceptual representations identified by the sensorium hypothesis provide a mechanism for goal-directed action control. In this theory, the mental image of the (perceptual-like) action effects (in the body or in the world) of an instrumental action leads to the execution of that action, with the motor programming involved being unconscious. (It is noteworthy that contemporary ideomotor accounts are agnostic regarding the role of consciousness in action control [e.g., Hommel 2013].)

In ideomotor accounts, action selection is thus driven by the selection of the representation of the perceptual consequences of a motoric act. Hence, the many conscious contents about the world and the body can be construed as "action options" for the skeletomotor output system. From this standpoint, the urge to move the arm leftward is isomorphic to the perceptual consequences of what would be observed if the act were performed. This is also the case for the "higher" abilities, such as language. For example, before making an important toast (or making a toast in an unmastered language), a person has conscious imagery regarding the words to be uttered. Thus, action selection is concerned with achieving a final end state (e.g., flicking a switch or saying "hello"), which can be realized in multiple ways, as in the case of motor equivalence (Lashley 1942), in which several different behaviors can lead to the same end state. The unconscious motor programs realizing these end states are complex and context sensitive, as in the case of co-articulation in speech (Levelt 1989; see also Zhang & Rosenbaum 2008).

According to ideomotor theory, there is a direct link between activation of action-related perceptual processes and (unconscious) action systems. Such a link is consistent with overwhelming evidence demonstrating that the presentation of action-related perceptual stimuli automatically and systematically influences action processing (see reviews of evidence in Ellis 2009; Hommel & Elsner 2009). This is evident in classic paradigms such as the flanker task (Eriksen & Eriksen 1974) and the Stroop task (Stroop 1935). In the latter, participants must name the color in which words are written. When the color and word name do not match (e.g., RED in blue font), response interference arises because the automatic (and unintentional) word-reading plan competes with the weaker (and intended) color-naming plan (Cohen et al. 1990). Behavioral and psychophysiological evidence reveals that, during such response interference, competition involves simultaneous activation of the brain processes associated with both the target- and distracter-related responses (Coles et al. 1985; DeSoto et al. 2001; Eriksen & Schultz 1979; Mattler 2005; McClelland 1979; van Veen et al. 2001). Additional evidence stems from neurological conditions (see review in Morsella & Bargh 2011) and from the aforementioned research on unconscious efference binding, in which subliminal stimuli influence motor responses (Hallett 2007).
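To make the notion of simultaneously active, competing response plans concrete, the following minimal sketch (ours, not part of the original article, and much simpler than the connectionist model of Cohen et al. 1990) simulates two response channels that accumulate evidence from a strong, automatic word-reading pathway and a weaker, intended color-naming pathway; the response whose activation first crosses a threshold is emitted, and incongruent trials take longer because the channels compete. All function names, weights, and parameter values are illustrative assumptions.

```python
# Illustrative sketch (not from the article): two response channels receive
# input from an automatic word-reading pathway (strong weight) and an
# intended color-naming pathway (weaker weight, boosted by task attention).
# On incongruent Stroop trials the channels receive conflicting input,
# so the intended response takes longer to reach threshold.

def stroop_trial(word, ink, attention_to_color=0.6, threshold=10.0, max_steps=500):
    """Return (winning_response, steps_to_threshold) for one simulated trial."""
    word_weight = 1.0   # prepotent word-reading pathway (assumed value)
    color_weight = 0.4  # weaker, intended color-naming pathway (assumed value)
    activation = {"RED": 0.0, "BLUE": 0.0}

    for step in range(1, max_steps + 1):
        # Each pathway adds evidence for "its" response on every step.
        activation[word] += word_weight * (1.0 - attention_to_color)
        activation[ink] += color_weight * (1.0 + attention_to_color)
        # Lateral inhibition: competing responses suppress each other slightly.
        for resp in activation:
            rivals = sum(v for r, v in activation.items() if r != resp)
            activation[resp] -= 0.02 * rivals
        winner = max(activation, key=activation.get)
        if activation[winner] >= threshold:
            return winner, step
    return None, max_steps


if __name__ == "__main__":
    print("congruent trial:  ", stroop_trial(word="RED", ink="RED"))   # fast
    print("incongruent trial:", stroop_trial(word="RED", ink="BLUE"))  # slower
```

Under these assumed parameters, the correct (color-naming) response still wins on incongruent trials, but it reaches threshold later than on congruent trials, mirroring the interference effect described above; the sketch is only meant to illustrate the idea of simultaneous activation of target- and distracter-related response processes.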

3.3. Tenet: Action selection as the result of inter-representational dynamics

From an ideomotor standpoint, once an action goal (e.g., pressing a button) is selected, unconscious motor efference enacts the action directly (see Note 13). From this standpoint, the only thing that prevents the activation of an action goal representation from directly influencing overt action is the activation of an incompatible action goal (James 1890; W. Prinz et al. 2009). In this framework, conscious representations of one's finger flexing, for instance, automatically lead to the flexing of one's finger, unless representations of incompatible action effects (e.g., the finger not flexing; James 1890) happen to be activated. It is important to note that the incompatibility regarding these two action effects resides, not in the conscious field, in which both action effects could be represented simultaneously, but rather in the simultaneous execution of the two action plans.

Consistent with this view of action conflicts, in one scenario, a conflict may involve representations A and B (associated with neural correlates A_NC and B_NC), and then, at a later time and in a different context, a conflict may involve representations C and D (associated with neural correlates C_NC and D_NC; Curtis & D'Esposito 2009). Importantly, the two conflicts involve separate cognitive and neural processes, suggesting that "no single area of the brain is specialized for inhibiting all unwanted actions" (Curtis & D'Esposito 2009, p. 72). Instead, representations, including those of action sets (Fuster 2003; Grafman & Krueger 2009) and rules (E. K. Miller 2000), compete for the control of action. Such competition between action-related representations is evident in the aforementioned Stroop task (Stroop 1935).

In our approach, this arrangement, in which the contents of the conscious field lead to the activation of multiple (and often competing) action plans, causes one to appreciate that, in the skeletomotor output system, there must be an (unconscious) mechanism by which one action plan can influence behavior more than other activated action plans. Such a mechanism would ensure that, when holding one's breath while underwater, for example, the action plan to refrain from inhaling would influence behavior more than that of inhaling, although the conscious field would represent both inclinations. Appreciation of such potential "bottlenecks" in action selection can serve as a valuable constraint on theorizing regarding the neural structures underlying consciousness.

Importantly, in the perception-to-action loop, consciousness represents conflicts and not necessarily the representations associated with the resolution of such conflicts, should such representations exist (Morsella 2005). This peculiar property of consciousness arises because consciousness concerns a stage of processing reflecting action options, not the mechanisms that, should they exist, represent conflict resolution. This illuminates why Chomsky (1988) observes that humans, unlike machines, are not only compelled to act one way or another but also can be inclined to act a certain way. Again, such inclinations could be construed as action options. The resolution of conflict depends not on some general property of consciousness, but on the peculiarities (e.g., relative strengths) of the systems that happen to be in conflict (Skinner 1953). Consciousness only permits conflicts to occur; it does not aim to resolve them (Morsella 2005). Each conflict is idiosyncratic and, if it is to be resolved, must require post-conscious, content-specific algorithms (e.g., one in which overt behavior is influenced most by prepotent action plans; Gold & Shadlen 2007; Logan et al. 2015). Hence, it is challenging to arrive at general principles for predicting the outcomes of conflicts involving different systems (Campbell & Misanin 1969; Krauzlis et al. 2014; see a model of countermanding in Logan et al. [2015]). The Internet provides a good analogy for the role of consciousness in conflict: The Internet permits two people from different cities to debate, but it cannot resolve conflicts between them. Another analogy would be an interpreter who translates for two parties in conflict about some issue. The interpreter is necessary for the instantiation of the conflict and for its potential resolution; the interpreter, however, cannot resolve the conflict.

In summary, to advance the identification of the neural substrates of consciousness, it is essential to keep in mind that consciousness is a phenomenon associated with perceptual-like processing and interfacing with the somatic nervous system (Fig. 1).

Figure 1. The divisions of the nervous system and place of consciousness within the system (based on Poehlman et al. Reference Poehlman, Jantz and Morsella2012), including the major divisions of the Somatic and Autonomic systems. Afference binding within systems can be unconscious. Although response systems can influence action directly, as in the case of unintegrated actions, only in virtue of consciousness can multiple response systems influence action collectively, as when one holds one's breath while underwater.

3.4. Neural evidence supports the sensorium hypothesis

The sensorium hypothesis and ideomotor theory reveal that, in terms of stages of processing, that which characterizes conscious content is the notion of perceptual afference (information arising from the world that affects sensory-perceptual systems; Sherrington Reference Sherrington1906) and corollary discharges (e.g., when subvocalizing; cf. Chambon et al. Reference Chambon, Wenke, Fleming, Prinz and Haggard2013; Christensen et al. Reference Christensen, Lundbye-Jensen, Geertsen, Petersen, Paulson and Nielsen2007; Jordan Reference Jordan2009; Obhi et al. Reference Obhi, Planetta and Scantlebury2009; Scott Reference Scott2013), both of which are cases of perceptual-like content. This hypothesizing is consistent with the idea that, insofar as consciousness must always contain some content (Brentano Reference Brentano1874; Fodor Reference Fodor1980; Reference Fodor1998; J. A. Gray Reference Gray1995; Reference Gray2004; Hume Reference Hume and Selby-Bigge1739/1888; Koch Reference Koch2004; Schopenhauer Reference Schopenhauer1818/1819; Sergent & Dehaene Reference Sergent and Dahaene2004), it is parsimonious to propose that the neural regions responsible for processing that content must be part of the neural correlate of consciousness for that content. Thus, if content X is in consciousness, then the circuits processing content X must be part of a neural correlate of consciousness (e.g., at least of X). (Of course, within such an arrangement, it may be that the region[s] processing the particular content need not be the region[s] in which that content becomes associated with the conscious field; content processing could arise in one locus of the network, but the participation of contents in the conscious field could arise at another locus of the network.) With this notion in mind, we turn to the neural evidence regarding conscious contents.

Consistent with the sensorium hypothesis, there is evidence implicating perceptual brain regions as the primary regions responsible for consciousness. For example, direct electrical stimulation of parietal areas gives rise to the conscious urge to perform an action, and stronger stimulation makes subjects believe that they actually executed the corresponding action, even though no action was performed (Desmurget et al. Reference Desmurget, Reilly, Richard, Szathmari, Mottolese and Sirigu2009; Desmurget & Sirigu Reference Desmurget and Sirigu2010; see also Farrer et al. Reference Farrer, Frey, Van Horn, Tunik, Turk, Inati and Grafton2008). In contrast, stimulating motor areas (e.g., premotor regions) produces the actual action, yet subjects believe that they did not perform any action whatsoever (see also Fried et al. Reference Fried, Katz, McCarthy, Sass, Williamson, Spencer and Spencer1991). Importantly, consistent with our foregoing conclusions, the urge to perform a motor act is associated with activation of perceptual regions.

In accord with the sensorium hypothesis, the majority of studies involving brain stimulation and consciousness have found that stimulation of perceptual (e.g., posterior) brain areas leads to changes in consciousness (e.g., haptic hallucinations). This should not be surprising, given that these regions were identified as "perceptual" in the first place by the use of self-report during brain stimulation (e.g., Penfield & Roberts Reference Penfield and Roberts1959). Self-report usually involves consciousness (see discussion in Bayne Reference Bayne, Clark, Vierkant and Kiverstein2013). In the literature, we found only one datum in which brain stimulation of a frontal (non-olfactory) area led to a conscious content. In this study (Fried et al. Reference Fried, Katz, McCarthy, Sass, Williamson, Spencer and Spencer1991, cited in Haggard Reference Haggard2008), weak electrical stimulation of the pre-supplementary motor area led to the experience of the urge to move a body part, with stronger stimulation leading to movement of the same body part. It has been proposed that such activation led to corollary discharge that was then "perceived" by perceptual areas (Chambon et al. Reference Chambon, Wenke, Fleming, Prinz and Haggard2013; Farrer et al. Reference Farrer, Frey, Van Horn, Tunik, Turk, Inati and Grafton2008; Iacoboni Reference Iacoboni, Hurley and Chater2005; Iacoboni & Dapretto Reference Iacoboni and Dapretto2006; Lau et al. Reference Lau, Rogers and Passingham2007; Melcher et al. Reference Melcher, Winter, Hommel, Pfister, Dechent and Gruber2013; Scott Reference Scott2013), which would be consistent with the sensorium hypothesis. One strong hypothesis from this line of theorizing is that activations in non-perceptual regions (e.g., motor regions) should never, independent of corollary discharge, influence the conscious field.

Consistent with the sensorium hypothesis and ideomotor theory, research reveals that a key component of the control of intentional action is feedback about ongoing action plans to perceptual areas of the brain, such as post-central cortex (Berti & Pia Reference Berti and Pia2006; Chambon et al. Reference Chambon, Wenke, Fleming, Prinz and Haggard2013; Desmurget et al. Reference Desmurget, Reilly, Richard, Szathmari, Mottolese and Sirigu2009; Farrer et al. Reference Farrer, Frey, Van Horn, Tunik, Turk, Inati and Grafton2008; Iacoboni Reference Iacoboni, Hurley and Chater2005; Miall Reference Miall2003). With this information in mind, it has been proposed that consciousness is associated not with frontal or higher-order perceptual areas, but with lower-order perceptual areas (J. R. Gray et al. Reference Gray, Bargh and Morsella2013; Liu et al. Reference Liu, Paradis, Yahia-Cherif and Tallon-Baudry2012; Tallon-Baudry Reference Tallon-Baudry2012). However, it is important to qualify that though the sensorium hypothesis specifies that consciousness involves neural circuits that, traditionally, have been associated with perception, such circuits are widespread throughout the brain and exist within both cortical and subcortical regions (Merker Reference Merker, Shimon, Tomer and Zach2012). Hence, the sensorium hypothesis is consistent with several neuroanatomical accounts of consciousness, including cortical, subcortical (e.g., thalamic), and thalamocortical accounts of consciousness. Thus, on the basis of the sensorium hypothesis alone, it is premature to dismiss subcortical accounts of consciousness (e.g., Merker Reference Merker2007; Penfield & Jasper Reference Penfield and Jasper1954; Ward Reference Ward2011).

In conclusion, at the present stage of understanding, the literature provides no clear answer regarding the neural substrates of any kind of conscious content (see treatment in Merker Reference Merker2013b; Reference Merker2013c). Based on the foregoing conclusions about conscious contents, we believe that, to illuminate this issue further, progress can be made by adopting an EASE perspective and focusing on a (relatively) tractable perceptual region – namely, that of the understudied olfactory system.

3.5. Tenet: The olfactory system provides clues regarding the neural correlates of conscious perceptual content in the sensorium

Our EASE perspective led us to the sensorium hypothesis. Now, with the same perspective, we focus on one kind of content in the sensorium. As noted in section 1.4, when attempting to isolate the substrates of a conscious content, researchers have followed Crick and Koch's (Reference Crick and Koch2003) recommendation and focused on vision. It is clear that isolating the neuroanatomical substrate of a visual conscious content remains controversial. From an EASE perspective, and based on previous research (Merrick et al. Reference Merrick, Godwin, Geisler and Morsella2014; Morsella & Bargh Reference Morsella and Bargh2007), we focus our attention instead on olfaction (see also Keller Reference Keller2011), a phylogenetically old system whose circuitry appears to be more tractable and less widespread in the brain than that of vision or higher-level processing such as music perception. As Shepherd (Reference Shepherd2007) concludes, “the basic architecture of the neural basis of consciousness in mammals, including primates, should be sought in the olfactory system, with adaptations for the other sensory pathways reflecting their relative importance in the different species” (p. 93).

Several features of this system render it a fruitful arena in which to isolate the substrates of consciousness. First, olfaction involves a primary processing area that consists of paleocortex (which contains only half of the number of layers of neocortex) and primarily only one brain region (the frontal cortex; Shepherd Reference Shepherd2007). In contrast, vision and audition often involve large-scale interactions between frontal cortex and parietal cortices. These observations reveal the relative simplicity of the anatomy of the olfactory system compared to that of other systems. Second, regarding the cortical-subcortical controversy, olfaction can reveal much about the contribution of thalamic nuclei in the generation of consciousness: Unlike most sensory modalities, afferents from the olfactory sensory system bypass the first-order, relay thalamus and directly target the cortex ipsilaterally (Shepherd & Greer Reference Shepherd, Greer and Shepherd1998; Tham et al. Reference Tham, Stevenson and Miller2009). This minimizes spread of circuitry, permitting one to draw conclusions about the necessity of first-order thalamic relays in (at least) this form of consciousness.

By studying olfaction, one can also draw some conclusions about second-order thalamic relays (e.g., the mediodorsal thalamic nucleus [MDNT]). After cortical processing, the MDNT receives inputs from olfactory cortical regions (Haberly Reference Haberly and Shepherd1998). Although it is likely that the MDNT plays a significant role in olfactory discrimination (Eichenbaum et al. Reference Eichenbaum, Shedlack and Eckmann1980; Slotnick & Risser Reference Slotnick and Risser1990; Tham et al. Reference Tham, Stevenson and Miller2011), olfactory identification, and olfactory hedonics (Sela et al. Reference Sela, Sacher, Serfaty, Yeshurun, Soroker and Sobel2009), as well as in more general cognitive processes including memory (Markowitsch Reference Markowitsch1982), learning (Mitchell et al. Reference Mitchell, Baxter and Gaffan2007), and attentional processes (Tham et al. Reference Tham, Stevenson and Miller2009; Reference Tham, Stevenson and Miller2011), we have found no evidence that a lack of olfactory consciousness results from lesions of any kind to the MDNT (see theorizing about this possibility in Plailly et al. [Reference Plailly, Howard, Gitelman and Gottfried2008]). Regarding second-order thalamic relays such as the MDNT, one must keep in mind that, in terms of circuitry, these nuclei are similar in nature to first-order relays (Sherman & Guillery Reference Sherman and Guillery2006), which are quite simple compared to, say, a cortical column.

Consistent with “cortical” theories of consciousness, Cicerone and Tanenbaum (Reference Cicerone and Tanenbaum1997) observed complete anosmia (the loss of the sense of smell) in a patient with a lesion to the left orbital gyrus of the frontal lobe. In addition, a patient with a right orbitofrontal cortex (OFC) lesion experienced complete anosmia (Li et al. Reference Li, Lopez, Osher, Howard, Parrish and Gottfried2010), suggesting that the OFC is necessary for olfactory consciousness. (It is worth mentioning that we are speaking of the OFC with respect to, not the high-level executive processes with which it has been associated, but, consistent with the sensorium hypothesis, its perceptual processing [i.e., olfactory perception].) Moreover, conscious aspects of odor discrimination have been attributed to the activities of the frontal and orbitofrontal cortices (Buck Reference Buck, Kandel, Schwartz and Jessell2000). Keller (Reference Keller2011) concludes, “There are reasons to assume that the phenomenal neural correlate of olfactory consciousness is found in the neocortical orbitofrontal cortex” (p. 6; see also Mizobuchi et al. Reference Mizobuchi, Ito, Tanaka, Sako, Sumi and Sasaki1999). (According to Barr and Kiernan [Reference Barr and Kiernan1993], olfactory consciousness depends on the piriform cortex.) However, not all lesions of the OFC have resulted in anosmia: Zatorre and Jones-Gotman (Reference Zatorre and Jones-Gotman1991) reported a study in which OFC lesions yielded severe deficits, yet all patients demonstrated normal olfactory detection.

Another output pathway from the piriform cortex projects to the insular cortex (Haberly Reference Haberly and Shepherd1998; Schoenbaum & Eichenbaum Reference Schoenbaum and Eichenbaum1995), a structure that has anatomical connections to the ventral posteromedial (VPM) nucleus of the thalamus (Price et al. Reference Price, Carmichael, Carnes, Clugnet, Kuroda, Ray, Davis and Eichenbaum1991). In light of (a) this information, (b) the conclusions presented above about the MDNT, and (c) theories in which thalamic structures play an important role in consciousness (e.g., Joliot et al. Reference Joliot, Ribary and Llinás1994; Llinás & Ribary Reference Llinás and Ribary2001; Llinás et al. Reference Llinás, Ribary, Contreras and Pedroarena1998; Ward Reference Ward2011), one could propose that olfactory consciousness depends on the integrity of the insula and thalamus. However, regarding the thalamus, it has been observed that, though thalamic lesions can impair olfactory discrimination and complex olfactory learning (Eichenbaum et al. Reference Eichenbaum, Shedlack and Eckmann1980; Martin Reference Martin2013), such lesions, including those of the VPM, never result in anosmia (Martin Reference Martin2013; Price Reference Price1985; Price et al. Reference Price, Carmichael, Carnes, Clugnet, Kuroda, Ray, Davis and Eichenbaum1991; Sela et al. Reference Sela, Sacher, Serfaty, Yeshurun, Soroker and Sobel2009). The lesion literature also reveals an additional important fact about olfactory consciousness. Olfactory consciousness does not require the involvement of any transthalamic pathway. In addition, for corticocortical connections, the olfactory system requires no “higher-order” (Sherman & Guillery Reference Sherman and Guillery2006) thalamic relays (e.g., the MDNT or VPM; Gottfried Reference Gottfried, Hummel and Welge-Lüssen2006; Price Reference Price1985; Price et al. Reference Price, Carmichael, Carnes, Clugnet, Kuroda, Ray, Davis and Eichenbaum1991). Considering these characteristics, Gottfried (Reference Gottfried, Hummel and Welge-Lüssen2006) concludes, “The most parsimonious explanation for this anatomical variation is an evolutionary one: As primitive paleocortex, the olfactory circuitry simply developed long before the emergence of a thalamic module” (p. 53). These peculiar neuroanatomical characteristics are unique to olfactory consciousness.

Regarding the role of the insula in olfactory consciousness, after reviewing the literature, we concur with Mak et al. (Reference Mak, Simmons, Gitelman and Small2005) that there is no evidence that anosmia results from damage of any kind (e.g., unilateral or bilateral lesions) to the insular cortex: “There are no reports of olfactory deficits resulting from damage to the insula” (p. 1693; see also Damasio et al. Reference Damasio, Damasio and Tranel2012; Philippi et al. Reference Philippi, Feinstein, Khalsa, Damasio, Tranel, Landini, Williford and Rudrauf2012; Tranel & Welsh-Bohmer Reference Tranel and Welsh-Bohmer2012).

Taken together, the neuroanatomical evidence presented above leads one to conclude that, in order to advance the current understanding of the neural underpinnings of consciousness, the next hypothesis to attempt to falsify is that olfactory consciousness requires cortical processes. This hypothesis is far from obvious, and it is falsifiable, because there are strong empirically based frameworks (e.g., Damasio Reference Damasio1999; Merker Reference Merker2007; Panksepp Reference Panksepp1998) proposing that consciousness is a function of subcortical processes. When these frameworks are integrated with our present treatment of the liaison between consciousness and olfactory circuits, our hypothesis could turn out to be inaccurate. For example, it might be that olfactory percepts are elaborated at a cortical level but become conscious only at some subcortical level (e.g., in the brainstem). Such a falsification of our hypothesis would advance our understanding of consciousness and the brain. Figuratively speaking, testing this particular "cortical" hypothesis provides the "lowest-hanging fruit" for identifying the neural substrates of consciousness. In this way, the olfactory system can be used as a test bed for hypotheses stemming from the cortical-subcortical controversy.

Third, from an EASE perspective, there are phenomenological and cognitive/mechanistic properties that render this system a fruitful network in which to investigate consciousness. Regarding phenomenological properties, unlike what occurs with other modalities, olfaction regularly yields no subjective experience of any kind when the system is under-stimulated, as when odorants are in low concentration, or during sensory habituation. This “experiential nothingness” (Morsella et al. Reference Morsella, Krieger and Bargh2010) is more akin to the phenomenology of the blind spot than to what one experiences when visual stimulation is absent (darkness). In the latter case, there still exists a conscious, visual experience (e.g., that of a black field). The experiential nothingness associated with olfaction yields no conscious contents of any kind to such an extent that, absent memory, one in such a circumstance would not know that one possessed an olfactory system. Hence, for our purposes, the creation of a conscious olfactory content is a true “addition” to the conscious field in that, not only does it involve the introduction of information about a particular stimulus, but also it involves the addition, from one moment to the next, of an entire modality. (See additional advantages of studying olfactory consciousness in Note 1.)

For these reasons, olfaction provides the best portal for understanding the neural correlates of additions to the conscious field. In our “creature in the cave” example, the smell of smoke is an addition to the conscious field that influences skeletomotor responses toward other conscious contents (e.g., the visual percept of the opening). Examining the neural correlates of such an addition might provide more evidence for the integration consensus. For example, it has been hypothesized that one becomes conscious of an olfactory percept only when the representation is part of a wider network involving other systems (Cooney & Gazzaniga Reference Cooney and Gazzaniga2003), such as motor (Mainland & Sobel Reference Mainland and Sobel2006) or semantic-linguistic (Herz Reference Herz2003) systems. (See review of the relationship between neural oscillations and olfactory consciousness in Merrick et al. [2014].)

In conclusion, regarding neuroanatomy, our primary hypothesis is that consciousness is associated with what has traditionally been regarded as “perceptual” regions of the brain, a hypothesis that challenges some accounts of consciousness in which consciousness is associated with executive processes in frontal cortex (e.g., Boly et al. Reference Boly, Garrido, Gosseries, Bruno, Boveroux, Schnakers, Massimini, Litvak, Laureys and Friston2011; Dehaene & Naccache Reference Dehaene and Naccache2001; Lau Reference Lau2008; Panagiotaropoulos et al. 2012; Safavi et al. Reference Safavi, Kapoor, Logothetis and Panagiotaropoulos2014; Velly et al. Reference Velly, Rey, Bruder, Gouvitsos, Witjas, Regis, Peragut and Gouin2007). Our secondary hypothesis is that olfactory consciousness can be constituted entirely by cortical circuits.

4. The generation of conscious contents and field dynamics

4.1. Tenet: Content generation is encapsulated

In our “creature in the cave” example, the addition of an olfactory content to the conscious field just “happens,” without any noteworthy effort on the part of the organism (Mainland & Sobel Reference Mainland and Sobel2006). The content arises from a particular configuration of afference (e.g., the unconscious visual and auditory afference in the McGurk effect) to what can be construed as a content generator (associated with a perceptual region). Traditionally, these content generators (e.g., for color) have been construed as “modules” (Fodor Reference Fodor1983). Such a configuration of afference may include not only bottom-up afference, but also afference from unconscious top-down processes from knowledge systems and from frontal control regions (Suhler & Churchland Reference Suhler and Churchland2009; Tallon-Baudry Reference Tallon-Baudry2012). Importantly, these generative processes that create conscious content are themselves context sensitive and unconscious (e.g., as in the McGurk effect; Lamme & Spekreijse Reference Lamme and Spekreijse2000). Regarding context sensitivity, consider that the image of a snake on a television screen triggers little if any fear, but such is not the case in a natural context.
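
To picture a content generator in the most schematic terms, the toy function below (our own illustration; the cue names and the mapping are hypothetical simplifications) treats generation as a mapping from a configuration of afference to a single conscious content, in the spirit of the McGurk effect.

```python
# Toy sketch (illustrative only): a content generator maps a configuration of
# afference (a visual cue plus an auditory cue) onto one conscious content.
# The cue names and mapping are hypothetical simplifications of the McGurk effect.

def speech_content_generator(visual_cue: str, auditory_cue: str) -> str:
    """Return the single percept yielded by this configuration of afference."""
    if visual_cue == "ga" and auditory_cue == "ba":
        return "da"          # fused percept, as in the McGurk illusion
    return auditory_cue      # otherwise, let the auditory cue dominate

print(speech_content_generator("ga", "ba"))  # -> 'da'
print(speech_content_generator("ba", "ba"))  # -> 'ba'
```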

Usually, contents enter consciousness in a manner that is more automatic, and less driven by intentions of the experiencing “agent,” than appears to be the case in the everyday life of us pensive humans (Tallon-Baudry Reference Tallon-Baudry2012; Vierkant Reference Vierkant, Clark, Kiverstein and Vierkant2013). Often, contents tend to “just happen” (Vierkant Reference Vierkant, Clark, Kiverstein and Vierkant2013). In line with these views, Helmholtz (Reference Helmholtz and Shipley1856/1961) proposed that unconscious processes can generate conscious content in a manner that resembles reflexes and other unintentional actions. When speaking about such “unconscious inferences,” Helmholtz was referring not only to the generation of the conscious contents associated with low-level perceptual processes such as depth perception, but also to higher-level, non-perceptual processes such as automatic word reading – an unnatural, intellectual process that requires years of training. Helmholtz noted that when one is confronted with an orthographic stimulus (e.g., HOUSE), the stimulus automatically triggers a conscious representation of the phonological form of the word (i.e., /haus/). Seldom in everyday life is it appreciated that, in this situation, the visual stimulus triggers a conscious content that is very different in nature from that of the environmental stimulation that brought the content into existence: The conscious representation of the phonological form of the word is associated not with the visual modality, but with audition (Levelt Reference Levelt1989).

Conscious content can be generated by unconscious inferences also in the case of action-related urges (e.g., from unconsciously generated corollary discharge). These urges are often triggered in a predictable and insuppressible manner. For example, when one holds one's breath while underwater or runs barefoot across the hot desert sand in order to reach water, one cannot help but consciously experience the inclinations to inhale or to avoid touching the hot sand, respectively (Morsella Reference Morsella2005). Regardless of the adaptiveness of the expressed actions, the conscious strife triggered by the external stimuli cannot be turned off voluntarily (Morsella Reference Morsella2005; Öhman & Mineka Reference Öhman and Mineka2001). In these cases, the externally activated action-related urges are, in a sense, insulated, or “encapsulated” (Fodor Reference Fodor1983), from voluntary control. Thus, although inclinations triggered by external stimuli can be behaviorally suppressed, they often cannot be mentally suppressed (Bargh & Morsella Reference Bargh and Morsella2008). One can think of many cases in which externally triggered conscious contents are more difficult to control than overt behavior (Allen et al. Reference Allen, Wilkins, Gazzaley and Morsella2013).

It has been argued that it is adaptive for content generation to be encapsulated in this way and for conscious contents to be incapable of directly influencing each other in the conscious field (Firestone & Scholl Reference Firestone and Scholl2014; Merrick et al. Reference Merrick, Godwin, Geisler and Morsella2014; Rolls et al. Reference Rolls, Judge and Sanghera1977). From this standpoint, the conscious, perceptual representations for instrumental action should be unaffected by the organism's beliefs or motivational states (Bindra Reference Bindra1974; Reference Bindra1978). As Rolls and Treves (Reference Rolls and Treves1998) conclude, “It would not be adaptive, for example, to become blind to the sight of food after we have eaten it to satiety” (p. 144). Similarly, it would not be adaptive for contents pertaining to incentive/motivational states to be influenced directly by other contents, such as desires and beliefs (Baumeister et al. Reference Baumeister, Vohs, DeWall and Zhang2007). For example, if one's beliefs could lead one to voluntarily “turn off” pain, guilt, or hunger, then these negative states would lose their adaptive value. Although motivation and beliefs may contaminate higher-order processes such as memory, they should have little influence over perceptual contents (Cooper et al. Reference Cooper, Sterling, Bacon and Bridgeman2012; Firestone & Scholl Reference Firestone and Scholl2014; Pylyshyn Reference Pylyshyn1984; Reference Pylyshyn1999). Such “cross-contamination” across contents would compromise the critical influence of such incentive/motivational states on behavior.

Thus, each content is independent of other contents in the conscious field, whether the contents arise from environmental stimulation or from memory. Specifically, a conscious content (e.g., "da" in the McGurk effect) cannot directly influence the nature of other contents already in the conscious field (e.g., the smell of a rose, a toothache; Morsella Reference Morsella2005). (Of course, this does not mean that the configuration of afference engendering one content cannot influence the generation of other contents – a form of context sensitivity in afference processing that occurs unconsciously [Lamme & Spekreijse Reference Lamme and Spekreijse2000; Merker Reference Merker, Shimon, Tomer and Zach2012].) Because of encapsulation, illusions persist despite one's knowledge regarding the actual nature of the stimuli (Firestone & Scholl Reference Firestone and Scholl2014; Pylyshyn Reference Pylyshyn1984).

It could be said that a given content does not "know" about its relevance to other contents (including high-level, knowledge-based contents) or to current action. When representing a food object, for example, the content does not know whether the food item will be eaten or, instead, be thrown as a weapon. This view stands in contrast to several influential theoretical frameworks in which both the activation of, and the nature of, conscious contents are influenced by what can be regarded as overarching goals or current task demands (e.g., Banerjee et al. Reference Banerjee, Chatterjee and Sinha2012; Bhalla & Proffitt Reference Bhalla and Proffitt1999; Bruner Reference Bruner1973; Bruner & Postman Reference Bruner and Postman1949; Dehaene Reference Dehaene2014; Meier et al. Reference Meier, Robinson, Crawford and Ahlvers2007; Stefanucci & Geuss Reference Stefanucci and Geuss2009). Because of the principle of encapsulation, conscious contents cannot influence each other, either at the same time or across time, which counters the everyday notion that one conscious thought can lead to another conscious thought.

In the present framework, not only do contents not influence each other in the conscious field, but also, as Merker (personal communication, June 30, 2012) concludes, content generators cannot communicate the content they generate to another content generator. For example, the generator charged with generating the color orange cannot communicate "orange" to any other content generator, because only this generator (a perceptual module) can, in a sense, understand and instantiate "orange." Hence, if the module charged with a particular content is compromised, then that content is gone from the conscious field, and no other module can "step in" to supplant that content (Kosslyn et al. Reference Kosslyn, Ganis and Thompson2001). As Merker notes, in constructing the conscious field, modules can send to each other, not messages with content, but only "activation" (see also Lamme & Spekreijse Reference Lamme and Spekreijse2000). This activation, in turn, influences whether the receiver module will generate, not the kind of content generated by the module from which it received activation, but rather its own kind of content (e.g., a sound). Because messages of content cannot be transmitted to other content generators, the neural correlates of the content for X must include activation of the module that generates X, because a given content cannot be segregated from the process by which it was engendered, as stated previously.
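
The following toy sketch (Python, purely illustrative; the class, module names, and threshold are our own constructions, not the authors') is one way to picture this constraint: modules pass bare scalar activation to one another, but each can generate only its own kind of content.

```python
# Toy sketch (illustrative only): content generators exchange bare activation,
# never content. Each module can instantiate only its own kind of content.

class ContentGenerator:
    def __init__(self, name, content_kind):
        self.name = name
        self.content_kind = content_kind   # e.g., "orange", "tone"
        self.activation = 0.0

    def receive_activation(self, amount):
        # Only a number crosses the module boundary; no content is transmitted.
        self.activation += amount

    def generate(self, threshold=1.0):
        # If sufficiently activated, the module contributes its own content kind.
        return self.content_kind if self.activation >= threshold else None

color_module = ContentGenerator("color_generator", "orange")
sound_module = ContentGenerator("sound_generator", "tone")

color_module.receive_activation(1.2)
sound_module.receive_activation(0.4)   # activation relayed from the color module

print(color_module.generate())  # -> 'orange'
print(sound_module.generate())  # -> None (and it could never produce 'orange')
```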

4.2. Tenet: Field contents must meet multiple-constraint satisfaction, be unambiguous, and appear as if apprehended from a first-person perspective

From an EASE perspective, one can ask: What does a conscious content require if it is to lead to adaptive action? To answer this question, one must first consider that, in the case of object perception (such as the opening in our “creature in the cave” example), representations must be veridical to some extent in order to render action adaptive (Firestone & Scholl Reference Firestone and Scholl2014). For example, during action selection, it would not be adaptive for a circular object to be represented with, say, straight lines and corners. Similarly, it would not be adaptive for an object on the left to be represented as if it were on the right. Therefore, for the conscious field to afford adaptive action, it must represent with some veracity the spatial relation of those objects to the organism (Gibson Reference Gibson1979). Under normal circumstances, the contents of the conscious field at each moment are complete and unambiguous. Accordingly, Merker (Reference Merker, Shimon, Tomer and Zach2012) concludes that, because of the very nature of the constitution of the systems giving rise to conscious sensory representations, these systems are incapable of representing stimulus ambiguity (e.g., as in the Necker cube), at least at one moment in time. (However, such ambiguity could exist in unconscious perceptual processing [Merker Reference Merker, Shimon, Tomer and Zach2012].) Thus, a given content emerges from polysensory configurations of afference, as in the McGurk effect, leading to the “global best estimate” of what that content should be (Helmholtz Reference Helmholtz and Shipley1856/1961; Merker Reference Merker, Shimon, Tomer and Zach2012).

Such well-constructed contents could stem from (a) the proposed, unconscious mechanisms of “multiple drafts” (Dennett Reference Dennett1991); (b) the interpretative processes of “apperception” (Wundt Reference Wundt and Titchener1902/1904); or (c) “reentrant processing,” in which a module, in order to give rise to a conscious representation, must receive feedback activation from other modules about that representation (Lamme Reference Lamme2001; Pascual-Leone & Walsh Reference Pascual-Leone and Walsh2001; Tong Reference Tong2003). For example, if visual modules X and Y construct a representation which leads to the activation of other modules, then that representation becomes conscious only after feedback activation from the other modules returns to X and Y (Di Lollo et al. Reference Di Lollo, Enns and Rensink2000; Fahrenfort et al. Reference Fahrenfort, Scholte and Lamme2007; Goodhew et al. Reference Goodhew, Dux, Lipp and Visser2012; Grossberg Reference Grossberg1999; Hamker Reference Hamker2003; Kriegel Reference Kriegel2007; Lamme Reference Lamme2001; Lee et al. Reference Lee, Kim, Noh, Choi, Hwang and Mashour2009; Llinás et al. Reference Llinás, Ribary, Contreras and Pedroarena1998; Pascual-Leone & Walsh Reference Pascual-Leone and Walsh2001; Tong Reference Tong2003). Re-entrant processing may instantiate a kind of “checks-and-balances” system for constructing accurate conscious contents that satisfy the criteria of multiple modules, a form of multiple-constraint satisfaction (Dennett Reference Dennett1991; Merker Reference Merker, Shimon, Tomer and Zach2012). In addition, feedback of this sort may underlie the phenomenon of “contextual modulation” (e.g., in figure-ground effects; Lamme & Spekreijse Reference Lamme and Spekreijse2000). More simply, this feedback may be necessary because conscious contents may require (a) high levels of activation (Kinsbourne Reference Kinsbourne, Cohen and Schooler1996) or (b) sustained activation for a prolonged period (Lau Reference Lau and Gazzaniga2009), both of which can be furnished by sustained reverberation (Hebb Reference Hebb1949). In summary, for a content generator to contribute to the conscious field and for its contents to be well-crafted, it may require the concurrent activation from both feed-forward and feedback mechanisms (Lamme & Spekreijse Reference Lamme and Spekreijse2000).
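
As a rough illustration (a minimal sketch under our own simplifying assumptions, not a model proposed here), reentrant processing can be caricatured as a rule in which a representation is admitted to the field only after feedback returns from the modules it activated.

```python
# Toy sketch (illustrative only): a representation built by feedforward modules
# counts as "conscious" only once feedback activation returns from the other
# modules it engaged. Module names are hypothetical.

def becomes_conscious(feedforward_ok: bool, feedback_sources: set,
                      required_feedback: set) -> bool:
    """Feedforward activation alone is insufficient; feedback must also arrive."""
    return feedforward_ok and required_feedback.issubset(feedback_sources)

# Visual modules X and Y built a representation; it activated modules Z and W.
required = {"Z", "W"}
print(becomes_conscious(True, {"Z", "W"}, required))  # True: reentrant loop closed
print(becomes_conscious(True, {"Z"}, required))       # False: feedback incomplete
```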

The conscious field of our “creature in the cave” includes representations of urges and external objects, which incorporate the relation between such things and the organism itself (Lehar Reference Lehar2003; Merker Reference Merker, Shimon, Tomer and Zach2012; Yates Reference Yates1985). More generally, contents are usually sensed to be different from, and separate from, the observing agent (Brentano Reference Brentano1874; Merker Reference Merker, Shimon, Tomer and Zach2012; Schopenhauer Reference Schopenhauer1818/1819). Insofar as the action selection process of the skeletomotor output system must take into account spatial distance from the organism as one of the many factors in adaptive selection, then all contents about the external world (including the body) must have a common, egocentric reference (Merker Reference Merker2013c). It would be disadvantageous for this rule to be violated and for, again, an object on the left to be represented as if on the right. Hence, most conscious contents appear as if from a first-person perspective (Gibson Reference Gibson1979; Merker Reference Merker2013c; J. Prinz Reference Prinz, Velmans and Schneider2007). The conscious field is imbued with this first-person perspective during waking, in dreaming, and for illusions in which, through clever experimental manipulations and the presentation of certain stimuli, the perspective is momentarily perceived as if from outside of the body (Ehrsson Reference Ehrsson2007).Footnote 14 From this standpoint, the demands of adaptive action selection require the creation of a first-person perspective, which is a primitive form of “self.”

4.3. Tenet: The conscious field serves as a frame that represents encapsulated contents for collective influence over, not itself, but the skeletal muscle output system

It seems that the conscious field is like a mosaic of discrete, heterogeneous contents, rendering the field combinatorial. Each content is well-crafted and unambiguous (Dietrich & Markman Reference Dietrich and Markman2003; Freeman Reference Freeman2004; Köhler Reference Köhler1947; Merker Reference Merker, Shimon, Tomer and Zach2012; Scholl Reference Scholl2001). The contents cannot directly influence each other,Footnote 15 and the content generators cannot duplicate the generative abilities of each other. Thus, the resultant contents from these modules are encapsulated from each other. These mosaic-like Gestalts (i.e., the conscious field) arise in consciousness in a discontinuous manner, with each conscious moment (lasting for fractions of a second) containing an updated version of all of the contents in the conscious field. For action to be adaptive, the refresh rate of the entire field must be faster than the quickest rate at which voluntary actions can be produced. Hence, the refresh must occur more rapidly than the rate at which the fastest actions (e.g., saccades) can be emitted (Merker Reference Merker2013c). (For theorizing regarding the temporal properties of such an updating process, see Libet [Reference Libet2004] and Merker [Reference Merker, Shimon, Tomer and Zach2012, p. 56; 2013c, p. 12].)
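
To make the timing constraint concrete, the back-of-the-envelope check below uses round numbers of our own choosing (they are illustrative assumptions, not figures reported in this article): if the fastest voluntary acts can be emitted a few times per second, the field must be refreshed at least that often.

```python
# Toy check (illustrative only; the numbers are assumptions, not reported values).

saccades_per_second = 3.0                     # assumed upper rate of the fastest voluntary acts
min_interval_s = 1.0 / saccades_per_second    # ~0.33 s between successive saccades

field_refresh_s = 0.1                         # assumed duration of one conscious "moment"

# The constraint from the text: the field must be refreshed more rapidly than
# the fastest voluntary actions can be emitted.
assert field_refresh_s < min_interval_s, "field refresh too slow for adaptive action"
print(f"refresh every {field_refresh_s}s < inter-saccade interval {min_interval_s:.2f}s")
```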

Importantly, the collective influence of the combination of contents in the conscious field is not toward the conscious field itself; instead, according to PRISM, the conscious field is apprehended by the (unconscious) mechanisms composing the skeletomotor output system. Thus, the conscious contents of blue, red, a smell, or the urge to blink are the tokens of a mysterious language understood, not by consciousness itself (nor by the physical world), but by the unconscious action mechanisms of the skeletomotor output system. Why do things appear the way they do in the field? Because, in order to benefit action selection, they must differentiate themselves from all other tokens in the field – across various modalities/systems but within the same decision space.

Although possessing elements of a “Cartesian Theater” (Dennett Reference Dennett1991), this arrangement does not introduce the “homunculus fallacy,” because, in the skeletomotor output system, there are many motor-homunculi, each incapable of duplicating the functions of the system as a whole (Dennett Reference Dennett1991). Unlike the “workspace” models associated with the integration consensus (e.g., Baars Reference Baars1988; Dehaene Reference Dehaene2014), which propose that conscious representations are “broadcast” to modules engaged in both stimulus interpretation and content generation, in our framework (as in Merker Reference Merker2007), the contents of the conscious field are directed only at response modules in the skeletomotor output system. In short, conscious contents are “sampled” only by unconscious action systems that are charged with (specifically) skeletal muscle control.

5. Passive frame theory: An action-based synthesis

No activity of mind is ever conscious.

— Lashley (Reference Lashley1956, p. 4)

It is the result of thinking, not the process of thinking, that appears spontaneously in consciousness.

— George A. Miller (Reference Miller1962, p. 56)

To reiterate, progress on the study of consciousness and the brain has suffered not so much from a lack of data, but from a lack of a suitable, internally coherent framework with which to interpret extant data (Grossberg Reference Grossberg1987). To provide such a framework, we now synthesize all of our tenets.

Consciousness is a phenomenon serving the somatic nervous system (subset consensus); it is in the service of adaptive skeletomotor function (PRISM; see Figure 1). At each moment, the conscious field is generated anew, with a new medley of contents. PRISM predicts the kinds of information that must become conscious contents. These kinds of information are about things (e.g., external objects and urges) that should influence the skeletomotor output system (the “steering wheel” associated with consciousness). To the organism, these unambiguous, well-crafted, and highly context-sensitive contents usually arise in an automatic, non-self-generated manner. The contents, which arise from configurations of afference (including top-down processes and unconscious intersensory interactions), are encapsulated from each other: One content does not “know” whether it is relevant to other contents or to ongoing action (encapsulation). Consciousness can thus be construed as a “continuous feed” system that is always “on,” even in the absence of conflict or of other forms of cross-system checking (Morsella Reference Morsella2005). In other words, the primary function of the conscious field is collective influence of otherwise encapsulated contents on the skeletomotor output system. Such an influence is essential especially under conditions of conflict; however, as a continuous feed system, this mechanism of collective influence persists even under conditions in which conflict is absent.

The contents (e.g., objects and urges) are “perceptual-like,” which is the “common format” apprehended by the action-related mechanisms composing the skeletomotor output system (sensorium hypothesis). Conscious contents are sampled only by unconscious action systems (PRISM). These contents can be construed as “action options.” Absent conflict, these action options activate unconscious efferences to the skeletomotor output system (ideomotor theory). Unconscious mechanisms such as unconscious inferences and corollary discharges from activated action plans (e.g., in the phonological loop; Scott Reference Scott2013) can trigger conscious contents. We refer to the interdependence between unconscious and conscious mechanisms as the conscious-unconscious cycle.
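
A minimal ideomotor-style sketch (our own construction; the effect-to-efference mapping and option names are hypothetical) can illustrate the proposed flow: a perceptual-like action option, when unopposed in the field, simply triggers its associated unconscious efference, whereas a co-present competing option leaves selection to the wider frame.

```python
# Toy sketch (illustrative only): ideomotor-style mapping from a perceptual-like
# action option (an anticipated action effect) to an unconscious efference.
# The mapping and option names are hypothetical.

EFFECT_TO_EFFERENCE = {
    "hand_at_cup": "reach_program",
    "subvocalized_/haus/": "articulation_program",
}

def emit_efference(action_option, field):
    """Trigger the option's motor program unless a competing option co-occurs."""
    competitors = [opt for opt in field
                   if opt != action_option and opt in EFFECT_TO_EFFERENCE]
    if competitors:
        return None   # conflict: the outcome now depends on the wider frame check
    return EFFECT_TO_EFFERENCE.get(action_option)

print(emit_efference("hand_at_cup", {"hand_at_cup"}))                          # 'reach_program'
print(emit_efference("hand_at_cup", {"hand_at_cup", "subvocalized_/haus/"}))   # None
```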

The conscious field permits collective influence of all of the heterogeneous contents upon the skeletomotor output system (PRISM). All influences on skeletomotor behavior, from the highest-level processes (e.g., language) to the lowest-level processes (e.g., pain withdrawal), must engage the skeletomotor output system.

For the selection of any skeletomotor plan to be adaptive, selection must transpire within the frame of the other conscious contents composing the conscious field at that instant. We refer to this as a "frame check." It is required for adaptive skeletomotor function and is essential for integrated actions in the context of conflict. Under certain circumstances (e.g., fast motor acts such as rapid gaze shifts), frame checks must occur quickly, as last-minute changes to courses of action often arise in the face of new information (Merker Reference Merker2013c). Hence, frame checks must occur more rapidly than the rate at which the fastest actions (e.g., saccades) can be emitted (Merker Reference Merker2013c). During adaptive action selection, anticipated action effects, actual action effects, and information about the immediate environment must exist as comparable tokens in a common decision space. Although consciousness has historically been associated with the highest levels of processing, here it is revealed that consciousness must occur at the level of processing that is shared with that of representations of the immediate external environment. Consciousness is associated only with frame checks and not with the more active aspects of the conscious-unconscious cycle (e.g., content generation, conflict resolution, motor programming).
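
The sketch below is a deliberately simple caricature (our own construction; the contents, weights, and favor/oppose relations are hypothetical) of a frame check: a candidate plan is evaluated against every content in the current field, so the same plan can fail in one field and pass in another.

```python
# Toy sketch (illustrative only): a "frame check" evaluates a candidate action
# plan against all contents currently in the field. Contents and weights are
# hypothetical, loosely modeled on the "creature in the cave" example.

def frame_check(plan, field):
    """Pass the plan only if co-present contents support it more than they oppose it."""
    support = sum(w for content, w in field.items() if favors(content, plan))
    opposition = sum(w for content, w in field.items() if opposes(content, plan))
    return support > opposition

def favors(content, plan):
    return (content, plan) in {("smell_of_smoke", "exit_cave"),
                               ("warmth", "stay_in_cave")}

def opposes(content, plan):
    return (content, plan) in {("warmth", "exit_cave"),
                               ("smell_of_smoke", "stay_in_cave")}

field_before = {"opening_percept": 0.2, "warmth": 0.6}
field_after  = {"opening_percept": 0.2, "warmth": 0.6, "smell_of_smoke": 0.9}

print(frame_check("exit_cave", field_before))  # False: warmth outweighs exiting
print(frame_check("exit_cave", field_after))   # True: the smoke tips the balance
```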

With these conclusions in mind, we now return to our “creature in the cave” scenario. Because of encapsulation, the percept of the opening of the cave is consciously available even when the opening is not relevant to ongoing action (i.e., before detection of the smoke). Regarding neural events, the content addition of the olfactory stimulus involves a wide network of regions (integration consensus). To the organism, the olfactory content “just happens.” Before the content, there was no olfactory consciousness; hence, the smell of smoke is a “true addition.” Because messages of content cannot be transmitted, the olfactory content must involve perceptual areas of processing (sensorium hypothesis). The neuroanatomical correlates of such an olfactory content remain controversial (e.g., the cortical-subcortical controversy). We believe that progress regarding such controversies will stem from further examination of the cortical aspects of olfactory consciousness.

For our "creature in the cave," the conscious content about the smell triggers a conscious content representing an avoidance tendency toward the smell. Specifically, this content about potential action is about the perceptual aspects of the to-be-produced action (sensorium hypothesis). Such a conscious content about an action can arise from activations in perceptual areas triggered by corollary discharges from unconscious motor processes (Buchsbaum Reference Buchsbaum2013; Chambon et al. Reference Chambon, Wenke, Fleming, Prinz and Haggard2013; Iacoboni Reference Iacoboni, Hurley and Chater2005; Iacoboni & Dapretto Reference Iacoboni and Dapretto2006; Lau et al. Reference Lau, Rogers and Passingham2007). Again, as with (a) anticipated action effects, (b) actual action effects, and (c) information about the immediate environment, adaptive action selection requires that the conscious contents associated with both Stage 1 (e.g., the percept of the opening and the warmth) and Stage 2 (e.g., the smell and the inclination to stay in the cave) be, in terms of their functional consequences for action selection, the same kind of thing – comparable tokens existing in the same decision space. Thus, the conscious field permits the contents about the smell and about the opening to influence action collectively.

During the frame check, the content about the potential action to exit conflicts with the content about the inclination to remain within the warm enclosure. In this case, the conflict between remaining in the enclosure and exiting is consciously experienced by the organism; however, this is only one component of all that is transpiring within the mental apparatus as a whole. Representations reflecting the outcome of the conflict (should they exist) reside outside of consciousness (which itself is primarily about action options). Such a resolution will reflect not any property of consciousness, but the peculiarities of the conflicting systems involved.

In terms of action selection, the conscious field could be construed as the evolutionary strategy for dealing with what the ethologists and behaviorists referred to as a complex of multiple discriminative stimuli (also known as a compound discriminative stimulus), in which the "stimulus control" from one discriminative stimulus depends systematically on the nature of the other discriminative stimuli composing the scene (Spear & Campbell Reference Spear and Campbell1979). In collective influence, the response to one conscious content – and the "meaning" of that content for ongoing action selection – depends exclusively on its own nature and on that of the other conscious contents at that moment in time. Thus, the conscious field permits the response to one content to be made in light of the other contents (Tsushima et al. Reference Tsushima, Sasaki and Watanabe2006), thereby yielding integrated behavior. In this process, the conscious field wholly and exclusively determines what in everyday life is called voluntary behavior.

We now apply these insights to a case involving a higher-level system (language). At Thanksgiving dinner, our simple organism (now equipped with language) perceives a stimulus that triggers (unintentionally and automatically) in consciousness the action option of uttering a famous saying. This perceptual-like, subvocalized phonological content, which "just happens" in consciousness, could stem from complex processes involving, perhaps, corollary discharge from unconscious motor centers (Mathalon & Ford Reference Mathalon and Ford2008; Scott Reference Scott2013). After a frame check, the organism does not utter the phrase, because, a moment after experiencing the subvocalization, it experiences another content (the smell of hot chocolate) that leads to an action plan (drinking) that is incompatible with speaking. The foregoing example reveals how the reiterative cycle of conscious field construction, and the frame check that each construction affords, yields the collective influence that adaptive skeletomotor action demands.

6. Implications and concluding remarks

Passive frame theory reveals that the province of consciousness in nervous function is more low-level, circumscribed, counterintuitive, and passive than what theorists have proposed previously. Because conscious contents do not know their relevance to other contents or to ongoing action, consciousness is less purposeful at one moment in time than intuition suggests. It is not only less purposeful and "all-knowing" than expected, but it also contributes only one function (albeit an essential one) to a wide range of processes, much as the Internet plays the same critical role in a varied group of events (e.g., the sale of books or the dissemination of news), and much as the human eye, though involved in various processes (e.g., hunting vs. locomoting), always performs the same function. Because consciousness contributes to a wide range of heterogeneous processes, it appears capable of doing more than it actually does.

Passive frame theory also reveals that the contribution of consciousness to nervous function is best understood from a passive, structure-based (rather than processing-based) approach. Such a perspective contrasts with contemporary approaches but accords with historical ways of describing how biological systems achieve their ends (Grafman & Krueger Reference Grafman, Krueger, Morsella, Bargh and Gollwitzer2009). Figuratively speaking, at one moment in time, there are few "moving parts" in the conscious field. (The field itself has no memory and performs no symbol manipulation; for these high-level mechanisms, it only presents, to action systems, the outputs of dedicated memory systems and of executive processes, respectively.)

Compared to the many functions attributed to consciousness, the function proposed here is by far the most basic and low-level. Because consciousness integrates processes from various systems, this role is more apparent when studying consciousness across modalities than when studying it within only one modality. Hence, the province of consciousness is best appreciated from a "systems-level" approach to the brain. The conscious-unconscious cycle of our approach also reveals the interdependence between (passive) conscious and (active) unconscious processes. (Few approaches examine the interactions between the two kinds of processes.) Last, our approach reveals that the demands of adaptive action (e.g., heterogeneous action systems must use the same effector) and the limitations of the cognitive apparatus (e.g., action selection often must occur quickly) illuminate many of the properties of consciousness, including a basic form of the sense of self.

With our framework as a foundation, future investigations could lead to a consensus regarding, for example, the neural circuitry underlying consciousness. Because the identification of the dynamic neural processes associated with consciousness depends in part on the identification of the neuroanatomical correlates of consciousness, a consensus should first be reached regarding the latter, more tractable problem. The most feasible way to reach such an identification is for investigators to cooperate across fields and attempt to isolate the correlates of consciousness at multiple levels of analysis, with increased research activity devoted to regions (a) predicted a priori, by theory, to be associated with consciousness and (b) identified as being the most experimentally tractable, as in the case of olfactory circuits. Because consciousness serves as a special interface between sensory and motor processing, theory-based predictions regarding the relationship between conscious contents and the skeletomotor output system (e.g., the link between olfactory percepts and integrated skeletomotor behavior) may advance attempts to identify the substrates of consciousness. We hope that our framework serves as a useful foundation for such collective endeavors.

Together with our EASE perspective, passive frame theory provides a fecund and internally coherent framework for the study of consciousness. Based on hypotheses from diverse areas of investigation, our synthesis reveals how consciousness serves an essential, integrative role for the somatic nervous system, a role that is intimately related to adaptive functioning of skeletal muscle (one of many effector systems in the body). When not subscribing to an EASE perspective, one could imagine ways in which the proposed contribution of consciousness to the somatic system could be realized without anything like “subjectivity.” However, these musings would reflect our human powers of imagination more than what was selected in evolution for nervous function to yield adaptive behavior.

ACKNOWLEDGMENTS

This article is dedicated to John Bargh, who reminded everyone about the power and sophistication of the unconscious. This research was supported by David Matsumoto and the Center for Human Culture and Behavior at San Francisco State University. We gratefully acknowledge Christina Merrick's assistance with the review of the olfaction literature, and Ricardo Morsella's guidance regarding the philosophical literature. We are also grateful for the advice and suggestions of T. Andrew Poehlman and Lawrence Williams, and for the assistance and support from Robert M. Krauss, Bruce L. Miller, John Paul Russo, Mark Geisler, Avi Ben-Zeev, Björn Merker, Carlos Montemayor, Allison Allen, Bernhard Hommel, Gary Marcus, J. Scott Jordan, Wolfgang Prinz, Bruce Bridgeman, and Chelsea Hunt. This project would not have been possible without the efforts of the members of the Action and Consciousness Laboratory.

Ezequiel Morsella is a theoretician and experimentalist who has devoted his entire career to investigating the differences in the brain between the conscious and unconscious circuits in the control of human action. He is the lead author of Oxford Handbook of Human Action (2009, Oxford University Press) and was the editor of the Festschrift in honor of Robert M. Krauss. His research has appeared in leading journals in neuroscience and experimental psychology, including Psychological Review, Neurocase, and Consciousness and Cognition. After his undergraduate studies at the University of Miami, he carried out his doctoral and postdoctoral studies at Columbia University and Yale University, respectively.

Christine A. Godwin is currently a Ph.D. student in the School of Psychology at the Georgia Institute of Technology. She received her Master of Arts in Psychological Research from San Francisco State University, where she co-authored several publications in the area of action and consciousness. Her current work focuses on the temporal dynamics of memory encoding processes and cognitive control.

Tiffany K. Jantz is currently a Ph.D. student of Psychology with an emphasis in cognition and cognitive neuroscience at the University of Michigan. She received her Master of Arts in Psychological Research from San Francisco State University, where she was awarded the Graduate Student Award for Distinguished Achievement in Psychology. She has published in the area of cognition, including consciousness, imagery, working memory, attention, and cognitive aging.

Stephen C. Krieger, MD, is an Associate Professor of Neurology at the Icahn School of Medicine at Mount Sinai, where he is the Director of the Neurology Residency Program and a specialist in multiple sclerosis. A noted neurology educator, he has won numerous mentorship and teaching awards at Mount Sinai, and received an American Academy of Neurology A. B. Baker teaching recognition award in 2010. His research has emphasized examining diagnosis and misdiagnosis in multiple sclerosis specifically, and clinical neurology more broadly.

Adam Gazzaley, MD, Ph.D., is Professor in Neurology, Physiology, and Psychiatry at the University of California, San Francisco, the founding director of the Neuroscience Imaging Center, and director of the Gazzaley Lab. His laboratory's most recent studies explore neuroplasticity and how we can optimize our cognitive abilities via engagement with custom-designed video games, as well as how this can be bolstered by closed-loop systems using neurofeedback and transcranial electrical stimulation. He has filed multiple patents based on his research, authored more than 100 scientific articles, and delivered more than 425 invited presentations around the world.

Footnotes

1. First, unlike in vision, few executive functions (e.g., mental rotation, symbol manipulation) are coupled with olfaction. Hence, in olfaction, one is less likely to conflate the substrates of consciousness with those of high-level executive functions (see Aru et al. 2012; Panagiotaropoulos et al. 2013). Similarly, in vision and audition, imagery can be used to preserve information in working memory through active rehearsal (Baddeley 2007), but olfactory images are difficult to couple with such operations (Betts 1909/1972; Brower 1947; Lawless 1997; Stevenson 2009). Second, olfactory experiences are less likely to occur in a self-generated, stochastic manner: Unlike in vision and audition, in which visually rich daydreams or "earworms," respectively, can arise spontaneously during an experiment and contaminate psychophysical measures, little if any self-generated olfactory experience could contaminate such measures. Last, olfaction is more segregated from the semantic system than is vision, the most studied sensory system. In vision, there are deep, inextricable relationships among perception, conceptualization, and semantics (Barsalou 1999; Kosslyn et al. 2006). Thus, when isolating the neural substrates of olfactory consciousness, one is less likely to include higher-level processes (e.g., semantic processes) associated with more than just olfactory consciousness. (See additional advantages of studying olfactory consciousness in sect. 3.5.)

2. Consistent with this consensus, evidence reveals that consciousness of some kind persists despite the nonparticipation (e.g., because of lesions) of several brain regions (Morsella et al. 2010): the cerebellum (Schmahmann 1998), amygdala (Anderson & Phelps 2002; LeDoux 1996), basal ganglia (Bellebaum et al. 2008; Ho et al. 1993), mammillary bodies (Duprez et al. 2005; Tanaka et al. 1997), insula (Damasio 2011, as cited in Voss 2011; see also Damasio 2010), and hippocampus (Crick & Koch 1990; Milner 1966; Postle 2009). In addition, investigations of "split-brain" patients (O'Shea & Corballis 2005; Wolford et al. 2004) suggest that consciousness survives the nonparticipation of the non-dominant (usually right) cerebral cortex or of the commissures linking the two cortices. Controversy surrounds the hypothesis that cortical matter is necessary for consciousness (discussed in sect. 2.3 and sect. 3.5).

3. For present purposes, unconscious events are those processes that, though capable of systematically influencing behavior, cognition, motivation, and emotion, do not influence the organism's subjective experience in such a way that the organism can directly detect, understand, or self-report the occurrence or nature of these events (Morsella & Bargh 2010a).

4. See review in Morsella and Bargh (2011). In brief, unconsciously mediated actions can be observed during unconscious states, including forms of coma/persistent vegetative states (Kern et al. 2001; Klein 1984; Laureys 2005; Pilon & Sullivan 1996) and epileptic seizures, in which automatisms arise while the actor appears to be unconscious. These unconscious automatisms include motor acts (Kokkinos et al. 2012; Kutlu et al. 2005), written and spoken (nonsense) utterances (Blanken et al. 1990; Kececi et al. 2013), singing (Doherty et al. 2002; Enatsu et al. 2011), and rolling, pedaling, and jumping (Kaido et al. 2006). Similarly, in neurological conditions in which a general consciousness is spared but actions are decoupled from consciousness, as in alien hand syndrome (Bryon & Jedynak 1972; Chan & Ross 1997), anarchic hand syndrome (Marchetti & Della Sala 1998), and utilization behavior syndrome (Lhermitte 1983), hands and arms carry out complex actions autonomously. These actions include complex goal-directed behaviors such as object manipulations (Yamadori 1997). Often, the behaviors are unintentional (Marchetti & Della Sala 1998; Suzuki et al. 2012). (See other forms of unconscious action in Bindra [1974], Milner and Goodale [1995], Weiskrantz [1992, 1997], and Westwood [2009].) In addition, actions can arise from stimuli of which the actor is unaware, as in the case of subliminal stimuli that, though imperceptible, can influence action (sect. 2.4; Ansorge et al. 2007; Hallett 2007).

5. In binocular rivalry (Logothetis & Schall 1989), an observer is presented with a different visual stimulus to each eye (e.g., an image of a house to one eye and of a face to the other). Surprisingly, however, the observer experiences seeing only one object at a time (a house and then a face), even though both images are always present.

6. Regarding neural oscillations, for example, controversy remains about the role that they play in the generation of consciousness. It has been proposed that cortical electroencephalography does not reflect conscious processing (Merker 2013b). In addition, there is controversy concerning the regions responsible for these oscillations. Consciousness-related oscillations have been proposed to reflect primarily (a) thalamic activity (Ward 2011), (b) thalamocortical activity (Joliot et al. 1994; Llinás & Ribary 2001; Llinás et al. 1998), or (c) corticocortical activity (Panagiotaropoulos et al. 2012; Schubert et al. 2006). (Regarding the role of oscillations in consciousness, see Aru & Bachmann 2009; Crick & Koch 1990; Doesburg et al. 2005; 2009; Engel & Singer 2001; Fries 2005; Hameroff 2010; Jung-Beeman et al. 2004; Meador et al. 2002; Panagiotaropoulos et al. 2012; Singer 2011; Uhlhaas et al. 2009; Wessel et al. 2012.)

7. Action selection, as when one presses one button versus another or moves leftward versus rightward, is distinct from motor control/motor programming (Proctor & Vu 2010), processes that are largely unconscious (discussed in sect. 3.1).

8. Experiments have revealed that incompatible skeletomotor intentions (e.g., to point right and left, to inhale and not inhale) produce systematic intrusions into consciousness (J. R. Gray et al. 2013; Molapour et al. 2011; Morsella et al. 2009c), but no such changes accompany conflicts involving smooth muscle (Morsella et al. 2009a) or occur at perceptual stages of processing (e.g., intersensory processing; see the quantitative review of evidence from multiple paradigms in Morsella et al. [2011]). Additional evidence stems from the observation that consciousness is required to counteract the interference effects of conflict (Desender et al. 2014).

9. Bleuler (1924) proposed that what transforms unconscious processes into conscious processes is more than just integration: the integration must involve the "ego complex." We propose that this complex is related to volition and the skeletomotor output system.

10. It is important to specify the notion of an addition of content to the conscious field. It has been proposed that consciousness cannot be "content free" but must always possess some content (Brentano 1874; Fodor 1980; 1998; J. A. Gray 1995; 2004; Hume 1739/1888; Koch 2004; Schopenhauer 1818/1819; Sergent & Dehaene 2004), such as that of a perceptual object or an action-related urge. We adopt this assumption. However, it has also been stated that contents enter consciousness, as if consciousness were a bucket into which contents enter. Thus, there is a contradiction: According to one interpretation, there can be no bucket without contents, but, according to the other interpretation, consciousness (i.e., the bucket) could exist independent of contents. Perhaps it is more accurate and parsimonious to state that new contents do not enter consciousness but become conscious, thereby joining other conscious contents. Hence, when something becomes conscious (e.g., the smell of smoke), we regard it as an "addition" to the conscious field. If consciousness is capacity-limited, then at times an addition may also be construed as a replacement, because the new content removes other content (see evidence in Goodhew et al. 2012). It is controversial whether contents in the conscious field actually constitute the field or modulate it. (For a treatment concerning whether the field is componential or unitary, see Searle [2000].) Importantly, in the present model, whether the field is componential or unitary leads to the same functional consequences, because of the encapsulation of conscious contents (sect. 4.1).

11. According to a minority of theorists (see the list in James 1890, p. 772), one is conscious of the efference to the muscles (what Wundt called the feeling of innervation; see James 1890, p. 771). In contrast, James (1890) staunchly proclaimed, "There is no introspective evidence of the feeling of innervation" (p. 775). This efference was believed to be responsible for action control (see review in Sheerer 1984). (Wundt later abandoned the feeling-of-innervation hypothesis; see Klein 1970.)

12. This proposal is based in part on the requirements of "isotropic information," which are beyond the scope of the present discussion (see Fodor 1983). As noted by Fodor (2001), in order to solve the "input" (or "routing") problem, in which the appropriate representations must be made available to the appropriate information-processing modules, the representations must be perceptual in nature (Barrett 2005; Barsalou 1999).

13. Perhaps, in addition to action selection, a "go signal" is required for action initiation (Bullock & Grossberg 1988). The need for such a mechanism is beyond the scope of the present framework.

14. From an EASE perspective, it is parsimonious to treat the sense of agency, too, as a conscious content that is experimentally manipulable. It is experienced when there is the perception of a lawful correspondence between action intentions and action outcomes (Wegner 2002) and depends in part on conceptual processing (Berti & Pia 2006; David et al. 2008; Haggard 2005; 2008; Jeannerod 2009; Synofzik et al. 2008).

15. In forms of metacognition (e.g., indirect cognitive control; Morsella et al. 2009b), there are cases in which, through top-down control, conscious contents can lead to the strategic activation of other contents; however, this mechanism is beyond the abilities of our simple organism and, for present purposes, unnecessary for appreciating the primary role of consciousness.

References

Ach, N. (1905/1951) Determining tendencies: Awareness. In: Organization and pathology of thought, ed. Rapaport, D., pp. 15–38. Columbia University Press. (Original work published in 1905.)
Aleman, B. & Merker, B. (2014) Consciousness without cortex: A hydranencephaly family survey. Acta Paediatrica 103(10):1057–65.Google Scholar
Alkire, M., Hudetz, A. & Tononi, G. (2008) Consciousness and anesthesia. Science 322(5903):876–80.Google Scholar
Allen, A. K., Wilkins, K., Gazzaley, A. & Morsella, E. (2013) Conscious thoughts from reflex-like processes: A new experimental paradigm for consciousness research. Consciousness and Cognition 22:1318–31.CrossRefGoogle ScholarPubMed
Allman, J. M. (2000) Evolving brains. Scientific American Library.Google Scholar
Anderson, A. K. & Phelps, E. A. (2002) Is the human amygdala critical for the subjective experience of emotion? Evidence of intact dispositional affect in patients with amygdala lesions. Journal of Cognitive Neuroscience 14:709–20.Google Scholar
Ansorge, U., Neumann, O., Becker, S., Kalberer, H. & Cruse, H. (2007) Sensorimotor supremacy: Investigating conscious and unconscious vision by masked priming. Advances in Cognitive Psychology 3:257–74.CrossRefGoogle Scholar
Arkin, R. C. (1998) Behavior-based robotics. MIT Press.Google Scholar
Aru, J. & Bachmann, T. (2009) Occipital EEG correlates of conscious awareness when subjective target shine-through and effective visual masking are compared: Bifocal early increase in gamma power and speed-up of P1. Brain Research 1271:60–73.
Aru, J., Bachmann, T., Singer, W. & Melloni, L. (2012) Distilling the neural correlates of consciousness. Neuroscience and Biobehavioral Reviews 36:737–46.Google Scholar
Baars, B. (1988) A cognitive theory of consciousness. Cambridge University Press.Google Scholar
Baars, B. J. (1998) The function of consciousness: Reply. Trends in Neurosciences 21(5):201.Google Scholar
Baars, B. J. (2002) The conscious access hypothesis: Origins and recent evidence. Trends in Cognitive Sciences 6(1):47–52.
Baars, B. J. (2005) Global workspace theory of consciousness: Toward a cognitive neuroscience of human experience. Progress in Brain Research 150:45–53.
Baddeley, A. D. (2007) Working memory, thought and action. Oxford University Press.Google Scholar
Banerjee, P., Chatterjee, P. & Sinha, J. (2012) Is it light or dark? Recalling moral behavior changes perception of brightness. Psychological Science 23:407–409.
Banks, W. P. (1995) Evidence for consciousness. Consciousness and Cognition 4:270–72.CrossRefGoogle ScholarPubMed
Bargh, J. A. & Morsella, E. (2008) The unconscious mind. Perspectives on Psychological Science 3:7379.Google Scholar
Barr, M. L. & Kiernan, J. A. (1993) The human nervous system: An anatomical viewpoint, 6th edition. Lippincott.Google Scholar
Barrett, H. C. (2005) Enzymatic computation and cognitive modularity. Mind and Language 20:259–87.Google Scholar
Barsalou, L. W. (1999) Perceptual symbol systems. Behavioral and Brain Sciences 22:577–609.
Baumeister, R. F., Vohs, K. D., DeWall, N. & Zhang, L. (2007) How emotion shapes behavior: Feedback, anticipation, and reflection, rather than direct causation. Personality and Social Psychology Review 11:167–203.
Bayne, T. (2013) Agency as a marker of consciousness. In: Decomposing the will, ed. Clark, A., Vierkant, T. & Kiverstein, J., pp. 160–81. Oxford University Press.Google Scholar
Bellebaum, C., Koch, B., Schwarz, M. & Daum, I. (2008) Focal basal ganglia lesions are associated with impairments in reward-based reversal learning. Brain 131:829–41.CrossRefGoogle ScholarPubMed
Berti, A. & Pia, L. (2006) Understanding motor awareness through normal and pathological behavior. Current Directions in Psychological Science 15:245–50.CrossRefGoogle Scholar
Betts, G. H. (1909/1972) The distribution and functions of mental imagery. AMS Press. (Original work published in 1909 as author's thesis for Columbia University). 1972 reprint edition issued in series: Teachers College, Columbia University. Contributions to Education, No. 26.Google Scholar
Bhalla, M. & Proffitt, D. R. (1999) Visual-motor recalibration in geographical slant perception. Journal of Experimental Psychology: Human Perception and Performance 25:1076–96.Google Scholar
Bindra, D. (1974) A motivational view of learning, performance, and behavior modification. Psychological Review 81:199–213.
Bindra, D. (1978) How adaptive behavior is produced: A perceptual-motivational alternative to response-reinforcement. Behavioral and Brain Sciences 1:41–91.
Bizzi, E. & Mussa-Ivaldi, F. A. (2004) Toward a neurobiology of coordinate transformations. In: The cognitive neurosciences III, ed. Gazzaniga, M. S., pp. 413–25. MIT Press.Google Scholar
Blake, R. & Logothetis, N. K. (2002) Visual competition. Nature Reviews Neuroscience 3:13–21.
Blanken, G., Wallesch, C.-W. & Papagno, C. (1990) Dissociations of language functions in aphasics with speech automatisms (recurring utterances). Cortex 26:41–63.
Bleuler, E. (1924) Textbook of psychiatry, trans. Brill, A. A.. Macmillan.Google Scholar
Block, N. (1995b) On a confusion about a function of consciousness. (Target article) Behavioral and Brain Sciences 18(2):227–47; discussion 247–87.CrossRefGoogle Scholar
Boly, M., Garrido, M. I., Gosseries, O., Bruno, M. A., Boveroux, P., Schnakers, C., Massimini, M., Litvak, V., Laureys, S. & Friston, K. (2011) Preserved feedforward but impaired top-down processes in the vegetative state. Science 332:858–62.Google Scholar
Boveroux, P., Vanhaudenhuyse, A., Bruno, M. A., Noirhomme, Q., Lauwick, S., Luxen, A., Degueldre, C., Plenevaux, A., Schnakers, C., Phillips, C., Brichant, J. F., Bonhomme, V., Maquet, P., Greicius, M. D., Laureys, S. & Boly, M. (2010) Breakdown of within- and between-network resting state functional magnetic resonance imaging connectivity during propofol-induced loss of consciousness. Anesthesiology 113:1038–53.Google Scholar
Brentano, F. (1874) Psychology from an empirical standpoint. Oxford University Press.Google Scholar
Brower, D. (1947) The experimental study of imagery: II. The relative predominance of various imagery modalities. Journal of General Psychology 37:199–200.
Bruner, J. S. (1973) Going beyond the information given: Studies in the psychology of knowing. Norton.Google Scholar
Bruner, J. S. & Postman, L. (1949) On the perception of incongruity: A paradigm. Journal of Personality 18:206–23.CrossRefGoogle ScholarPubMed
Bryon, S. & Jedynak, C. P. (1972) Troubles du transfert interhemispherique: A propos de trois observations de tumeurs du corps calleux. Le signe de la main etrangère. Revue Neurologique 126:257–66.Google Scholar
Buchsbaum, B. R. (2013) The role of consciousness in the phonological loop: Hidden in plain sight. Frontiers in Psychology 4, article 496. (Online publication, Open Access e- journal). doi:10.3389/fpsyg.2013.00496.Google Scholar
Buchsbaum, B. R. & D'Esposito, M. (2008) The search for the phonological store: From loop to convolution. Journal of Cognitive Neuroscience 20:762–78.Google Scholar
Buck, L. B. (2000) Smell and taste: The chemical senses. In: Principles of neural science, 4th edition, ed. Kandel, E. R., Schwartz, J. H. & Jessell, T. M., pp. 625–47. McGraw-Hill.Google Scholar
Bullock, D. & Grossberg, S. (1988) Neural dynamics of planned arm movements: Emergent invariants and speed-accuracy properties during trajectory formation. Psychological Review 95:49–90.
Buzsáki, G. (2006) Rhythms of the brain. Oxford University Press.CrossRefGoogle Scholar
Campbell, B. A. & Misanin, J. R. (1969) Basic drives. Annual Review of Psychology 20:57–84.
Carlson, N. R. (1994) Physiology of behavior. Allyn and Bacon.Google Scholar
Chambon, V., Wenke, D., Fleming, S. M., Prinz, W. & Haggard, P. (2013) An online neural substrate for sense of agency. Cerebral Cortex 23:1031–37.Google Scholar
Chan, J.-L. & Ross, E. D. (1997) Alien hand syndrome: Influence of neglect on the clinical presentation of frontal and callosal variants. Cortex 33:287–99.Google Scholar
Chomsky, N. (1988) Language and problems of knowledge: The Managua lectures. MIT Press.Google Scholar
Christensen, M. S., Lundbye-Jensen, J., Geertsen, S. S., Petersen, T. H., Paulson, O. B. & Nielsen, J. B. (2007) Premotor cortex modulates somatosensory cortex during voluntary movements without proprioceptive feedback. Nature Neuroscience 10:417–19.Google Scholar
Cicerone, K. D. & Tanenbaum, L. N. (1997) Disturbance of social cognition after traumatic orbitofrontal brain injury. Archives of Clinical Neuropsychology 12:173–88.Google Scholar
Clark, A. (2002) Is seeing all it seems? Action, reason and the grand illusion. Journal of Consciousness Studies 9:181–202.
Coenen, A. M. L. (1998) Neuronal phenomena associated with vigilance and consciousness: From cellular mechanisms to electroencephalographic patterns. Consciousness and Cognition 7:42–53.
Cohen, J. D., Dunbar, K. & McClelland, J. L. (1990) On the control of automatic processes: A parallel distributed processing account of the Stroop effect. Psychological Review 97:332–61.CrossRefGoogle ScholarPubMed
Cohen, M. A. & Dennett, D. C. (2011) Consciousness cannot be separated from function. Trends in Cognitive Sciences 15:358–64.Google Scholar
Cohen-Kdoshay, O. & Meiran, N. (2009) The representation of instructions operates like a prepared reflex: Flanker compatibility effects found in the first trial following S–R instructions. Experimental Psychology 56:128–33.Google Scholar
Coles, M. G. H., Gratton, G., Bashore, T. R., Eriksen, C. W. & Donchin, E. (1985) A psychophysiological investigation of the continuous flow model of human information processing. Journal of Experimental Psychology: Human Perception and Performance 11:529–53.Google ScholarPubMed
Cooney, J. W. & Gazzaniga, M. S. (2003) Neurological disorders and the structure of human consciousness. Trends in Cognitive Sciences 7:161–66.Google Scholar
Cooper, A. D., Sterling, C. P., Bacon, M. P. & Bridgeman, B. (2012) Does action affect perception or memory? Vision Research 62:235–40.Google Scholar
Crick, F. (1995) The astonishing hypothesis: The scientific search for the soul. Touchstone.Google Scholar
Crick, F. & Koch, C. (1990) Toward a neurobiological theory of consciousness. Seminars in the Neurosciences 2:263–75.Google Scholar
Crick, F. & Koch, C. (1995) Are we aware of neural activity in primary visual cortex? Nature 375:121–23.Google Scholar
Crick, F. & Koch, C. (1998) Consciousness and neuroscience. Cerebral Cortex 8:97–107.
Crick, F. & Koch, C. (2003) A framework for consciousness. Nature Neuroscience 6:18.Google Scholar
Curtis, C. E. & D'Esposito, M. (2009) The inhibition of unwanted actions. In: Oxford handbook of human action, ed. Morsella, E., Bargh, J. A. & Gollwitzer, P. M., pp. 72–97. Oxford University Press.
Damasio, A. R. (1989) Time-locked multiregional retroactivation: A systems-level proposal for the neural substrates of recall and recognition. Cognition 33:25–62.
Damasio, A. R. (1999) The feeling of what happens: Body and emotion in the making of consciousness. Harvest Books.Google Scholar
Damasio, A. R. (2010) Self comes to mind: Constructing the conscious brain. Pantheon.Google Scholar
Damasio, A. R., Damasio, H. & Tranel, D. (2012) Persistence of feelings and sentience after bilateral damage of the insula. Cerebral Cortex 23:833–46.Google Scholar
David, N., Newen, A. & Vogeley, K. (2008) The “sense of agency” and its underlying cognitive and neural mechanisms. Consciousness and Cognition 17:523–34.Google Scholar
Dehaene, S. (2014) Consciousness and the brain: Deciphering how the brain codes our thoughts. Viking.Google Scholar
Dehaene, S. & Naccache, L. (2001) Towards a cognitive neuroscience of consciousness: Basic evidence and a workspace framework. Cognition 79(1–2):1–37.
Del Cul, A., Baillet, S. & Dehaene, S. (2007) Brain dynamics underlying the nonlinear threshold for access to consciousness. PLoS Biology 5:e260.Google Scholar
Dennett, D. C. (1991) Consciousness explained. Little, Brown.Google Scholar
Desender, K., van Opstal, F. V. & van den Bussche, E. (2014) Feeling the conflict: The crucial role of conflict experience in adaptation. Psychological Science 25:675–83.Google Scholar
Desmurget, M., Reilly, K. T., Richard, N., Szathmari, A., Mottolese, C. & Sirigu, A. (2009) Movement intention after parietal cortex stimulation in humans. Science 324(5928):811–13.Google Scholar
Desmurget, M. & Sirigu, A. (2010) A parietal-premotor network for movement intention and motor awareness. Trends in Cognitive Sciences 13:411–19.Google Scholar
DeSoto, M. C., Fabiani, M., Geary, D. C. & Gratton, G. (2001) When in doubt, do it both ways: Brain evidence of the simultaneous activation of conflicting responses in a spatial Stroop task. Journal of Cognitive Neuroscience 13:523–36.Google Scholar
de Waal, F. B. M. (2002) Evolutionary psychology: The wheat and the chaff. Current Directions in Psychological Science 11:187–91.Google Scholar
Dietrich, E. & Markman, A. B. (2003) Discrete thoughts: Why cognition must use discrete representations. Mind and Language 18:95–119.
Di Lollo, V., Enns, J. T. & Rensink, R. A. (2000) Competition for consciousness among visual events: The psychophysics of reentrant visual pathways. Journal of Experimental Psychology: General 129:481–507.
Doesburg, S. M., Green, J. L., McDonald, J. J. & Ward, L. M. (2009) Rhythms of consciousness: Binocular rivalry reveals large-scale oscillatory network dynamics mediating visual perception. PLoS ONE 4:e0006142.Google Scholar
Doesburg, S. M., Kitajo, K. & Ward, L. M. (2005) Increased gamma-band synchrony precedes switching of conscious perceptual objects in binocular rivalry. NeuroReport 16:1139–42.Google Scholar
Doherty, M. J., Wilensky, A. J., Holmes, M. D., Lewis, D. H., Rae, J. & Cohn, G. H. (2002) Singing seizures. Neurology 59:1435–38.Google Scholar
Duprez, T. P., Serieh, B. A. & Reftopoulos, C. (2005) Absence of memory dysfunction after bilateral mammillary body and mammillothalamic tract electrode implantation: Preliminary experience in three patients. American Journal of Neuroradiology 26:195–98.Google Scholar
Edelman, G. M. (1989) The remembered present. Basic Books.Google Scholar
Edelman, G. M. & Tononi, G. (2000) A universe of consciousness: How matter becomes imagination. Basic Books.Google Scholar
Ehrsson, H. H. (2007) The experimental induction of out-of-body experiences. Science 317(5841):1048.Google Scholar
Eichenbaum, H., Shedlack, K. J. & Eckmann, K. W. (1980) Thalamocortical mechanisms in odor-guided behavior. Brain, Behavior and Evolution 17:255–75.CrossRefGoogle ScholarPubMed
Einstein, A. & Infeld, L. (1938/1967) The evolution of physics. Cambridge University Press/Touchstone. (Original work published in 1938).Google Scholar
Ellis, R. (2009) Interactions between action and visual objects. In: Oxford handbook of human action, ed. Morsella, E., Bargh, J. A. & Gollwitzer, P. M., pp. 214–24. Oxford University Press.Google Scholar
Enatsu, R., Hantus, S., Gonzalez-Martinez, J. & So, N. (2011) Ictal singing due to left frontal lobe epilepsy: A case report and review of the literature. Epilepsy and Behavior 22:404–406.
Engel, A. K. & Singer, W. (2001) Temporal binding and the neural correlates of sensory awareness. Trends in Cognitive Sciences 5:16–25.
Eriksen, B. A. & Eriksen, C. W. (1974) Effects of noise letters upon the identification of a target letter in a nonsearch task. Perception and Psychophysics 16:143–49.Google Scholar
Eriksen, C. W. & Schultz, D. W. (1979) Information processing in visual search: A continuous flow conception and experimental results. Perception and Psychophysics 25:249–63.Google Scholar
Fahrenfort, J. J., Scholte, H. S. & Lamme, V. A. (2007) Masking disrupts reentrant processing in human visual cortex. Journal of Cognitive Neuroscience 19:1488–97.Google Scholar
Farrer, C., Frey, S. H., Van Horn, J. D., Tunik, E., Turk, D., Inati, S. & Grafton, S. T. (2008) The angular gyrus computes action awareness representations. Cerebral Cortex 18:254–61.Google Scholar
Fecteau, J. H., Chua, R., Franks, I. & Enns, J. T. (2001) Visual awareness and the online modification of action. Canadian Journal of Experimental Psychology 55:104–10.Google Scholar
Fehrer, E. & Biederman, I. (1962) A comparison of reaction time and verbal report in the detection of masked stimuli. Journal of Experimental Psychology 64:126–30.Google Scholar
Fehrer, E. & Raab, D. (1962) Reaction time to stimuli masked by metacontrast. Journal of Experimental Psychology 63:143–47.Google Scholar
Firestone, C. & Scholl, B. J. (2014) "Top-down" effects where none should be found: The El Greco fallacy in perception research. Psychological Science 25:38–46.
Fodor, J. A. (1980) The language of thought. Harvard University Press.Google Scholar
Fodor, J. A. (1983) Modularity of mind: An essay on faculty psychology. MIT Press.Google Scholar
Fodor, J. A. (1998) Concepts: Where cognitive science went wrong. Oxford University Press.Google Scholar
Fodor, J. A. (2001) The mind doesn't work that way: The scope and limits of computational psychology. MIT Press.Google Scholar
Ford, J. M., Gray, M., Faustman, W. O., Heinks, T. H. & Mathalon, D. H. (2005) Reduced gamma-band coherence to distorted feedback during speech when what you say is not what you hear. International Journal of Psychophysiology 57:143–50.Google Scholar
Fourneret, P. & Jeannerod, M. (1998) Limited conscious monitoring of motor performance in normal subjects. Neuropsychologia 36:1133–40.Google Scholar
Freeman, W. J. (1991) The physiology of perception. Scientific American 264:78–85.
Freeman, W. J. (2004) William James on consciousness, revisited. Chaos and Complexity Letters 1:17–42.
Fried, I., Katz, A., McCarthy, G., Sass, K. J., Williamson, P., Spencer, S. S. & Spencer, D. D. (1991) Functional organization of human supplementary motor cortex studied by electrical stimulation. The Journal of Neuroscience 11(11):3656–66.Google Scholar
Friedman-Hill, S. R., Robertson, L. C. & Treisman, A. (1995) Parietal contributions to visual feature binding: Evidence from a patient with bilateral lesions. Science 269:853–55.Google Scholar
Fries, P. (2005) A mechanism for cognitive dynamics: Neuronal communication through neuronal coherence. Trends in Cognitive Sciences 9:474–80.Google Scholar
Frith, C. D. (2010) What is consciousness for? Pragmatics and Cognition 18(3):497–551.
Fuster, J. M. (2003) Cortex and mind: Unifying cognition. Oxford University Press.Google Scholar
Gibson, J. J. (1979) The ecological approach to visual perception. Houghton Mifflin.Google Scholar
Godwin, C. A., Gazzaley, A. & Morsella, E. (2013) Homing in on the brain mechanisms linked to consciousness: Buffer of the perception-and-action interface. In: The unity of mind, brain and world: Current perspectives on a science of consciousness, ed. Pereira, A. Jr. & Lehmann, D., pp. 43–76. Cambridge University Press.
Gold, J. I. & Shadlen, M. N. (2007) The neural basis of decision making. Annual Reviews of Neuroscience 30:535–74.Google Scholar
Goodale, M. & Milner, D. (2004) Sight unseen: An exploration of conscious and unconscious vision. Oxford University Press.Google Scholar
Goodhew, S. C., Dux, P. E., Lipp, O. V. & Visser, T. A. W. (2012) Understanding recovery from object substitution masking. Cognition 122:405–15.CrossRefGoogle ScholarPubMed
Gottfried, J. A. (2006) Smell: Central nervous processing. In: Taste and smell: An update, ed. Hummel, T. & Welge-Lüssen, A., pp. 44–69. Karger.
Gottlieb, J. & Mazzoni, P. (2004) Neuroscience: Action, illusion, and perception. Science 303:317–18.Google Scholar
Gould, S. J. (1977) Ever since Darwin: Reflections in natural history. Norton.Google Scholar
Grafman, J. & Krueger, F. (2009) The prefrontal cortex stores structured event complexes that are the representational basis for cognitively derived actions. In: Oxford handbook of human action, ed. Morsella, E., Bargh, J. A. & Gollwitzer, P. M., pp. 197–213. Oxford University Press.
Gray, J. A. (1995) The contents of consciousness: A neuropsychological conjecture. Behavioral and Brain Sciences 18:659–76.Google Scholar
Gray, J. A. (2004) Consciousness: Creeping up on the hard problem. Oxford University Press.Google Scholar
Gray, J. R., Bargh, J. A. & Morsella, E. (2013) Neural correlates of the essence of conscious conflict: fMRI of sustaining incompatible intentions. Experimental Brain Research 229:453–65.Google Scholar
Graziano, M. S. A. (2008) The intelligent movement machine: An ethological perspective on the primate motor system. Oxford University Press.Google Scholar
Graziano, M. S. A. (2010) God, soul, mind, brain: A neuroscientist's reflections on the spirit world. A LeapSci Book.Google Scholar
Greenwald, A. G. (1970) Sensory feedback mechanisms in performance control: With special reference to the ideomotor mechanism. Psychological Review 77:73–99.
Greenwald, A. G. & Pratkanis, A. R. (1984) The self. In: Handbook of social cognition, ed. Wyer, R. S. & Srull, T. K., pp. 129–78. Erlbaum.Google Scholar
Grossberg, S. (1987) The adaptive brain, vol. 1. North Holland.Google Scholar
Grossberg, S. (1999) The link between brain learning, attention, and consciousness. Consciousness and Cognition 8(1):1–44.
Haberly, L. B. (1998) Olfactory cortex. In: The synaptic organization of the brain, 4th edition, ed. Shepherd, G. M., pp. 377–416. Oxford University Press.
Haggard, P. (2005) Conscious intention and motor cognition. Trends in Cognitive Sciences 9:290–95.Google Scholar
Haggard, P. (2008) Human volition: Towards a neuroscience of will. Nature Reviews: Neuroscience 9(12):934–46. doi: 10.1038/nrn2497.Google Scholar
Haggard, P., Aschersleben, G., Gehrke, J. & Prinz, W. (2002a) Action, binding and awareness. In: Common mechanisms in perception and action: Attention and performance, vol. XIX, ed. Prinz, W. & Hommel, B., pp. 266–85. Oxford University Press.Google Scholar
Hallett, M. (2007) Volitional control of movement: The physiology of free will. Clinical Neurophysiology 117:1179–92.Google Scholar
Hameroff, S. (2010) The "conscious pilot" – dendritic synchrony moves through the brain to mediate consciousness. Journal of Biological Physics 36:71–93.
Hamker, F. H. (2003) The reentry hypothesis: Linking eye movements to visual perception. Journal of Vision 11:808–16.Google Scholar
Harleß, E. (1861) Der Apparat des Willens [The apparatus of the will]. Zeitschrift für Philosophie und philosophische Kritik 38:499–507.
Harley, T. A. (1993) Phonological activation of semantic competitors during lexical access in speech production. Language and Cognitive Processes 8:291–309.
Heath, M., Neely, K. A., Yakimishyn, J. & Binsted, G. (2008) Visuomotor memory is independent of conscious awareness of target features. Experimental Brain Research 188:517–27.Google Scholar
Hebb, D. O. (1949) The organization of behavior: A neuropsychological theory. Wiley.Google Scholar
Helmholtz, H. von (1856/1961) Treatise of physiological optics: Concerning the perceptions in general. In: Classics in psychology, ed. Shipley, T., pp. 79–127. Philosophy Library. (Original work published in 1856.)
Herz, R. S. (2003) The effect of verbal context on olfactory perception. Journal of Experimental Psychology: General 132:595–606.
Hesslow, G. (2002) Conscious thought as simulation of behavior and perception. Trends in Cognitive Sciences 6:242–47.Google Scholar
Ho, V. B., Fitz, C. R., Chuang, S. H. & Geyer, C. A. (1993) Bilateral basal ganglia lesions: Pediatric differential considerations. RadioGraphics 13:269–92.Google Scholar
Hommel, B. (2009) Action control according to TEC (theory of event coding). Psychological Research 73(4):512–26.Google Scholar
Hommel, B. (2013) Dancing in the dark: No role for consciousness in action control. Frontiers in Psychology 4, article 380. (Online journal). doi:10.3389/fpsyg.2013.00380.Google Scholar
Hommel, B. & Elsner, B. (2009) Acquisition, representation, and control of action. In: Oxford handbook of human action, ed. Morsella, E., Bargh, J. A. & Gollwitzer, P. M., pp. 371–98. Oxford University Press.Google Scholar
Hommel, B., Müsseler, J., Aschersleben, G. & Prinz, W. (2001) The theory of event coding: A framework for perception and action planning. Behavioral and Brain Sciences 24(6):849–37.Google Scholar
Hume, D. (1739/1888) A treatise on human nature, ed. Selby-Bigge, L. A.. Oxford University Press. (Original work published in 1739).Google Scholar
Hummel, F. & Gerloff, C. (2005) Larger interregional synchrony is associated with greater behavioral success in a complex sensory integration task in humans. Cerebral Cortex 15:670–78.Google Scholar
Iacoboni, M. (2005) Understanding others: Imitation, language and empathy. In: Perspectives on imitation: From mirror neurons to memes, ed. Hurley, S. & Chater, N., pp. 77–99. MIT Press.
Iacoboni, M. & Dapretto, M. (2006) The mirror neuron system and the consequences of its dysfunction. Nature Reviews: Neuroscience 7:942–51.Google Scholar
Jackendoff, R. S. (1990) Consciousness and the computational mind. MIT Press.Google Scholar
Jackson, F. (1986) What Mary didn't know. The Journal of Philosophy 83:291–95.
James, W. (1890) The principles of psychology, vols. 1 & 2. Holt/Dover.Google Scholar
Jeannerod, M. (2006) Motor cognition: What action tells the self. Oxford University Press.Google Scholar
Jeannerod, M. (2009) The sense of agency and its disturbances in schizophrenia: A reappraisal. Experimental Brain Research 196:527–32.Google Scholar
Johnson, H. & Haggard, P. (2005) Motor awareness without perceptual awareness. Neuropsychologia 43:227–37.Google Scholar
Joliot, M., Ribary, U. & Llinás, R. (1994) Human oscillatory brain activity near 40 Hz coexists with cognitive temporal binding. Proceedings of the National Academy of Sciences USA 91:11748–51.Google Scholar
Jordan, J. S. (2009) Forward-looking aspects of perception-action coupling as a basis for embodied communication. Discourse Processes 46:127–44.Google Scholar
Jung-Beeman, M., Bowden, E. M., Haberman, J., Frymiare, J. L., Arambel-Liu, S., Greenblatt, R., Reber, P. J. & Kounios, J. (2004) Neural activity when people solve verbal problems with insight. PLoS Biology 2:500–10.Google Scholar
Kaido, T., Otsuki, T., Nakama, H., Kaneko, Y., Kubota, Y., Sugai, K. & Saito, O. (2006) Complex behavioral automatism arising from insular cortex. Epilepsy and Behavior 8:315–19.Google Scholar
Kececi, H., Degirmenci, Y. & Gumus, H. (2013) Two foreign language automatisms in complex partial seizures. Epilepsy and Behavior Case Reports 1:7–9.
Keller, A. (2011) Attention and olfactory consciousness. Frontiers in Psychology 2, article 380. (Online journal). doi: 10.3389/fpsyg.2011.00380.Google Scholar
Kern, M. K., Jaradeh, S., Arndorfer, R. C. & Shaker, R. (2001) Cerebral cortical representation of reflexive and volitional swallowing in humans. American Journal of Physiology: Gastrointestinal and Liver Physiology 280:G354–60.Google Scholar
Kinsbourne, M. (1996) What qualifies a representation for a role in consciousness? In: Scientific approaches to consciousness, ed. Cohen, J. D. & Schooler, J. W., pp. 335–55. Erlbaum.Google Scholar
Kinsbourne, M. (2000) How is consciousness expressed in the cerebral activation manifold? Brain and Mind 2:265–74.Google Scholar
Klein, D. B. (1970) A history of scientific psychology: Its origins and philosophical backgrounds. Basic Books.Google Scholar
Klein, D. B. (1984) The concept of consciousness: A survey. University of Nebraska Press.Google Scholar
Koch, C. (2004) The quest for consciousness: A neurobiological approach. Roberts.Google Scholar
Koch, C. (2012) Consciousness: Confessions of a romantic reductionist. MIT Press.Google Scholar
Koch, C. & Greenfield, S. A. (2007) How does consciousness happen? Scientific American 297:76–83.
Köhler, W. (1947) Gestalt psychology: An introduction to new concepts in modern psychology. Liveright.Google Scholar
Kokkinos, V., Zountsas, B., Kontogiannis, K. & Garganis, K. (2012) Epileptogenic networks in two patients with hypothalamic hamartoma. Brain Topography 25:327–31.Google Scholar
Kosslyn, S. M., Ganis, G. & Thompson, W. L. (2001) Neural foundations of imagery. Nature Reviews Neuroscience 2:635–42.Google Scholar
Kosslyn, S. M., Thompson, W. L. & Ganis, G. (2006) The case for mental imagery. Oxford University Press.Google Scholar
Kouider, S. & Dupoux, E. (2004) Partial awareness creates the "illusion" of subliminal semantic priming. Psychological Science 15:75–81.
Krauzlis, R. J., Bollimunta, A., Arcizet, F. & Wang, L. (2014) Attention as an effect not a cause. Trends in Cognitive Sciences 18:457–64.Google Scholar
Kriegel, U. (2007) A cross-order integration hypothesis for the neural correlate of consciousness. Consciousness and Cognition 16:897–912.
Kutlu, G., Bilir, E., Erdem, A., Gomceli, Y. B., Kurt, G. S. & Serdaroglu, A. (2005) Hush sign: A new clinical sign in temporal lobe epilepsy. Epilepsy and Behavior 6:452–55.Google Scholar
Lamme, V. A. (2001) Blindsight: The role of feedback and feedforward cortical connections. Acta Psychologica 107:209–28.Google Scholar
Lamme, V. A. F. & Spekreijse, H. (2000) Modulations of primary visual cortex activity representing attentive and conscious scene perception. Frontiers in Bioscience 5:D232–43.Google Scholar
Långsjö, J. W., Alkire, M. T., Kaskinoro, K., Hayama, H., Maksimow, A., Kaisti, K. K., Aalto, S., Aantaa, R., Jääskeläinen, S. K., Revonsuo, A. & Scheinin, H. (2012) Returning from oblivion: Imaging the neural core of consciousness. The Journal of Neuroscience 32:4935–43.Google Scholar
Lashley, K. S. (1942) The problem of cerebral organization in vision. In: Visual mechanisms: Biological symposia, vol. 7, ed. Kluver, H., pp. 301–22. Cattell Press.Google Scholar
Lashley, K. S. (1951) The problem of serial order in behavior. In: Cerebral mechanisms in behavior: The Hixon symposium, ed. Jeffress, L. A., pp. 112–46. Wiley.Google Scholar
Lashley, K. S. (1956) Cerebral organization and behavior. Proceedings of the Association for Research in Nervous and Mental Diseases 36:1–18.
Lau, H. C. (2008) A higher-order Bayesian decision theory of consciousness. Progress in Brain Research 168:35–48.
Lau, H. C. (2009) Volition and the functions of consciousness. In: The Cognitive Neurosciences IV, ed. Gazzaniga, M., pp. 1191–200. MIT Press.Google Scholar
Lau, H. C., Rogers, R. D. & Passingham, R. E. (2007) Manipulating the experienced onset of intention after action execution. Journal of Cognitive Neuroscience 19:81–90.
Laureys, S. (2005) The neural correlate of (un)awareness: Lessons from the vegetative state. Trends in Cognitive Sciences 12:556–59.Google Scholar
Lawless, H. T. (1997) Olfactory psychophysics. In: Tasting and Smelling: Handbook of Perception and Cognition, ed. Beauchamp, G. K. & Bartoshuk, L., pp. 125–74. Academic Press.Google Scholar
LeDoux, J. E. (1996) The emotional brain: The mysterious underpinnings of emotional life. Simon and Schuster.Google Scholar
Lee, U., Kim, S., Noh, G. J., Choi, B. M., Hwang, E. & Mashour, G. (2009) The directionality and functional organization of frontoparietal connectivity during consciousness and anesthesia in humans. Consciousness and Cognition 18:1069–78.Google Scholar
Lehar, S. (2003) Gestalt isomorphism and the primacy of the subjective conscious experience: A Gestalt bubble model. Behavioral and Brain Sciences 26:375–44.Google Scholar
Levelt, W. J. M. (1989) Speaking: From intention to articulation. MIT Press.Google Scholar
Lewin, K. (1935) A dynamic theory of personality. McGraw-Hill.Google Scholar
Lewis, L. D., Weiner, V. S., Mukamel, E. A., Donoghue, J. A., Eskandar, E. N., Madsen, J. R., Anderson, W., Hochberg, L. R., Cash, S. S., Brown, E. N. & Purdon, P. L. (2012) Rapid fragmentation of neuronal networks at the onset of propofol-induced unconsciousness. Proceedings of the National Academy of Sciences USA 109(49):E3377–86.Google Scholar
Lhermitte, F. (1983) “Utilization behavior” and its relation to lesions of the frontal lobes. Brain 106:237–55.Google Scholar
Li, W., Lopez, L., Osher, J., Howard, J. D., Parrish, T. B. & Gottfried, J. A. (2010) Right orbitofrontal cortex mediates conscious olfactory perception. Psychological Science 21:1454–63.Google Scholar
Libet, B. (2004) Mind time: The temporal factor in consciousness. Harvard University Press.Google Scholar
Lieberman, M. D. (2007) The X- and C-systems: The neural basis of automatic and controlled social cognition. In: Fundamentals of social neuroscience, ed. Harmon-Jones, E. & Winkielman, P., pp. 290–315. Guilford.
Liu, G., Chua, R. & Enns, J. T. (2008) Attention for perception and action: Task interference for action planning, but not for online control. Experimental Brain Research 185:709–17.Google Scholar
Liu, Y., Paradis, A.-L., Yahia-Cherif, L. & Tallon-Baudry, C. (2012) Activity in the lateral occipital cortex between 200 and 300 ms distinguishes between physically identical seen and unseen stimuli. Frontiers in Human Neuroscience 6, article 211. (Online journal).Google Scholar
Llinás, R. R. & Ribary, U. (2001) Consciousness and the brain: The thalamocortical dialogue in health and disease. Annals of the New York Academy of Sciences 929:166–75.Google Scholar
Llinás, R., Ribary, U., Contreras, D. & Pedroarena, C. (1998) The neuronal basis for consciousness. Philosophical Transactions of the Royal Society of London: B 353:1841–49.Google Scholar
Logan, G. D., Yamaguchi, M., Schall, J. D. & Palmeri, T. J. (2015) Inhibitory control in mind and brain 2.0: Blocked-input models of saccadic countermanding. Psychological Review 122:115–47.Google Scholar
Logothetis, N. K. & Schall, J. D. (1989) Neuronal correlates of subjective visual perception. Science 245:761–62.Google Scholar
Lorenz, K. (1963) On aggression. Harcourt, Brace & World.Google Scholar
Lotze, R. H. (1852) Medizinische Psychologie oder Physiologie der Seele. Weidmann'sche Buchhandlung.Google Scholar
Lucas, M. (2000) Semantic priming without association: A meta-analytic review. Psychonomic Bulletin and Review 7:618–30.Google Scholar
Macphail, E. M. (1998) The evolution of consciousness. Oxford University Press.Google Scholar
Mainland, J. D. & Sobel, N. (2006) The sniff is part of the olfactory percept. Chemical Senses 31:181–96.Google Scholar
Mak, Y. E., Simmons, K. B., Gitelman, D. R. & Small, D. M. (2005) Taste and olfactory intensity perception changes following left insular stroke. Behavioral Neuroscience 119:1693–700.Google Scholar
Marchetti, C. & Della Sala, S. (1998) Disentangling the alien and anarchic hand. Cognitive Neuropsychiatry 3:191–207.
Marcus, G. (2008) Kluge: The haphazard construction of the mind. Houghton Mifflin.Google Scholar
Markman, A. B. (1999) Knowledge representation. Erlbaum.Google Scholar
Markowitsch, H. J. (1982) Thalamic mediodorsal nucleus and memory: A critical evaluation of studies in animals and man. Neuroscience and Biobehavioral Reviews 6:351–80.Google Scholar
Martin, G. N. (2013) The neuropsychology of smell and taste. Psychology Press.Google Scholar
Masicampo, E. J. & Baumeister, R. F. (2013) Conscious thought does not guide moment-to-moment actions – it serves social and cultural functions. Frontiers in Psychology 4, article 478. (Online journal). doi:10.3389/fpsyg.2013.00478.Google Scholar
Mathalon, D. H. & Ford, J. M. (2008) Corollary discharge dysfunction in schizophrenia: Evidence for an elemental deficit. Clinical EEG and Neuroscience 39:82–86.
Mattler, U. (2005) Flanker effects on motor output and the late-level response activation hypothesis. Quarterly Journal of Experimental Psychology 58A:577–601.
McClelland, J. L. (1979) On the time-relations of mental processes: An examination of systems of processes in cascade. Psychological Review 86:287–330.
McFarland, D. J. & Sibly, R. M. (1975) The behavioural final common path. Philosophical Transactions of the Royal Society (London) 270:265–93.Google Scholar
McGurk, H. & MacDonald, J. (1976) Hearing lips and seeing voices. Nature 264:746–48.Google Scholar
McKay, L. C., Evans, K. C., Frackowiak, R. S. J. & Corfield, D. R. (2003) Neural correlates of voluntary breathing in humans determined using functional magnetic resonance imaging. Journal of Applied Physiology 95:1170–78.Google Scholar
Meador, K. J., Ray, P. G., Echauz, J. R., Loring, D. W. & Vachtsevanos, G. J. (2002) Gamma coherence and conscious perception. Neurology 59:847–54.Google Scholar
Meier, B. P., Robinson, M. D., Crawford, L. E. & Ahlvers, W. J. (2007) When “light” and “dark” thoughts become light and dark responses: Affect biases brightness judgments. Emotion 7:366–76.Google Scholar
Melcher, T., Weidema, M., Eenshuistra, R. M., Hommel, B. & Gruber, O. (2008) The neural substrate of the ideomotor principle: An event-related fMRI analysis. NeuroImage 39:1274–88.Google Scholar
Melcher, T., Winter, D., Hommel, B., Pfister, R., Dechent, P. & Gruber, O. (2013) The neural substrate of the ideomotor principle revisited: Evidence for asymmetries in action-effect learning. Neuroscience 231:13–27.
Merker, B. (2007) Consciousness without a cerebral cortex: A challenge for neuroscience and medicine. Behavioral and Brain Sciences 30(1):63–81; discussion 81–134.
Merker, B. (2012) From probabilities to percepts: A subcortical "global best estimate buffer" as locus of phenomenal experience. In: Being in time: Dynamical models of phenomenal experience, ed. Shimon, E., Tomer, F. & Zach, N., pp. 37–80. John Benjamins.
Merker, B. (2013b) Cortical gamma oscillations: The functional key is activation, not cognition. Neuroscience and Biobehavioral Reviews 37:401–17.Google Scholar
Merker, B. (2013c) The efference cascade, consciousness, and its self: Naturalizing the first person pivot of action control. Frontiers in Psychology 4, article 501:1–20. (Online journal). doi:10.3389/fpsyg.2013.00501.
Merrick, M. C., Godwin, C. A., Geisler, M. W. & Morsella, E. (2014) The olfactory system as the gateway to the neural correlates of consciousness. Frontiers in Psychology 4, article 1011. (Online journal). doi:10.3389/fpsyg.2013.01011.
Metcalfe, J. & Mischel, W. (1999) A hot/cool-system analysis of delay of gratification: Dynamics of willpower. Psychological Review 106:3–19.
Metzinger, T. (2000) Neural correlates of consciousness. MIT Press.Google Scholar
Miall, R. C. (2003) Connecting mirror neuron and forward models. Neuroreport 14:13.Google Scholar
Miller, E. K. (2000) The prefrontal cortex and cognitive control. Nature Reviews Neuroscience 1:59–65.
Miller, G. A. (1962) Psychology: The science of mental life. Adams, Bannister, & Cox.Google Scholar
Miller, N. E. (1959) Liberalization of basic S-R concepts: Extensions to conflict behavior, motivation, and social learning. In: Psychology: A study of a science, vol. 2, ed. Koch, S., pp. 196–292. McGraw-Hill.
Milner, A. D. & Goodale, M. (1995) The visual brain in action. Oxford University Press.Google Scholar
Milner, B. (1966) Amnesia following operation on the temporal lobes. In: Amnesia, ed. Whitty, C. W. M. & Zangwill, O. L., pp. 109–33. Butterworths.Google Scholar
Mitchell, A. S., Baxter, M. G. & Gaffan, D. (2007) Dissociable performance on scene learning and strategy implementation after lesions to magnocellular mediodorsal thalamic nucleus. The Journal of Neuroscience 27:11888–95.Google Scholar
Mizobuchi, M., Ito, N., Tanaka, C., Sako, K., Sumi, Y. & Sasaki, T. (1999) Unidirectional olfactory hallucination associated with ipsilateral unruptured intracranial aneurysm. Epilepsia 40:516–19.Google Scholar
Molapour, T., Berger, C. C. & Morsella, E. (2011) Did I read or did I name? Process blindness from congruent processing "outputs." Consciousness and Cognition 20:1776–80.
Morsella, E. (2005) The function of phenomenal states: Supramodular interaction theory. Psychological Review 112:1000–21.Google Scholar
Morsella, E. & Bargh, J. A. (2007) Supracortical consciousness: Insights from temporal dynamics, processing-content, and olfaction. Behavioral and Brain Sciences 30:100.Google Scholar
Morsella, E. & Bargh, J. A. (2010a) Unconscious mind. In: The Corsini encyclopedia of psychology and behavioral science, vol. 4, 4th edition, ed. Weiner, I. B. & Craighead, W. E., pp. 1817–19. Wiley.Google Scholar
Morsella, E. & Bargh, J. A. (2010b) What is an output? Psychological Inquiry 21:354–70.Google Scholar
Morsella, E. & Bargh, J. A. (2011) Unconscious action tendencies: Sources of “un-integrated” action. In: The handbook of social neuroscience, ed. Cacioppo, J. T. & Decety, J., pp. 335–47. Oxford University Press.Google Scholar
Morsella, E., Berger, C. C. & Krieger, S. C. (2011) Cognitive and neural components of the phenomenology of agency. Neurocase 17:209–30.Google Scholar
Morsella, E., Gray, J. R., Krieger, S. C. & Bargh, J. A. (2009a) The essence of conscious conflict: Subjective effects of sustaining incompatible intentions. Emotion 9:717–28.Google Scholar
Morsella, E., Krieger, S. C. & Bargh, J. A. (2010) Minimal neuroanatomy for a conscious brain: Homing in on the networks constituting consciousness. Neural Networks 23:14–15.
Morsella, E., Lanska, M., Berger, C. C. & Gazzaley, A. (2009b) Indirect cognitive control through top-down activation of perceptual symbols. European Journal of Social Psychology 39:1173–77.Google Scholar
Morsella, E. & Poehlman, T. A. (2013) The inevitable contrast: Conscious versus unconscious processes in action control. Frontiers in Psychology 4, article 590. (Online journal). doi:10.3389/fpsyg.2013.00590.
Morsella, E., Wilson, L. E., Berger, C. C., Honhongva, M., Gazzaley, A. & Bargh, J. A. (2009c) Subjective aspects of cognitive control at different stages of processing. Attention, Perception and Psychophysics 71:1807–24.Google Scholar
Mudrik, L., Faivre, N. & Koch, C. (2014) Information integration without awareness. Trends in Cognitive Sciences 18(9):488–96.Google Scholar
Müller, J. (1843) Elements of physiology. Lea and Blanchard.Google Scholar
Muzur, A., Pace-Schott, E. F. & Hobson, J. A. (2002) The prefrontal cortex in sleep. Trends in Cognitive Sciences 6:475–81.Google Scholar
Obhi, S., Planetta, P. & Scantlebury, J. (2009) On the signals underlying conscious awareness of action. Cognition 110:65–73.
Öhman, A., Carlsson, K., Lundqvist, D. & Ingvar, M. (2007) On the unconscious subcortical origin of human fear. Physiology and Behavior 92:180–85.Google Scholar
Öhman, A. & Mineka, S. (2001) Fears, phobias, and preparedness: Toward an evolved module of fear and fear learning. Psychological Review 108:483–522.
Ojemann, G. (1986) Brain mechanisms for consciousness and conscious experience. Canadian Psychology 27:158–68.Google Scholar
Olsson, A. & Phelps, E. A. (2004) Learned fear of “unseen” faces after Pavlovian, observational, and instructed fear. Psychological Science 15:822–28.Google Scholar
Ortinski, P. & Meador, K. J. (2004) Neuronal mechanisms of conscious awareness. Archives of Neurology 61:1017–20.Google Scholar
O'Shea, R. P. & Corballis, P. M. (2005) Visual grouping on binocular rivalry in a split-brain observer. Vision Research 45:247–61.Google Scholar
Pacherie, E. (2008) The phenomenology of action: A conceptual framework. Cognition 107:179217.Google Scholar
Panagiotaropolous, T. I., Deco, G., Kapoor, V. & Logothetis, N. K. (2012) Neuronal discharges and gamma oscillations explicitly reflect visual consciousness in the lateral prefrontal cortex. Neuron 74:924–35.Google Scholar
Panagiotaropoulos, T. I., Kapoor, V. & Logothetis, N. K. (2013) Desynchronization and rebound of beta oscillations during conscious and unconscious local neuronal processing in the macaque lateral prefrontal cortex. Frontiers in Psychology 4, article 603. (Online journal). doi:10.3389/fpsyg.2013.00603.
Panksepp, J. (1998) Affective neuroscience: The foundations of human and animal emotions. Oxford University Press.
Panksepp, J. (2007) Emotional feelings originate below the neocortex: Toward a biology of the soul. Behavioral and Brain Sciences 30:101–103.
Pascual-Leone, A. & Walsh, V. (2001) Fast backprojections from the motion to the primary visual area are necessary for visual awareness. Science 292:510–12.
Penfield, W. & Jasper, H. H. (1954) Epilepsy and the functional anatomy of the human brain. Little, Brown.
Penfield, W. & Roberts, L. (1959) Speech and brain mechanisms. Princeton University Press.
Philippi, C. L., Feinstein, J. S., Khalsa, S. S., Damasio, A., Tranel, D., Landini, G., Williford, K. & Rudrauf, D. (2012) Preserved self-awareness following extensive bilateral brain damage to the insula, anterior cingulate, and medial prefrontal cortices. PLoS ONE 7:e38413.
Pilon, M. & Sullivan, S. J. (1996) Motor profile of patients in minimally responsive and persistent vegetative states. Brain Injury 10:421–37.
Pinker, S. (1997) How the mind works. Norton.
Plailly, J., Howard, J. D., Gitelman, D. R. & Gottfried, J. A. (2008) Attention to odor modulates thalamocortical connectivity in the human brain. Journal of Neuroscience 28:5257–67. doi:10.1523/JNEUROSCI.5607-07.2008.
Poehlman, T. A., Jantz, T. K. & Morsella, E. (2012) Adaptive skeletal muscle action requires anticipation and “conscious broadcasting.” Frontiers in Cognition 3, article 369. (Online journal). doi:10.3389/fpsyg.2012.00369.
Postle, B. R. (2009) The hippocampus, memory, and consciousness. In: The neurology of consciousness: Cognitive neuroscience and neuropathology, ed. Laureys, S. & Tononi, G., pp. 326–38. Academic Press.
Price, J. L. (1985) Beyond the primary olfactory cortex: Olfactory-related areas in the neocortex, thalamus, and hypothalamus. Chemical Senses 10:239–85.
Price, J. L., Carmichael, S. T., Carnes, K. M., Clugnet, M.-C., Kuroda, M. & Ray, J. P. (1991) Olfactory input to the prefrontal cortex. In: Olfaction: A model system for computational neuroscience, ed. Davis, J. L. & Eichenbaum, H., pp. 101–20. MIT Press.
Prinz, J. (2007) The intermediate level theory of consciousness. In: The Blackwell companion to consciousness, ed. Velmans, M. & Schneider, S., pp. 248–60. Blackwell.
Prinz, W. (2003b) How do we know about our own actions? In: Voluntary action: Brains, minds, and sociality, ed. Maasen, S., Prinz, W. & Roth, G., pp. 21–33. Oxford University Press.
Prinz, W. (2012) Open minds: The social making of agency and intentionality. MIT Press.
Prinz, W., Aschersleben, G. & Koch, I. (2009) Cognition and action. In: Oxford handbook of human action, ed. Morsella, E., Bargh, J. A. & Gollwitzer, P. M., pp. 35–71. Oxford University Press.
Proctor, R. W. & Vu, K.-P. L. (2010) Action selection. In: The Corsini encyclopedia of psychology, vol. 1, ed. Weiner, I. B. & Craighead, E., pp. 20–22. Wiley.
Pylyshyn, Z. W. (1984) Computation and cognition: Toward a foundation for cognitive science. MIT Press.
Pylyshyn, Z. W. (1999) Is vision continuous with cognition? The case for cognitive impenetrability of visual perception. Behavioral and Brain Sciences 22:341–423.
Rakison, D. H. & Derringer, J. L. (2008) Do infants possess an evolved spider-detection mechanism? Cognition 107:381–93.
Rizzolatti, G., Fogassi, L. & Gallese, V. (2004) Cortical mechanisms subserving object grasping, action understanding, and imitation. In: The cognitive neurosciences III, ed. Gazzaniga, M. S., pp. 427–40. MIT Press.
Rizzolatti, G., Sinigaglia, C. & Anderson, F. (2008) Mirrors in the brain: How our minds share actions, emotions, and experience. Oxford University Press.
Robertson, L. C. (2003) Binding, spatial attention and perceptual awareness. Nature Reviews: Neuroscience 4:93–102.
Roe, A. & Simpson, G. G. (1958) Behavior and evolution. Yale University Press.
Rolls, E. T., Judge, S. J. & Sanghera, M. (1977) Activity of neurons in the inferotemporal cortex of the alert monkey. Brain Research 130:229–38.
Rolls, E. T. & Treves, A. (1998) Neural networks and brain function. Oxford University Press.
Rosenbaum, D. A. (2002) Motor control. In: Stevens' handbook of experimental psychology: Vol. 1. Sensation and perception, 3rd edition, ed. Yantis, S., pp. 315–39. Wiley.
Rosenbaum, D. A. (2005) The Cinderella of psychology: The neglect of motor control in the science of mental life and behavior. American Psychologist 60:308–17.
Roser, M. & Gazzaniga, M. S. (2004) Automatic brains—interpretive minds. Current Directions in Psychological Science 13:56–59.
Rossetti, Y. (2001) Implicit perception in action: Short-lived motor representation of space. In: Finding consciousness in the brain: A neurocognitive approach, ed. Grossenbacher, P. G., pp. 133–81. John Benjamins.
Safavi, S., Kapoor, V., Logothetis, N. K. & Panagiotaropoulos, T. I. (2014) Is the frontal lobe involved in conscious perception? Frontiers in Psychology 5, article 1063. (Online journal). doi:10.3389/fpsyg.2014.01063.
Schacter, D. L. (1996) Searching for memory: The brain, the mind, and the past. Basic Books.
Schmahmann, J. D. (1998) Dysmetria of thought: Clinical consequences of cerebellar dysfunction on cognition and affect. Trends in Cognitive Sciences 2:362–71.
Schoenbaum, G. & Eichenbaum, H. (1995) Information coding in the rodent prefrontal cortex: I. Single-neuron activity in orbitofrontal cortex compared with that in pyriform cortex. Journal of Neurophysiology 74:733–50.
Scholl, B. J. (2001) Objects and attention: The state of the art. Cognition 80:9–17.
Schopenhauer, A. (1818/1819) The world as will and representation, vol. 1. Dover. (Original work published in 1818).
Schröter, M., Spoormaker, V., Schorer, A., Wohlschläger, A., Czisch, M., Kochs, E., Zimmer, C., Hemmer, B., Schneider, G., Jordan, D. & Ilg, R. (2012) Spatiotemporal reconfiguration of large-scale brain functional networks during propofol-induced loss of consciousness. Journal of Neuroscience 32:12832–40.
Schrouff, J., Perlbarg, V., Boly, M., Marrelec, G., Boveroux, P., Vanhaudenhuyse, A., Bruno, M. A., Laureys, S., Phillips, C., Pélégrini-Isaac, M., Maquet, P. & Benali, H. (2011) Brain functional integration decreases during propofol-induced loss of consciousness. NeuroImage 57:198–205.
Schubert, R., Blankenburg, F., Lemm, S., Villringer, A. & Curio, G. (2006) Now you feel it – now you don't: ERP correlates of somatosensory awareness. Psychophysiology 43:31–40.
Scott, M. (2013) Corollary discharge provides the sensory content of inner speech. Psychological Science 24:1824–30.
Searle, J. R. (2000) Consciousness. Annual Review of Neuroscience 23:557–78.
Sela, L., Sacher, Y., Serfaty, C., Yeshurun, Y., Soroker, N. & Sobel, N. (2009) Spared and impaired olfactory abilities after thalamic lesions. Journal of Neuroscience 29:12059–69.
Sergent, C. & Dehaene, S. (2004) Is consciousness a gradual phenomenon? Evidence for an all-or-none bifurcation during the attentional blink. Psychological Science 15:720–28.
Seth, A. K. (2007) The functional utility of consciousness depends on content as well as on state. Behavioral and Brain Sciences 30(1):106.
Sheerer, E. (1984) Motor theories of cognitive structure: A historical review. In: Cognition and motor processes, ed. Prinz, W. & Sanders, A. F., pp. 77–98. Springer-Verlag.
Shepherd, G. M. (2007) Perspectives on olfactory processing, conscious perception, and orbitofrontal cortex. Annals of the New York Academy of Sciences 1121:87–101.
Shepherd, G. M. & Greer, C. A. (1998) Olfactory bulb. In: The synaptic organization of the brain, 4th edition, ed. Shepherd, G. M., pp. 159–204. Oxford University Press.
Sherman, S. M. & Guillery, R. W. (2006) Exploring the thalamus and its role in cortical function. MIT Press.
Sherrington, C. S. (1906) The integrative action of the nervous system. Yale University Press.
Simpson, G. G. (1949) The meaning of evolution. Yale University Press.
Singer, W. (2011) Consciousness and neuronal synchronization. In: The neurology of consciousness, ed. Laureys, S. & Tononi, G., pp. 43–52. Academic Press.
Skinner, B. F. (1953) Science and human behavior. Macmillan.
Slevc, L. R. & Ferreira, V. S. (2006) Halting in single word production: A test of the perceptual loop theory of speech monitoring. Journal of Memory and Language 54:515–40.
Slotnick, B. M. & Risser, J. M. (1990) Odor memory and odor learning in rats with lesions of the lateral olfactory tract and mediodorsal thalamic nucleus. Brain Research 529:23–29.
Spear, N. E. & Campbell, B. A. (1979) Ontogeny of learning and memory. Erlbaum.
Sperry, R. W. (1952) Neurology and the mind-brain problem. American Scientist 40:291–312.
Squire, L. R. (1987) Memory and brain. Oxford University Press.
Srinivasan, R., Russell, D. P., Edelman, G. M. & Tononi, G. (1999) Increased synchronization of neuromagnetic responses during conscious perception. The Journal of Neuroscience 19:5435–48.
Stefanucci, J. K. & Geuss, M. N. (2009) Big people, little world: The body influences size perception. Perception 38:1782–95.
Stevenson, R. J. (2009) Phenomenal and access consciousness in olfaction. Consciousness and Cognition 18:1004–17. doi:10.1016/j.concog.2009.09.005.
Stroop, J. R. (1935) Studies of interference in serial verbal reactions. Journal of Experimental Psychology 18:643–62.
Stuss, D. T. & Anderson, V. (2004) The frontal lobes and theory of mind: Developmental concepts from adult focal lesion research. Brain & Cognition 55:69–83.
Suhler, C. L. & Churchland, P. S. (2009) Control: Conscious and otherwise. Trends in Cognitive Sciences 13:341–47.
Suzuki, T., Itoh, S., Arai, N., Kouno, M., Noguchi, M., Takatsu, M. & Takeda, K. (2012) Ambient echolalia in a patient with germinoma around the bilateral ventriculus lateralis: A case report. Neurocase 18:330–35.
Synofzik, M., Vosgerau, G. & Newen, A. (2008) I move, therefore I am: A new theoretical framework to investigate agency and ownership. Consciousness and Cognition 17:411–24.
Tallon-Baudry, C. (2012) On the neural mechanisms subserving attention and consciousness. Frontiers in Psychology 2, article 397. (Online journal). doi:10.3389/fpsyg.2011.00397.
Tanaka, Y., Miyazawa, Y., Akaoka, F. & Yamada, T. (1997) Amnesia following damage to the mammillary bodies. Neurology 48:160–65.
Taylor, J. A. & Ivry, R. B. (2013) Implicit and explicit processes in motor learning. In: Action science, ed. Prinz, W., Beisert, M. & Herwig, A., pp. 63–87. MIT Press.
Taylor, J. L. & McCloskey, D. I. (1990) Triggering of preprogrammed movements as reactions to masked stimuli. Journal of Neurophysiology 63:439–46.
Taylor, J. L. & McCloskey, D. I. (1996) Selection of motor responses on the basis of unperceived stimuli. Experimental Brain Research 110:62–66.
Thagard, P. & Stewart, T. C. (2014) Two theories of consciousness: Semantic pointer competition vs. information integration. Consciousness and Cognition 30:73–90.
Tham, W. W. P., Stevenson, R. J. & Miller, L. A. (2009) The functional role of the mediodorsal thalamic nucleus in olfaction. Brain Research Reviews 62:109–26.
Tham, W. W. P., Stevenson, R. J. & Miller, L. A. (2011) The role of the mediodorsal thalamic nucleus in human olfaction. Neurocase 17:148–59.
Tong, F. (2003) Primary visual cortex and visual awareness. Nature Reviews Neuroscience 4:219–29.
Tononi, G. (2012) Phi: A voyage from the brain to the soul. Pantheon.
Tononi, G. & Edelman, G. M. (1998) Consciousness and complexity. Science 282(5395):1846–51.
Tranel, D. & Welsh-Bohmer, K. A. (2012) Pervasive olfactory impairment after bilateral limbic system destruction. Journal of Clinical and Experimental Neuropsychology 34:117–25.
Tsushima, Y., Sasaki, Y. & Watanabe, T. (2006) Greater disruption due to failure of inhibitory control on an ambiguous distractor. Science 314:1786–88.
Tye, M. (1999) Phenomenal consciousness: The explanatory gap as cognitive illusion. Mind 108:705–25.
Uhlhaas, P. J., Pipa, G., Lima, B., Melloni, L., Neuenschwander, S., Nikolić, D. & Singer, W. (2009) Neural synchrony in cortical networks: History, concept and current status. Frontiers in Integrative Neuroscience 3, article 17. (Online journal). doi:10.3389/neuro.07.017.2009.
van Gaal, S., Ridderinkhof, K. R., Fahrenfort, J. J., Scholte, H. S. & Lamme, V. A. F. (2008) Frontal cortex mediates unconsciously triggered inhibitory control. Journal of Neuroscience 28:8053–62.
van Veen, V., Cohen, J. D., Botvinick, M. M., Stenger, V. A. & Carter, C. S. (2001) Anterior cingulate cortex, conflict monitoring, and levels of processing. NeuroImage 14:1302–308.
Varela, F., Lachaux, J. P., Rodriguez, E. & Martinerie, J. (2001) The brainweb: Phase synchronization and large-scale integration. Nature Reviews Neuroscience 2:229–39.
Velly, L., Rey, M., Bruder, N., Gouvitsos, F., Witjas, T., Regis, J. M., Peragut, J. C. & Gouin, F. (2007) Differential dynamic of action on cortical and subcortical structures of anesthetic agents during induction of anesthesia. Anesthesiology 107:202–12.
Velmans, M. (1991) Is human information processing conscious? Behavioral and Brain Sciences 14(4):651–69.
Vierkant, T. (2013) Managerial control and free mental agency. In: Decomposing the will, ed. Clark, A., Kiverstein, J. & Vierkant, T., pp. 285–97. Oxford University Press.
Voss, M. (2011) Not the mystery it used to be: Theme program: Consciousness. APS Observer 24(6), July/August 2011 edition. Available at: http://www.psychologicalscience.org
Vroomen, J. & de Gelder, B. (2003) Visual motion influences the contingent auditory motion aftereffect. Psychological Science 14:357–61.
Ward, L. M. (2011) The thalamic dynamic core theory of conscious experience. Consciousness and Cognition 20:464–86.
Wegner, D. M. (2002) The illusion of conscious will. MIT Press.
Weiskrantz, L. (1992) Unconscious vision: The strange phenomenon of blindsight. The Sciences 35:23–28.
Weiskrantz, L. (1997) Consciousness lost and found: A neuropsychological exploration. Oxford University Press.
Wessel, J. R., Haider, H. & Rose, M. (2012) The transition from implicit to explicit representations in incidental learning situations: More evidence from high-frequency EEG coupling. Experimental Brain Research 217:153–62.
Westwood, D. A. (2009) The visual control of object manipulation. In: Oxford handbook of human action, ed. Morsella, E., Bargh, J. A. & Gollwitzer, P. M., pp. 88–103. Oxford University Press.
Wolford, G., Miller, M. B. & Gazzaniga, M. S. (2004) Split decisions. In: The cognitive neurosciences III, ed. Gazzaniga, M. S., pp. 1189–99. MIT Press.
Woodworth, R. S. (1915) A revision of imageless thought. Psychological Review 22:1–27.
Wundt, W. (1902/1904) Principles of physiological psychology, trans. Titchener, E. B. Sonnenschein. (Original work published in 1902; English translation from the 5th German edition [1904]).
Yamadori, A. (1997) Body awareness and its disorders. In: Cognition, computation, and consciousness, ed. Ito, M., Miyashita, Y. & Rolls, E. T., pp. 169–76. American Psychological Association.
Yates, J. (1985) The content of awareness is a model of the world. Psychological Review 92:249–84.
Zatorre, R. J. & Jones-Gotman, M. (1991) Human olfactory discrimination after unilateral frontal or temporal lobectomy. Brain 114:71–84.
Zeki, S. & Bartels, A. (1999) Toward a theory of visual consciousness. Consciousness and Cognition 8:225–59.
Zhang, W. & Rosenbaum, D. A. (2008) Planning for manual positioning: The end-state comfort effect for abduction-adduction of the hand. Experimental Brain Research 184:383–89.
Figure 1. The divisions of the nervous system and the place of consciousness within the system (based on Poehlman et al. 2012), including the major divisions of the Somatic and Autonomic systems. Afference binding within systems can be unconscious. Although response systems can influence action directly, as in the case of unintegrated actions, only in virtue of consciousness can multiple response systems influence action collectively, as when one holds one's breath while underwater.