Document Type



Master of Arts



Date of Defense


Graduate Advisor

Berit Brogaard, Ph.D.


Committee Members

Gualtiero Piccinini, Ph.D.

John Brunero, Ph.D.


In this paper, I argue that all computational theories of consciousness (CTCs) fail. CTCs hold that the right kind of computation is sufficient for the instantiation of consciousness. Given the widely recognized importance of his work, I will use David J. Chalmers’ Thesis of Computational Sufficiency as a paradigm case. I will argue that it fails for a reason that can be formalized as a general problem plaguing any CTC: the medium-independent properties (MIPs) constitutive of computational processes are insufficient to instantiate the medium-dependent properties (MDPs) constitutive of consciousness. MIPs, like graphemes, are properties whose causal role (e.g., symbolic meaning) does not depend on the physical properties of the vehicles by which the relevant information is transferred (e.g., paper), whereas MDPs, like digestion, have causal roles (e.g., decomposition) that depend directly on the physical properties of the relevant vehicles (e.g., enzymes). Since computations, as abstract descriptions, are medium-independent, they must be physically implemented in order to generate MDPs. This, however, makes the properties of potential implementations central to the feasibility of instantiating consciousness in artificial systems. The problems that arise for CTC advocates are twofold: one taxonomic, one empirical. The taxonomic problem is that adverting to detailed physio-causal properties of implementation vehicles threatens to subvert the legitimacy of calling such theories “computational.” The empirical problem is this: given the necessary role of implementation properties, and the fact that functions supervene on structures, it follows that physical differences can legislate mental differences. After distinguishing weaker and stronger varieties of implementation requirements, and showing why a plausible CTC requires a thick theory of implementation, I will examine the implementation requirements for human consciousness.
Given empirical data suggesting that consciousness depends on highly specific physical properties of the brain, for which there are no known implementation surrogates, I argue that CTCs will fail to generate the relevant MDPs. I will conclude by showing why this implies that the HMSMRST is almost certainly false.