Revisiting Chomsky’s Critique of the Mind-Body Problem

It has been almost a year since I finished my Chomsky paper under Paul Pietroski’s supervision. I didn’t think very much about the paper until David King read it and praised it as an excellent and novel piece of work; he encouraged me to submit it to a peer-reviewed journal. So I submitted it to Mind & Language and then to Philosophical Papers (not at the same time), but both rejected it. I then submitted it to Philosophia, and I have yet to hear back from its editor. Since it is very difficult to get a paper published in any philosophy journal, I’m not very optimistic about its chances. Moreover, I only cited five sources in the entire paper, a sign that I’m not engaging with enough of the contemporary literature. I could have used feedback from many more people, but in the end I only received comments from Paul Pietroski and David King.

While I deeply appreciate David King and Paul Pietroski’s support, I do think I should have made some big changes to my paper. For one, I wish I had included Ray Jackendoff’s distinctions between the mind-brain problem, the brain-computation problem, and the mind-computation problem from his book Consciousness and the Computational Mind (1987). Jackendoff thought there were several layers to the mind-body problem. First, there’s the brain-computation problem, which asks how the brain implements computational processes. Second, there’s the mind-computation problem, which asks how the mind relates to computational processes. Lastly, there’s the mind-brain problem, which asks how the mind is realized in the brain via computational processes. If the first two problems are resolved, then the last problem could be resolved too. How is this relevant to my attempt to reformulate the mind-body problem? I think the mind-body problem can be recast as the “mind-computation” problem in Jackendoff’s terminology: instead of a very broad “mind-body problem,” we can reformulate it more specifically as the “mind-computation problem.”

By using Ray Jackendoff’s distinctions to reformulate the mind-body problem, I could have argued that the reformulated mind-computation problem avoids Chomsky’s critique. Here’s a recap of that critique: Chomsky denies that there is any real mind-body problem of the sort discussed by philosophers, because nobody has defined exactly what we mean by the “body” (Chomsky, 2000). Whereas Descartes’ Mechanical Philosophy defined the physical in terms of mechanistic automata, contemporary Physicalism lacks a coherent definition of the physical.

My original solution was to argue that we can reformulate the mind-body problem as the mind-computation problem within the framework of the Computational Theory of Mind, specifically Fodor’s Language of Thought or Classical Computationalism. Under this framework, we would come across another philosophical problem that Fodor worries about: the problem of abduction, or of global properties (Fodor, 2000, The Mind Doesn’t Work That Way). However, I currently think this is somewhat misguided, because there are many competing definitions of “computation” in philosophy of mind and cognitive science. Gualtiero Piccinini surveys many different computational theories of mind (Piccinini, 2009), including his own, the mechanistic view (Piccinini, 2008).

In light of Piccinini’s work, I should have pointed out that there are many competing definitions of “computation” in philosophy of mind and cognitive science that could potentially explain the mind. Instead of trying to define the “body” or the “physical”, a very challenging task that some philosophers like Daniel Stoljar are confronting (Stoljar, 2010), we should at least define “computation” in the context of philosophy of mind and cognitive science. Unlike the terms “body” and “physical”, we already have competing coherent definitions of the term “computation” in this context.

Deciding which competing definition is correct is just one aspect of the reformulated mind-body problem. Another aspect is determining which competing computational theories can handle some of the most challenging problems about the mind: consciousness, intentionality, abduction, and others. From there I could discuss the problem of abduction and argue that it is a serious mind-computation problem. Whereas computational processes operate solely on the syntactic or local properties of each individual mental representation, abduction operates on the semantic or global properties of the whole network of mental representations. It would seem that abduction does not fit Fodor’s Classical Computational theory of mind.
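To make the contrast vivid, here is a minimal toy sketch in Python (my own illustration; the function names and the crude “coherence” score are hypothetical stand-ins, not Fodor’s formalism). A local rule like modus ponens only needs to inspect the syntactic form of two individual representations, whereas an abductive choice has to evaluate each candidate hypothesis against the entire belief set.

```python
# Toy contrast between a "local" syntactic operation and a "global" abductive one.
# (My own illustration; nothing here is Fodor's formalism.)

def modus_ponens(conditional, antecedent):
    """Local rule: inspects only the syntactic form of two representations.
    conditional is a tuple ('if', p, q); returns q when antecedent matches p."""
    tag, p, q = conditional
    return q if tag == 'if' and antecedent == p else None

def abduce(candidates, belief_set, global_score):
    """Abductive choice: picks the hypothesis that best fits the WHOLE belief set
    (e.g. by simplicity or coherence) -- a global property that no rule defined
    over single representations can see."""
    return max(candidates, key=lambda h: global_score(h, belief_set))

# Usage with a crude "coherence" score: how many beliefs mention the hypothesis.
# The scoring function is a stand-in, not a real theory of abduction.
beliefs = [('if', 'rain', 'wet grass'), ('wet grass',), ('rain', 'forecast')]
print(modus_ponens(('if', 'rain', 'wet grass'), 'rain'))   # -> 'wet grass'
print(abduce(['rain', 'sprinkler'], beliefs,
             lambda h, bs: sum(h in b for b in bs)))        # -> 'rain'
```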

Fodor’s problem of abduction is a serious problem for his computational theory of mind, but does it arise for other computational theories of mind? Fodor does argue that rival theories like Connectionism don’t explain abduction any better. I’m not going to rehearse his argument here; see his book The Mind Doesn’t Work That Way (2000). Whether or not Fodor is correct, his problem of abduction is at least analogous to the AI-complete problems (see the Wikipedia article or this article). There are specific problems in AI that are very challenging for AI researchers, such as determining what computational processes can realize natural language. A lot of these problems relate to what is called the frame problem (see the Stanford Encyclopedia of Philosophy article). These problems are not unique to Fodor’s computational theory of mind; they arise for pretty much any computational theory that tries to explain how the mind works.

These problems are part of the reformulated version of the mind-body problem, or “mind-computation” problem. So the mind-body problem can be reformulated as follows: (1) deciding which computational theory of mind best captures the appropriate definition of “computation” in the context of the mind, and (2) addressing the philosophical problems that are either common to every computational theory of mind or unique to particular ones. If we reformulate the mind-body problem in terms of (1) and (2) as the mind-computation problem, then we can avoid Chomsky’s charge that the mind-body problem is vacuous. Because neither (1) nor (2) requires us to define the term “physical”, but only the term “computation”, which has plenty of competing theories defining it coherently, Chomsky’s critique doesn’t extend to the reformulated mind-computation problem. Unlike contemporary Physicalism, which doesn’t define the physical, the computational theories of mind have their own understandings of the nature of “computation” in the context of the mind.

There are some objections I have considered. One possible objection comes from Searle. Searle could argue that the reformulated mind-body problem is misguided because it presupposes that there is at least one correct computational theory of mind, when in fact (on his view) every computational theory of mind is false. Searle could argue that his Chinese Room thought experiment (including its variations) shows that every computational theory of mind is false, so the reformulated version of the mind-body problem should be rejected. Instead, we should accept some other formulation of the mind-body problem that could just as well avoid Chomsky’s critique.

My response to that objection is that at best Searle’s Chinese Room thought experiment poses a challenge to the Classical Computational theory of mind, but not necessarily to every computational theory of mind proposed after it (e.g., Piccinini’s mechanistic account of computation and Churchland’s connectionism). Moreover, the Chinese Room thought experiment is still controversial, so appealing to it doesn’t provide sufficient grounds for rejecting the reformulated version of the mind-body problem, though it does provide a strong motive.

Another objection is that someone like Chomsky could argue that his critique extends from the term “physical” to the term “computation”. It isn’t obvious that “computation” doesn’t suffer from the same lack of a coherent or meaningful definition as “physical”. Searle, for instance, argues that if a wall’s molecular configurations are isomorphic to the formal description of the program WordStar, then the wall is a computer that implements WordStar (Searle, 2002). This is problematic because if even a wall qualifies as performing computation, then the claim “the mind is a computational system” is rendered trivial and vacuous. Consequently, the mind-computation problem would suffer from the problem of defining “computation” just as the mind-body problem suffers from the problem of defining the term “physical”.

I think this is a much better objection than the first. In fact, there is an existing literature on this problem of finding a principled way to distinguish physical systems that genuinely perform computation from those that are merely describable in computational terms. That literature is surveyed by Piccinini in his Stanford Encyclopedia of Philosophy article “Computation in Physical Systems” (Piccinini, 2010). While many philosophers such as Ned Block (Block, 2002) and David Chalmers (Chalmers, 1996) dispute Searle’s wall argument, the general problem of distinguishing physical systems that perform genuine computation from those that don’t remains a real one.

I’m not exactly sure how to approach this problem, since it is beyond my knowledge and expertise. Finding a principled basis for distinguishing a physical system that genuinely performs computation from those that are merely describable in computational terms is a very broad problem for the computational theory of mind. If any physical system whatsoever performs computation (including the Milky Way, the Sun, and so on), then the computational theory of mind states something very trivial.

While I cannot tackle the whole problem, I do think there is a response that protects the reformulated version of the mind-body problem. Suppose that every physical system performs computation of some sort (a view known as pancomputationalism). If so, then the claim “the mind performs computation” does seem quite trivial. However, it isn’t trivial to claim that the mind performs a very specific kind of computation in a particular way. Even under pancomputationalism, it wouldn’t be trivial to say that “the mind performs computation via connectionist neural networks rather than symbol manipulation.” That statement isn’t as trivial as “the mind performs computation,” because the specific claim about the nature of computation in the context of the mind could have been false: it’s logically possible that in a pancomputationalist world the mind computes via symbol manipulation rather than connectionist networks, or vice versa. Hence the mind-computation problem, and in particular aspect (1), deciding among competing theories with different definitions of computation in the context of the mind, isn’t rendered entirely trivial.
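As a toy illustration of why the specific claim is substantive (my own sketch; the functions are hypothetical and nothing here comes from the cited authors), consider two ways of realizing the same simple function, XOR: an explicit symbolic rule and a tiny fixed-weight connectionist network. Saying that something computes XOR is cheap; saying which kind of organization does the computing is not.

```python
# Toy illustration: the SAME function realized two different ways.
# (My own sketch, not drawn from any of the cited authors.)

def xor_symbolic(a, b):
    # Classical / symbol-manipulation style: an explicit rule over symbols.
    return int((a and not b) or (not a and b))

def step(x):
    # Threshold unit: fires (1) when its net input exceeds zero.
    return 1 if x > 0 else 0

def xor_network(a, b):
    # Connectionist style: a tiny feed-forward network with fixed weights.
    h_or = step(a + b - 0.5)         # fires if at least one input is on
    h_and = step(a + b - 1.5)        # fires only if both inputs are on
    return step(h_or - h_and - 0.5)  # "or, but not and" = exclusive or

# Both realize XOR, so "it computes" is a thin claim; the substantive question
# is which kind of computational organization actually does the work.
for a in (0, 1):
    for b in (0, 1):
        assert xor_symbolic(a, b) == xor_network(a, b)
```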

This isn’t the most rigorous way to write an academic philosophy paper, but it is more or less a rough sketch of what I could have written; perhaps I can rewrite my paper in a more rigorous and clear manner. If my Chomsky paper gets rejected again, I might revisit it and revise it along the lines proposed in this blog entry. I might also add more citations.