Critique of Emotional Reason
Paul Thagard
Philosophy Department
University of Waterloo
pthagard@uwaterloo.ca

Introduction


In the title of her most recent book, Susan Haack (1998) describes herself as a passionate moderate. With eloquence and verve, she ably defends the rationality of science and philosophy against a host of post-modernist critics. Haack (p. 49) approvingly quotes Peirce's descriptions of the scientific attitude as "a craving to know how things really are," "an intense desire to find things out," and "a great desire to learn the truth". But despite the passion evident in her own arguments and in her citations of Peirce, Haack's explicit epistemology is emotion-free: the word "emotion" does not appear in the index of her treatise on epistemology (Haack, 1993). She does, however, remark in her later book on the "misconception of the emotions as brutally irrational" (Haack, 1998, p. 142).


In this chapter I want to address the question of how emotions are relevant to rationality. Answering this question should further the development of theories of theoretical and practical reason that take seriously the pervasive role of emotions in human thinking. I want to reject both the classical view that emotions are primarily an impediment to reason and the romantic view that emotionality is inherently superior to reason. Instead, I develop a critical view that delineates the ways in which emotions contribute to reason as well as the ways in which emotions can impede reason. My title uses the term "critique" to mean not criticism but assessment. I will argue that emotional cognition can provide a useful supplement to Haack's "foundherentist" epistemology, and also extend her discussion of the practical question of affirmative action.


Theoretical and Practical Reason


Emotion is relevant to theoretical reason, which concerns what to believe, and to practical reason, which concerns what to do. The main kind of theoretical reason I want to discuss is what Peirce called abduction, the formation and acceptance of explanatory hypotheses. Abductive inference has two phases: the generation of hypotheses and then their evaluation, ideally leading to the acceptance of hypotheses that are part of the best explanation of the evidence. Abduction is common in everyday life, for example when I make inferences about why my car fails to start or why a friend is in a bad mood; but it is also central to scientific theorizing, for example in the generation and evaluation of hypotheses concerning the causes of disease. My aim is to lay out ways in which emotions affect the generation and evaluation of scientific hypotheses.


Practical reasoning also has two phases: the generation of possible actions and their evaluation, ideally leading to choice of those actions that best satisfy the goals relevant to the decision. For example, when members of an academic department make a hiring decision, they have to choose whom to appoint from the typically large number of candidates for the position. I will discuss how emotions affect both the generation and evaluation of actions. Finally, I will try to generalize concerning how emotions do and should affect theoretical and practical reason.


Theoretical Reason: Generation


As Peirce (1931-1958, 5.591) noted, and as Fodor (2000) recently reiterated, it is a puzzle why and how abductive inference is ever successful. There is a huge range of facts that people might try to explain, and for each fact there is a huge range of hypotheses that might explain it. Fodor throws up his hands and says that abduction is a terrible problem for cognitive science, while Peirce postulates an innate ability, derived by evolution, for forming and choosing good hypotheses. Preferable to both these rather desperate conclusions is the attempt to specify the cognitive mechanisms by which explanatory hypotheses are generated (see, for example, Thagard, 1988; Josephson and Josephson, 1994). It has gone little noticed that these mechanisms are in part emotional.


Consider first the problem of deciding what to attempt to explain. As Peirce observed, abductive inference is usually prompted by surprise, the emotional reaction that occurs when our expectations are violated. Most of the facts that we encounter are not surprising, in that they fit perfectly well with what we already believe. Occasionally, however, we encounter facts that do not fit with our beliefs, generating surprise by a process of emotional incoherence (Thagard, 2000, p. 194). Hence we substantially focus our abductive activity by narrowing it down to events that have the emotional character of being surprising. Other emotions that stimulate explanatory activity include curiosity, the emotional drive to find out answers to questions that interest us. For example, Watson and Crick had a passion to find out the structure of DNA (Thagard, forthcoming). In addition to surprise and curiosity, abductive inference can be emotionally prompted by practical need, for example when a doctor urgently tries to find a diagnosis that explains the symptoms of a seriously sick patient. Thus we do not try to explain everything, just those events and facts that we care about.


Once emotion initiates the attempt to generate explanatory hypotheses, it can also serve to focus the search for them. Scientists often get excited about particular lines of thinking and as a result pursue them intensively. As Peirce noticed, there is no way that a person could do an exhaustive search of all possible explanatory hypotheses for a set of data. Moreover, it has been proven that abduction is computationally intractable, in the sense that the time required to assemble a set of hypotheses to explain data grows exponentially with the number of propositions involved (Bylander et al., 1991), so no computer could do an exhaustive search either. However, human thinking and current computational models use heuristic techniques such as backward chaining of rules and spreading activation of concepts to narrow the search for hypotheses.
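

To make this concrete, here is a minimal sketch, in Python, of spreading activation over a small network of concepts, suggesting how activation originating in an emotionally important fact might narrow the space of candidate hypotheses. The particular links, decay rate, and threshold are invented for illustration, not taken from any published model.

    # Minimal sketch of spreading activation over a concept network.
    # The links, decay rate, and threshold are illustrative assumptions.

    DECAY = 0.5      # fraction of activation passed along each link
    THRESHOLD = 0.1  # ignore concepts activated below this level

    # Hypothetical associative links between concepts.
    links = {
        "sound": ["echo", "propagation", "wave"],
        "wave": ["water", "periodic motion"],
        "echo": ["reflection"],
    }

    def spread(sources, steps=2):
        """Spread activation outward from the source concepts."""
        activation = {c: 1.0 for c in sources}
        frontier = dict(activation)
        for _ in range(steps):
            next_frontier = {}
            for concept, act in frontier.items():
                for neighbor in links.get(concept, []):
                    passed = act * DECAY
                    if passed >= THRESHOLD:
                        activation[neighbor] = max(activation.get(neighbor, 0.0), passed)
                        next_frontier[neighbor] = passed
            frontier = next_frontier
        return activation

    # Activation starting from "sound" reaches "wave", narrowing the
    # hypothesis space toward wave-like explanations.
    print(spread(["sound"]))

On this picture, emotional tagging would amount to boosting the initial activation of the facts and concepts one cares about, so that the search stays close to them.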


I conjecture that emotion is a valuable part of heuristic search in humans. Once a fact to be explained is tagged as emotionally important, then concepts and rules relevant to explaining it can also be tagged as emotionally important. For example, when Watson and Crick were working to find the structure of DNA, they got excited about ideas such as the double helix that seemed most relevant to divining the structure. Abduction is sometimes described as having the structure:


E is to be explained.
H could cause E.
So maybe H.


This structure obscures the fact that considerable thinking goes into putting together the hypothesis H. To take a simple example, the ancient Greeks came up with the wave theory of sound to explain various properties of sound, such as propagation and echoing. The hypothesis that sound consists of waves requires the linkage of three concepts: sound, waves, and consists. For these concepts to be assembled into a proposition, they need to be simultaneously active in working memory. Emotion may help to bring potentially useful combinations into juxtaposition. Chrysippus, the Greek Stoic who first constructed the wave theory of sound, was probably puzzled by the behavior of sounds. When something about water waves caught his attention, he was able to generate the hypothesis that sound consists of waves. Emotional involvement with a concept, either moderately as interest or more passionately as excitement, focuses attention on it. When interest in one concept gives rise to interest in another relevant one, as when sound gets linked to wave, there emerges the powerful excitement that signals a discovery.


Emotions are also relevant to guiding the search for analogies that often contribute to scientific discovery (see Holyoak and Thagard, 1995, ch. 8). For example, if the fact to be explained F1 is analogous to another fact F2 that has already been explained, and if scientists have a positive emotional attitude toward the kind of hypothesis H2 that explains F2, then they may get excited about finding an analog of H2 to explain F1. Thus analogy can transfer positive emotions from one theory to another similar one that is judged to be promising (Thagard and Shelley, 2001). Analogy can also transfer negative emotions, for example when a potential research program is compared to cold fusion and thereby discouraged.


Thus emotion seems to be an important part of the generation of explanatory hypotheses, both in selecting what is to be explained and in guiding the search for useful hypotheses. On the negative side, emotions may unduly narrow the search for fruitful explanations. If scientists become obsessed with a particular kind of hypothesis, they may be blind to a very different kind of hypothesis needed to explain some particularly puzzling fact. Emotions serve a valuable cognitive function in narrowing the search for hypotheses, but like any heuristic mechanism they can take the search in undesirable directions. Misdirection can occur especially if the goals that provoke emotional excitement are personal rather than intellectual ones. If scientists get excited about a particular kind of hypothesis because it might make them rich and famous, they may neglect to search for less lucrative hypotheses that have greater explanatory potential.


Theoretical Reason: Evaluation


Someone who wants to maintain the classical position that emotion should not be part of theoretical reason might respond to the previous section as follows: Granted, emotion has a useful role to play in the context of discovery, but it must be kept out of the context of justification, which it can only distort. To be sure, there are a variety of ways that emotions can distort theoretical reason in general and abductive inference in particular, as I will shortly describe. But there is an argument consistent with Haack's epistemology that shows a crucial role for emotion in the evaluation of explanatory hypotheses.


Haack (1993) defended a powerful and plausible epistemological position she dubbed foundherentism. It combines the foundationalist insight that empirical evidence has a special role to play in justifying theories with the coherentist insight that such evidence cannot be taken as given but must be assessed with respect to overall explanatory integration. Haack uses the apt analogy of a crossword puzzle, in which there must be an integration of entries and clues, to indicate how in abductive inference there must be a coherence of hypotheses and evidence. She does not, however, provide a method or algorithm for assessing the success of such integration in particular cases. How do people judge that they have a good fit in their answers to a crossword puzzle, and how do scientists decide that one theory fits better with the evidence than its competitors?


I have developed a theory of explanatory coherence that shows how the best explanation can be efficiently computed with artificial neural networks (Thagard, 1992). I have also argued that this theory, which gives a degree of priority to empirical evidence while maintaining a coherentist perspective, implements and naturalizes Haack's foundherentism (Thagard, 2000). I will not repeat that argument here, but want to draw a conclusion concerning the role of emotions.


Suppose I am right that foundherentist abductive inference is accomplished in humans by a neural network that performs parallel constraint satisfaction in a way that maximizes explanatory coherence. People have no conscious access to this process: when you realize you prefer one theory over another, you do not really know why, although you may be able to retrace your intellectual history of acquiring the various hypotheses and evidence that contributed to your preference. All you can say is that one theory "makes more sense" to you than the other. This is not to say that inferences based on explanatory coherence are irrational, since they may well involve maximization of coherence with respect to the available hypotheses and evidence. But given the limited access to mental processes, there is no way you can directly know that you have maximized coherence.
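

To make the constraint-satisfaction picture concrete, here is a toy network in the spirit of my ECHO model (Thagard, 1992), written in Python. Hypotheses and evidence are units, explanatory relations are excitatory links, contradictions are inhibitory links, and a special unit gives evidence its priority. The particular units, weights, and parameters are invented for illustration, not the published values.

    # Toy constraint-satisfaction network in the spirit of ECHO
    # (Thagard, 1992). Units, weights, and parameters are
    # illustrative assumptions, not the published values.

    EXCIT, INHIB, DECAY = 0.04, -0.06, 0.05

    units = ["SPECIAL", "E1", "E2", "H1", "H2"]
    links = {
        ("SPECIAL", "E1"): EXCIT, ("SPECIAL", "E2"): EXCIT,  # data priority
        ("H1", "E1"): EXCIT, ("H1", "E2"): EXCIT,            # H1 explains both facts
        ("H2", "E1"): EXCIT,                                 # H2 explains only one
        ("H1", "H2"): INHIB,                                 # rival hypotheses contradict
    }

    def weight(a, b):
        """Symmetric link weight; zero if the units are unlinked."""
        return links.get((a, b)) or links.get((b, a)) or 0.0

    activation = {u: 0.01 for u in units}
    activation["SPECIAL"] = 1.0  # clamped: evidence is taken seriously

    for _ in range(200):  # update repeatedly until the network settles
        new = {}
        for u in units:
            if u == "SPECIAL":
                new[u] = 1.0  # the special unit stays clamped
                continue
            net = sum(weight(u, v) * activation[v] for v in units if v != u)
            a = activation[u] * (1 - DECAY)
            # Positive input pushes a unit toward 1, negative toward -1.
            a += net * (1 - activation[u]) if net > 0 else net * (activation[u] + 1)
            new[u] = max(-1.0, min(1.0, a))
        activation = new

    # H1, which explains more of the evidence, settles to positive
    # activation (acceptance); its rival H2 settles negative (rejection).
    print({u: round(a, 2) for u, a in activation.items()})

Nothing in this settling process is open to introspection; all that surfaces in consciousness is its outcome.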


This is where emotions are crucial. According to my recent theory of emotional coherence, the way you know that you have achieved explanatory coherence is by a feeling of happiness that emerges from the satisfaction of many constraints in your neural network (Thagard, 2000, p. 194ff.). Highly coherent theories are praised by scientists for their elegance and beauty. Because we cannot extract from our brains judgments such as "Acceptance of theory T1 satisfies .69 of the relevant constraints", we have to fall back on an overall emotional judgment such as "T1 just makes sense." Thus the feeling of happiness that emerges from a coherence judgment is part of our ability to assess scientific theories. Ideally, a good theory generates a kind of emotional gestalt that signals its coherence with the evidence and the rest of our beliefs. Negative emotions attached to a theory signal that it does not fit with our other beliefs, and a general feeling of anxiety may signal that none of the available theories is very coherent. Such anxiety may trigger a search for new hypotheses.


So a gut feeling that represents an emotional gestalt may be a valid sign of a highly coherent evaluation of competing hypotheses. The problem is that such a feeling may instead signal a different kind of coherence based on wishful thinking or motivated inference rather than fit with the evidence. I once got a letter from someone urging me to send my explanatory coherence computer program to him right away, because he had a theory of the universe that no one else believed, and my program might help him convince other people. Presumably his attachment to his theory was based more on the satisfaction of personal goals than on its providing the best explanation of all the evidence. Noting the role of emotion in appreciating coherence may seem to endorse the romantic view: If it feels good, believe it.


I know of no psychological way of distinguishing the positive emotion of explanatory coherence from the similar emotion of self-promotion. Here it is important that science is a social as well as an individual process. Scientists know that the fact that they like their own theories cuts no ice with other scientists who have different personal motivations. In order to publish their work or to get grants to continue it, scientists' hypotheses must stand up to peer review, where the peers are people familiar with the relevant hypotheses and evidence and not moved by the personal goals of the submitting scientist. Awareness of this scrutiny requires scientists to calibrate their coherence judgments against what the scientific community is likely to accept, and influences them to make their own assessments on the basis of explanatory coherence rather than personal expedience. In everyday life, people encounter less social pressure to reason in accord with evidence and explanatory coherence, so they are more likely to succumb to evidence-poor but personally coherent hypotheses such as conspiracy theories. Even in science there is the danger that the emotional bias of one individual may spread to associates by a kind of emotional contagion.


I will use the term emotional skewer for a factor that is so vivid and affectively intense that it can produce an emotional gestalt that does not reflect the maximal satisfaction of the relevant and appropriate constraints. Such factors skew the coherence calculation by placing too much weight on epistemically irrelevant considerations such as self-promotion. We will see that emotional skewers are also a threat to rational judgment in practical reasoning.


Practical Reason: Generation


In order to decide what to do, we need to generate a set of possible actions and then choose the best. Bazerman (1994, p. 4) says that rational decision making should include the following six steps:
1. Define the problem, characterizing the general purpose of your decision.
2. Identify the criteria, specifying the goals or objectives that you want to be able to accomplish.
3. Weight the criteria, deciding the relative importance of the goals.
4. Generate alternatives, identifying possible courses of action that might accomplish your various goals.
5. Rate each alternative on each criterion, assessing the extent to which each action would accomplish each goal.
6. Compute the optimal decision, evaluating each alternative by multiplying its expected effectiveness on each criterion by the weight of that criterion, then summing across all criteria to get the alternative's overall expected value. Then pick the alternative with the highest expected value. (A minimal sketch of this computation follows.)
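
As a concrete illustration of steps 3 through 6, here is a minimal Python sketch applied to a hypothetical hiring decision; the criteria, weights, and ratings are invented for the example.

    # Minimal sketch of Bazerman's weighted-criteria procedure
    # (steps 3-6). Criteria, weights, and ratings are hypothetical.

    weights = {"research": 0.5, "teaching": 0.3, "fit": 0.2}  # step 3

    # Step 5: rate each alternative on each criterion (0-10 scale assumed).
    ratings = {
        "candidate_A": {"research": 9, "teaching": 6, "fit": 7},
        "candidate_B": {"research": 7, "teaching": 9, "fit": 8},
    }

    def expected_value(alternative):
        # Step 6: sum of rating times weight across all criteria.
        return sum(ratings[alternative][c] * w for c, w in weights.items())

    # Pick the alternative with the highest expected value.
    best = max(ratings, key=expected_value)
    print({a: round(expected_value(a), 2) for a in ratings}, "->", best)

The arithmetic is trivial; as the following sections argue, the hard parts are generating the alternatives in step 4 and ensuring that the weights in step 3 reflect what one actually cares about.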


But experts on decision making usually take for granted step 4, in which alternative actions are generated. Creative decision making often involves not just choosing the best among available alternatives, but also expanding the range of alternatives. Consider, for example, hiring decisions in academic departments. The usual course of action is something like:


1. Assess the needs of the department and establish the area or areas in which to hire. Then advertise a position in these areas.
2. Examine all applications and choose a short list for further discussion.
3. Choose a few applicants to invite to campus.
4. Select a candidate to receive a job offer.


Emotions play a major role in all of these stages.


When departments decide in which areas to hire, there is often a conflict based on the differing goals of individual members. Professors typically value their own specialties over those of their colleagues, so they often have an emotional bias towards hiring in their own areas. But a thorough assessment of the pedagogical and research needs of a department should lead to a more reasonable choice of areas. No data are available on the role of emotions in generating alternatives for decision making, but I expect that the cognitive process is similar to that involved in generating hypotheses in abductive reasoning. There are routine cognitive processes such as backward chaining that use rule-like knowledge of action-goal connections. For example, if the department members know that there is an ethics course that needs teaching, then they might think of hiring someone in ethics. But hiring priorities might also be based on more amorphous considerations, such as ideas about what areas are emerging as important. Diffuse mechanisms such as spreading activation among concepts may contribute, but would be of little use in focusing the search for good ideas.


As with the search for hypotheses, emotions are a useful way to focus the search for decision alternatives. Certain ideas may acquire high positive valence and suggest other ideas with positive valence. For example, my own department recently learned that Waterloo is getting a very well funded new institute in theoretical physics, so it occurred to some of us that it might be desirable to hire someone in philosophy of physics to provide a connection with this institute. The idea arose from excitement about the new institute, which spurred excitement about connecting philosophy with it, and in turn about the possibility of adding a department member with the relevant specialty.


In academic hiring, the set of applicants is usually determined by the advertisement, so at this stage the range of alternatives is fixed. But more creative hiring often involves headhunting: actively searching out candidates and urging them to apply rather than passively waiting for applications to come in. Once again emotions play a role: people tend to think of potential candidates whose work they like or whom they like personally. Emotions can also work negatively, ruling out people whose work is not respected or whose personalities are suspect, or preventing options from being generated at all, as when a name evokes an "over my dead body" response.


Practical Reason: Evaluation


Once a range of alternative actions is identified, for example a list of job applicants, then the department must decide whom to hire. In contrast to the systematic approach to decision making recommended by Bazerman, department members usually proceed in a quite haphazard fashion. Members of the hiring committee look over the applications and select a small number to make up a short list. This selection is both an individual and a group process, as committee members first make their own short lists and then meet to agree upon a common short list. Ideally, the members individually and cooperatively work with a set of criteria, such as:


1. Research quality, as shown by publications and letters of reference.
2. Teaching ability, as shown by experience and student course evaluations.


However, such objectivity may go out the window when candidates come to campus for personal interviews. Then personal interactions and the quality of performance in the job talk may swamp the more extensive information contained in the applicants' dossiers. A candidate with a charming personality may win out over an introvert with a better academic record. On the other hand, the hiring department may make a rational decision to choose the candidate who best fits its reasonably established criteria.


In either case, I conjecture, the individuals in the department reach their decisions by means of a kind of emotional gestalt that summarizes how they feel about particular candidates. They choose to hire candidates that they feel good about, and reject candidates that they do not feel good about. As with hypothesis evaluation in science, we have no direct access to the unconscious mental processes that integrate various criteria to produce a coherent decision. Decision making is a coherence process similar in structure and processing to hypothesis evaluation (see the DECO model of decision making in Thagard and Millgram, 1995, and Thagard, 2000). One might try working with pencil-and-paper models that implement the kind of decision-making procedure described by Bazerman above, but doing so is difficult because it is hard to ensure that the numerical weightings one applies actually correspond to the importance one attaches to the various criteria. Moreover, if one went through this exercise and found that it suggested hiring someone with a negative emotional gestalt, it would not be unreasonable to think that the numerical process had gone astray.


However, it must be admitted that practical reason is even more susceptible to emotional skewers than is theoretical reason. For example, a person trying to decide what to have for dinner may want to make the decision based on a strong aim to eat healthy foods, but be seduced by a menu into ordering a cheeseburger with French fries. Here the momentary but intense desire for high-fat food swamps the long-term interests of the eater and serves as an emotional skewer. The familiar phenomena of weakness of will and self-deception can best be understood as the result of emotional skewers.


Hiring decisions are unquestionably susceptible to emotional skewers. Faced with hundreds of applicants for a job, it is natural for members of a philosophy department to eliminate individuals on the basis of single criteria: this one hasn't published, that one has no teaching experience, and so on. There is nothing wrong with such elimination if the criteria are the ones appropriate for the position, but stereotypes and prejudices operating consciously or unconsciously can have the same effect in normatively inappropriate ways. They can lead, for example, to the summary rejection of women with young children, or of gay or lesbian candidates who just would not "fit in" with the department. Members of an academic department may have an unconscious schema of the appropriate sort of colleague to have, a schema based in substantial part on themselves. Someone who does not fit that schema because of features that are irrelevant to job performance may nevertheless be considered just not right for the job.


The operation of emotional skewering shows why preferential hiring is sometimes necessary. Like Haack (1998, ch. 10), I would much prefer a hiring process in which the best candidate for a job is hired, which, given the plausible assumption that women are as academically talented as men, would over time lead to the elimination of gender bias in academic hiring. But there have undoubtedly been departments where emotional skewers are so pronounced that fair hiring is very unlikely. An American philosopher once told of the difficulties that his illustrious department had had in placing its female candidates. For example, the chair of one hiring department told him: "Don't bother telling us about your women students; we don't have to hire a woman yet." In such cases, administrative action to enforce hiring of women is the only way to change the department in ways that allow the development of fair hiring practices.


Haack (1998, p. 173) is well aware of the role of emotions in hiring decisions, describing the hiring process as a combination of greed and fear:


Greed: we want someone who will improve the standing of the department, who has contacts from which we might benefit, who will willingly do the teaching we'd rather not do, who will publish enough so the tenure process will go smoothly. Fear: we don't want someone so brilliant or energetic that they make the rest of us look bad, or compete too successfully for raises and summer money, or who will vote with our enemy on controversial issues.


Any of these greed factors and fear factors, alone or in combination, can serve as emotional skewers that contribute to an emotional gestalt that produces an unfair and suboptimal hiring decision. So should we just try to turn off the emotional contribution to decision making and proceed in as analytical a manner as possible using something like Bazerman's procedure?


I doubt that it is possible or even desirable to turn hiring and other decisions into matters of cold calculation (Thagard, 2001). Our brains are wired with many interconnections between cognitive areas such as the neocortex and emotional areas such as the amygdala (Damasio, 1994). There is no way we can shut down the amygdala, which contributes to emotional assessments via dense connections with bodily states. Moreover, if Damasio's patients (who have brain damage that interferes with neocortex-amygdala connections) are a good indication, then shutting down the amygdala would worsen rather than improve decision making. These patients have normal verbal and mathematical abilities, but tend to be ineffective and even irresponsible in their personal and social decisions. Hence I do not think that we can write emotions out of the practical reasoning process.


Still, there is lots of room for improvement in decisions based on emotional intuitions. Thagard (2001) advocates the process of informed intuition:


1. Set up the decision problem carefully. This requires identifying the goals to be accomplished by your decision and specifying the broad range of possible actions that might accomplish those goals.
2. Reflect on the importance of the different goals. Such reflection will be more emotional and intuitive than just putting a numerical weight on them, but should help you to be more aware of what you care about in the current decision situation. Identify goals whose importance may be exaggerated because of emotional distortions.
3. Examine beliefs about the extent to which various actions would facilitate the different goals. Are these beliefs based on good evidence? If not, revise them.
4. Make your intuitive judgment about the best action to perform, monitoring your emotional reaction to different options. Run your decision past other people to see if it seems reasonable to them.


This procedure is obviously applicable to hiring decisions and would, I hope, lead to the hiring of the best person for the job.


This model of decision making applies to individual thinkers, but much decision making is social, involving the need to accommodate the views and preferences of the various members of a group. Emotions can definitely be a hindrance to social decision making, if some members are irrationally exuberant about or obstinately resistant to particular options. But emotions can also be valuable in communicating people's preferences and evaluations. If I am trying to coordinate with people and see that they are getting very upset by a potential action such as hiring a candidate, I can realize that their strong emotions signal a strong preference against the action. Alternatively, if they are as enthusiastic as I am, then together we can achieve a kind of group emotional coherence that is very good for social solidarity.


Conclusion


In sum, both theoretical and practical reason involve processes of generating alternatives and evaluating them in order to select the best. Both generation and evaluation involve emotions, and the involvement is often positive, when emotions guide the search for attractive alternatives and when they signal a gestalt that marks the achievement of a maximally coherent state of mind. There is no realistic prospect of stripping these emotional contributions from generation and evaluation. I would not want it otherwise, since passion adds much to science, philosophy, and everyday life. But the involvement of emotions can also be negative, when emotional skewers impede the generation of attractive alternatives and distort the evaluation of which alternatives are the best. To overcome these negative effects of emotions, we need to adopt procedures such as informed intuition that recognize and encourage the contributions of affect to theoretical and practical reason, while watching for the presence of distortions. In addition to being passionately moderate, one should aim to be moderately passionate.

References


Bazerman, M. H. (1994). Judgment in managerial decision making. New York: John Wiley.
Bylander, T., Allemang, D., Tanner, M., & Josephson, J. (1991). The computational complexity of abduction. Artificial Intelligence, 49, 25-60.
Damasio, A. R. (1994). Descartes' error. New York: G. P. Putnam's Sons.
Fodor, J. (2000). The mind doesn't work that way. Cambridge, MA: MIT Press.
Haack, S. (1993). Evidence and inquiry: Towards reconstruction in epistemology. Oxford: Blackwell.
Haack, S. (1998). Manifesto of a passionate moderate. Chicago: University of Chicago Press.
Holyoak, K. J., & Thagard, P. (1995). Mental leaps: Analogy in creative thought. Cambridge, MA: MIT Press/Bradford Books.
Josephson, J. R., & Josephson, S. G. (Eds.). (1994). Abductive inference: Computation, philosophy, technology. Cambridge: Cambridge University Press.
Peirce, C. S. (1931-1958). Collected papers. Cambridge, MA: Harvard University Press.
Thagard, P. (1988). Computational philosophy of science. Cambridge, MA: MIT Press/Bradford Books.
Thagard, P. (1992). Conceptual revolutions. Princeton: Princeton University Press.
Thagard, P. (2000). Coherence in thought and action. Cambridge, MA: MIT Press.
Thagard, P. (2001). How to make decisions: Coherence, emotion, and practical inference. In E. Millgram (Ed.), Varieties of practical inference (pp. 355-371). Cambridge, MA: MIT Press.
Thagard, P. (forthcoming). The passionate scientist: Emotion in scientific cognition. In P. Carruthers, S. Stich & M. Siegal (Eds.), The cognitive basis of science. Cambridge: Cambridge University Press.
Thagard, P., & Millgram, E. (1995). Inference to the best plan: A coherence theory of decision. In A. Ram & D. B. Leake (Eds.), Goal-driven learning (pp. 439-454). Cambridge, MA: MIT Press.
Thagard, P., & Shelley, C. P. (2001). Emotional analogies and analogical inference. In D. Gentner, K. J. Holyoak, & B. K. Kokinov (Eds.), The analogical mind: Perspectives from cognitive science (pp. 335-362). Cambridge, MA: MIT Press.
