The Assumption of Rationality
David Brooks's most recent op-ed in the New York Times questions the value of "false science" in intelligence analysis [link]. Among the points that he makes (and I'm summarizing broadly here) is that for decades, the CIA and other intelligence agencies have promoted the notion that analysis can be systematized and rationalized through the application of scientific theories, including game theory. Brooks goes on to argue that the fundamental premise of this line of thinking -- and, by extension, of the pseudo-scientific tools the analysts use -- is faulty: the very act of systematizing analysis precludes the hunches and non-quantitative political judgments that have proven valuable in many instances precisely because they incorporate a sort of "fuzzy logic" that the more "scientific" analysis can't replicate.
I will leave to the experts the question of whether the systematization of political analysis is the most effective means to an end. But I think that Brooks is on to something.
By way of background, when I was a college student studying political science, I spent a fair amount of time in my junior and senior years pursuing an independent study with Stephen Brams, one of the leading theorists on political game theory. Among other things, Dr. Brams had written two books using Bible stories to illustrate the game-theoretic concepts used to analyze conflicts. [In game theory, all interactions are forms of conflict, which simply means that each party wants something that may be incompatible with what one or more other parties want.]
Now, one of the fundamental underpinnings of game theory is that all of the players in a conflict are rational. Rationality here is defined to mean that each player has particular goals, ranks those goals from most desirable to least desirable, and prefers to take actions that help achieve the most desirable outcome. Assuming that you can tease out the parties' respective preferences and the available courses of action, you can construct a grid from which it is possible to determine the most likely outcome of any given conflict.
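To make that concrete, here is a minimal sketch in Python of the kind of grid I mean. The conflict, the actions, and the ordinal rankings are all invented for illustration -- nothing here comes from Brooks or from Dr. Brams's work -- but it shows how, once you have each party's preferences, the grid points to a stable, "most likely" outcome:

```python
from itertools import product

# Two parties, two courses of action each -- purely hypothetical.
actions = {"Row": ["concede", "hold firm"], "Col": ["concede", "hold firm"]}

# Ordinal preferences: 4 = most desirable outcome, 1 = least desirable.
payoffs = {
    ("concede", "concede"):     (3, 3),
    ("concede", "hold firm"):   (1, 4),
    ("hold firm", "concede"):   (4, 1),
    ("hold firm", "hold firm"): (2, 2),
}

def is_stable(cell):
    """A cell is stable if neither party can do better by unilaterally
    switching actions -- the outcome the grid predicts."""
    row_act, col_act = cell
    row_best = all(payoffs[cell][0] >= payoffs[(r, col_act)][0]
                   for r in actions["Row"])
    col_best = all(payoffs[cell][1] >= payoffs[(row_act, c)][1]
                   for c in actions["Col"])
    return row_best and col_best

for cell in product(actions["Row"], actions["Col"]):
    if is_stable(cell):
        print("Predicted outcome:", cell)  # -> ('hold firm', 'hold firm')
```

Notice that the whole exercise stands or falls on the rankings being real: the grid can only predict what the preferences fed into it allow.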
When I first began studying, I was uncomfortable with the assumption that all players are rational because it is a fact of life that not all participants in a conflict are always rational. But although I questioned the assumption privately, I never followed up on it, even though Dr. Brams's own work led me to an example that might have been compelling evidence that irrationality is a factor as well. I figured that I must be missing something, and that challenging one of the fundamental assumptions of the whole theory was cheeky and simplistic. Turns out I might have been wrong. More on that in a minute.
What does all of this have to do with David Brooks and the CIA? Well, it turns out that Brooks is asking the same question, in essence. Game theory and its siblings assume that conflict can be analyzed rationally, which means that all players are assumed to act rationally. In the context of superpower conflicts, this might have been a valid base assumption, but it's not clear that today, in analyzing the intentions of terrorist groups like al Qaeda, such an assumption is as defensible. If it's not, then it calls into question the utility of the various tools that the CIA and other analysts may have been relying on.
Rather than tackle that question head-on, let me come back to the less charged and more concise example of irrationality that I thought of based on Dr. Brams's biblical examples. It comes from Numbers, Chapter 20:
2 The community was without water, and they joined against Moses and Aaron. 3 The people quarreled with Moses, saying, "If only we had perished when our brothers perished at the instance of the Lord! 4 Why have you brought the Lord's congregation into this wilderness for us and our beasts to die there? 5 Why did you make us leave Egypt to bring us to this wretched place, a place with no grain or figs or vines or pomegranates? There is not even water to drink!" 6 Moses and Aaron came away from the congregation to the entrance of the Tent of Meeting, and fell on their faces. The Presence of the Lord appeared to them, 7 and the Lord spoke to Moses, saying, 8 "You and your brother Aaron take the rod and assemble the community, and before their very eyes order the rock to yield its water. Thus you shall produce water for them from the rock and provide drink for the congregation and their beasts."
9 Moses took the rod from before the Lord, as He had commanded him. 10 Moses and Aaron assembled the congregation in front of the rock; and he said to them, "Listen, you rebels, shall we get water for you out of this rock?" 11 And Moses raised his hand and struck the rock twice with his rod. Out came copious water, and the community and their beasts drank.
12 But the Lord said to Moses and Aaron, "Because you did not trust Me enough to affirm My sanctity in the sight of the Israelite people, therefore you shall not lead this congregation into the land that I have given them."
Without going too deeply into game theory, the rational actions available to Moses are (1) obey the word of God or (2) do nothing. His preferred outcome is to placate the rebels and to affirm the sanctity of God's word. Game theory would show that obeying the word of God leads to the desired outcomes, while doing nothing leads to undesirable ones (continued rebellion and the perpetuation of the rebels' disavowal of God). How, then, do you explain what Moses actually does, namely striking the rock? In a word, irrationality. Moses is frustrated, he's tired, and he lashes out. In other words, he fails to act rationally. And of course, he's punished for it, severely.
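To see why the model stumbles here, consider a toy version of Moses's choice (again my own construction, not anything from Dr. Brams's books). The analyst's grid contains only the actions the model deems rational, so the action Moses actually takes isn't merely mispredicted -- it isn't representable at all:

```python
# Hypothetical ordinal preferences over the modeled actions: higher = better.
modeled_actions = {
    "obey (order the rock)": 2,  # placates the rebels AND affirms God's sanctity
    "do nothing":            1,  # rebellion continues, sanctity unaffirmed
}

prediction = max(modeled_actions, key=modeled_actions.get)
actual = "strike the rock"  # frustration wins out -- off the grid entirely

print("Model predicts:", prediction)                         # obey (order the rock)
print("Moses does:    ", actual)
print("Model accounts for it:", actual in modeled_actions)   # False
```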
The problem for game theory is that an analyst examining the situation before Moses acts would not necessarily have predicted that he would strike the rock, because the model doesn't account for irrationality. But, as Brooks points out, an analyst acting on a hunch might look at the characteristics of the leader in that moment and see that he was frustrated with the rebels and therefore prone to lash out. In other words, the hunch might fundamentally change the nature of the analysis delivered. It might even predict the actual outcome.
In the context of the Cold War, the assumption of rationality in game theory may have worked because the two principal actors -- the US and the USSR -- were fundamentally rational: each preferred results that furthered its geopolitical goals and disfavored results that did not.
But in the context of terrorism, it's not always clear that the terrorists act rationally. It used to be that you could assume that terrorist groups acted rationally -- they hijacked a plane or carried out a targeted attack with the goal of freeing compatriots from prison or forcing a withdrawal from some occupied territory or another -- but shied away from attacks that would get them in trouble with their patrons or that would bring about their own destruction. Al Qaeda and its brethren, however, preach the destruction of the West as the ultimate goal, and carry out attacks that provoke a massive and all-encompassing response from the target. In a word, their actions are not rational, if rationality is defined as taking actions that tend to advance you toward your ultimate goal.
As such, Brooks may be right that, in analyzing for possible terrorist activity, traditional "scientific" tools won't help to analyze current events and predict future outcomes. Hunches may be exactly what is called for, and retooling the intelligence apparatus to include people who can make educated guesses may be precisely what is needed.
Of course, a strong corollary to such an overhaul is that a system of controls needs to be implemented in order to insulate, as much as possible, the people making the educated guesses from pressure to produce pre-determined predictions. But that's a subject for another day...