no circumstances did they believe it might replace human judgment. During the Cold
War, however, many social scientists, nuclear strategists, and military planners aspired to replace reason—which the authors define as the careful application of human judgment, including moral thought, to decision making—with formalistic rationality. Researchers and policy makers sought to replace the messy practice of human judgment with optimized, rule-based protocols that would yield unambiguous solutions to intellectual and policy problems.
This aspiration nurtured quintessential
Cold War fields such as game theory and systems analysis, but it also profoundly shaped academic disciplines from economics to social psychology to evolutionary biology. The authors offer far more than histories of rationality in various Cold War disciplines. They want to understand why formalistic rationality held such powerful appeal for the social sciences, particularly given that proponents of formalistic rationality were also very aware of its intellectual and policy limits. The authors do an excellent job of probing debates about the meaning, possibilities, and limits of rationality between the 1940s and the 1970s. The early generation of game theorists and nuclear strategists believed that rationality “could be captured by a finite, well-defined set of rules to be applied unambiguously” (p. 29).
Importantly, they conceived of rationality in economic terms; the rational action, they assumed, maximized gains and minimized losses. As chapters on operations research, nuclear strategy, and game theory explain, this approach was powerful, in part, because it was so flexible; losses might be inefficiencies in the
Berlin Airlift, years spent in prison after one’s criminal accomplice confessed, or lives lost in a nuclear holocaust. Scholars in administrative science, conflict resolution, and social psychology, however, worried that humans’ irrationality and cognitive inadequacies rendered the quest for Cold War rationality unrealistic.
Herbert Simon’s work on procedural rationality, Charles E. Osgood’s method for graduated and reciprocal nuclear de-escalation, and Irving Janis’s revelations about groupthink challenged formal rationality in different ways. Yet, the authors argue, these thinkers ironically reinforced rationality’s intellectual purchase by holding it out either as an ideal model for human thought or as a yardstick against which to measure human shortcomings. Even if the
Cold War model of rationality was unrealistic in a world of rampant human irrationality and limited problem-solving abilities, its ability to simplify complex problems, from allocating scarce resources to managing nuclear escalation, made it highly seductive.
No book is without flaws, and at times the authors overstate the foreignness of Cold
War thought and the collapse of Cold War rationality in postwar social scientific research.
Many aspects of Cold War rationality—particularly mathematical formalism, contextual simplification, and the embrace of economic logic—live on in much of political science, sociology, policy analysis, and of course, economics. These are minor criticisms, however.
This masterly book makes a crucial contribution to our understanding of Cold War thought, opens many new avenues for further research, and raises important questions about the durability of Cold War thinking in contemporary
American social science.
University of Michigan
Ann Arbor, Michigan
doi: 10.1093/jahist/jau385
Arguments That Count: Physics, Computing, and Missile Defense, 1949–2012. By Rebecca
Slayton. (Cambridge: MIT Press, 2013. xii, 325 pp. $35.00.)
How did American elites make decisions when they were confronted with complex, contradictory, or incomplete information? What arguments and evidence changed their minds?
Could they be persuaded to act against existing ideological and material commitments?
Rebecca Slayton’s excellent Arguments That
Count documents campaigns of persuasion surrounding an enduring Cold War technological fantasy: a missile defense system. Slayton is to be commended for providing a rich example of the inability of American politicians and scientists to reconcile technical expertise, democratic ideals, and their ambitions in domestic and international arenas.
Arguments That Count begins in the aftermath of World War II, when physicists enjoyed unprecedented prestige and influence in the American federal government. They struggled, however, to design and build systems that would protect Americans from the threat of Soviet nuclear intercontinental ballistic missiles. Successive chapters move between two expert communities: the powerful physicists and the nascent community of “software engineers.” Slayton’s account of the latter group complements two recent prizewinning books in the history of computing—Paul N. Edwards’s A Vast Machine (2010) and Joseph November’s Biomedical Computing (2012)—that demonstrate how Cold War computing took shape when groups of well-funded experts saw computers as tools that could help them solve specific data-intensive problems. Arguments
That Count leaves no doubt about the overwhelming presence of military funding and military priorities at the origins of software engineering. The money and ideas came through institutions such as the North Atlantic Treaty
Organization, the Department of Defense, the
National Science Foundation, and the National Security Agency—and certainly not from venture capitalists or countercultural characters in California.
Slayton provides a crisp and well-paced description of the American military-industrial-academic complex in action: officials in different branches of government competed for power, scientists grappled with technical uncertainty and status anxiety, and a few expert insiders mobilized fellow professionals and tried to engage broader publics. Most elite physicists supported funding for missile defense, but they failed to comprehend the fundamental complexity and unreliability of computerized command and control systems.