"Mackenzie has achieved a masterful synthesis of engrossing narrative, imaginative concepts, historical perspective, and social concern." Donald MacKenzie follows one line of technology—strategic ballistic missile guidance through a succession of weapons systems to reveal the workings of a world that is neither awesome nor unstoppable. He uncovers the parameters, the pressures, and the politics that make up the complex social construction of an equally complex technology.
Most aspects of our private and social lives—our safety, the integrity of the financial system, the functioning of utilities and other services, and national security—now depend on computing. But how can we know that this computing is trustworthy? In Mechanizing Proof, Donald MacKenzie addresses this key issue by investigating the interrelations of computing, risk, and mathematical proof over the last half century from the perspectives of history and sociology. His discussion draws on the technical literature of computer science and artificial intelligence and on extensive interviews with participants. MacKenzie argues that our culture now contains two ideals of proof: proof as traditionally conducted by human mathematicians, and formal, mechanized proof. He describes the systems constructed by those committed to the latter ideal and the many questions those systems raise about the nature of proof. He looks at the primary social influence on the development of automated proof—the need to predict the behavior of the computer systems upon which human life and security depend—and explores the involvement of powerful organizations such as the National Security Agency. He concludes that in mechanizing proof, and in pursuing dependable computer systems, we do not obviate the need for trust in our collective human judgment.
In Perform or Else, Jon McKenzie brilliantly explores the relationship between cultural, organizational, and technological performance.
In An Engine, Not a Camera, Donald MacKenzie argues that the emergence of modern economic theories of finance affected financial markets in fundamental ways. These new, Nobel Prize-winning theories, based on elegant mathematical models of markets, were not simply external analyses but intrinsic parts of economic processes. Paraphrasing Milton Friedman, MacKenzie says that economic models are an engine of inquiry rather than a camera to reproduce empirical facts. More than that, the emergence of an authoritative theory of financial markets altered those markets fundamentally. For example, in 1970, there was almost no trading in financial derivatives such as "futures." By June of 2004, derivatives contracts totaling $273 trillion were outstanding worldwide. MacKenzie suggests that this growth could never have happened without the development of theories that gave derivatives legitimacy and explained their complexities. MacKenzie examines the role played by finance theory in the two most serious crises to hit the world's financial markets in recent years: the stock market crash of 1987 and the market turmoil that engulfed the hedge fund Long-Term Capital Management in 1998. He also looks at finance theory that is somewhat beyond the mainstream—chaos theorist Benoit Mandelbrot's model of "wild" randomness. MacKenzie's pioneering work in the social studies of finance will interest anyone who wants to understand how America's financial markets have grown into their current form.
Whole World on Fire focuses on a technical riddle wrapped in an organizational mystery: How and why, for more than half a century, did the U.S. government fail to predict nuclear fire damage as it drew up plans to fight strategic nuclear war? U.S. bombing in World War II caused massive fire damage to Hiroshima and Nagasaki, but later war plans took account only of damage from blast; they completely ignored damage from atomic firestorms. Recently a small group of researchers has shown that for modern nuclear weapons the destructiveness and lethality of nuclear mass fire often, and predictably, greatly exceeds that of nuclear blast. This has major implications for defense policy: the U.S. government has underestimated the damage caused by nuclear weapons, Lynn Eden finds, and built far more warheads, and far more destructive warheads, than it needed for the Pentagon's war-planning purposes. How could this have happened? The answer lies in how organizations frame the problems they try to solve. In a narrative grounded in organization theory, science and technology studies, and primary historical sources (including declassified documents and interviews), Eden explains how the U.S. Air Force's doctrine of precision bombing led to the development of very good predictions of nuclear blast (a significant achievement) but for many years to no development of organizational knowledge about nuclear fire. Expert communities outside the military reinforced this disparity in organizational capability to predict blast damage but not fire damage. Yet some innovation occurred, and predictions of fire damage were nearly incorporated into nuclear war planning in the early 1990s. The author explains how such a dramatic change almost happened, and why it did not. Whole World on Fire shows how well-funded and highly professional organizations, by focusing on what they do well and systematically excluding what they don't do well, may build a poor representation of the world: a self-reinforcing fallacy that can have serious consequences. In a sweeping conclusion, Eden shows the implications of the analysis for understanding such things as the sinking of the Titanic, the collapse of the Tacoma Narrows Bridge, and the poor fireproofing in the World Trade Center.
"Mackenzie has achieved a masterful synthesis of engrossing narrative, imaginative concepts, historical perspective, and social concern." Donald MacKenzie follows one line of technology—strategic ballistic missile guidance through a succession of weapons systems to reveal the workings of a world that is neither awesome nor unstoppable. He uncovers the parameters, the pressures, and the politics that make up the complex social construction of an equally complex technology.
This book provides a complete history of the US Fleet Ballistic Missile (FBM) programme from its inception in the 1950s and the development of Polaris to the deployment of Trident II in 1990. Writing in an accessible yet scholarly manner, Graham Spinardi bases his historical documentation of FBM development on interviews with many of the key participants. His study confronts a central issue: is technology simply a tool used to achieve the goals of society, or is it an autonomous force in shaping that society? FBM accuracy evolved from the city-busting retaliatory capability of Polaris to the silo-busting 'first strike' potential of Trident. Is this a case of technology 'driving' the arms race, or simply the intended product of political decisions? The book provides a comprehensive survey of the literature on the role of technology in the arms race, and seeks to explain technological development using a 'sociology of technology' approach.
The Values of Precision examines how exactitude has come to occupy such a prominent place in Western culture. What has been the value of numerical values? Beginning with the late eighteenth century and continuing into the twentieth, the essays in this volume support the view that centralizing states (with their increasingly widespread bureaucracies for managing trade, taxation, and armies) and large-scale commercial enterprises (with their requirements for standardization and mass production) have been the major promoters of numerical precision. Taking advantage of the resources available, scientists and engineers have entered a symbiotic relationship with state and industry, which in turn has led to increasingly refined measures in ever-widening domains of the natural and social world. At the heart of this book, therefore, is an inquiry into the capacity of numbers and instruments to travel across boundaries of culture and materials. Many of the papers focus attention on disagreements about the significance and the credibility of particular sorts of measurements deployed to support particular claims, as in the measures of the population of France, the electrical resistance of copper, or the solvency of insurance companies. At the same time they display the deeply cultural character of precision values. Contributors to the volume include Ken Alder, Graeme J. N. Gooday, Jan Golinski, Frederic L. Holmes, Kathryn M. Olesko, Theodore M. Porter, Andrea Rusnock, Simon Schaffer, George Sweetnam, Andrew Warwick, and M. Norton Wise.
Lost in the raging debate over the validity of social construction is the question of what, precisely, is being constructed. Facts, gender, quarks, reality? Is it a person? An object? An idea? A theory? Each entails a different notion of social construction, Ian Hacking reminds us. His book explores an array of examples to reveal the deep issues underlying contentious accounts of reality. Especially troublesome in this dispute is the status of the natural sciences, and this is where Hacking finds some of his most telling cases, from the conflict between biological and social approaches to mental illness to vying accounts of current research in sedimentary geology. He looks at the issue of child abuse—very much a reality, though the idea of child abuse is a social product. He also cautiously examines the ways in which advanced research on new weapons influences not the content but the form of science. In conclusion, Hacking comments on the “culture wars” in anthropology, in particular a spat between leading ethnographers over Hawaii and Captain Cook. Written with generosity and gentle wit by one of our most distinguished philosophers of science, this wise book brings a much needed measure of clarity to current arguments about the nature of knowledge.
A theoretical analysis and historical investigation of the Cold War nuclear arms race that challenges the theory of the nuclear revolution.