This book presents an organizational and social history of one of the foundational projects of the computer era: the development of the SAGE (Semi-Automatic Ground Environment) air defense system, from its first test at Bedford, Massachusetts, in 1951, to the installation of the first unit of the New York Air Defense Sector of the SAGE system, in 1958. The idea for SAGE grew out of Project Whirlwind, a wartime computer development effort, when the U.S. Department of Defense realized that the Whirlwind computer might anchor a continent-wide advance warning system. Developed by MIT engineers and scientists for the U.S. Air Force, SAGE monitored North American skies for possible attack by manned aircraft and missiles for twenty-five years. Aside from its strategic importance, SAGE set the foundation for mass data-processing systems and foreshadowed many computer developments of the 1960s. The heart of the system, the AN/FSQ-7, was the first computer to have an internal memory composed of "magnetic cores," thousands of tiny ferrite rings that served as reversible electromagnets. SAGE also introduced computer-driven displays, online terminals, time sharing, high-reliability computation, digital signal processing, digital transmission over telephone lines, digital track-while-scan, digital simulation, computer networking, and duplex computing. The book shows how the wartime alliance of engineers, scientists, and the military exemplified by MIT's Radiation Lab helped to transform research and development practice in the United States through the end of the Cold War period.
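For readers curious how a ferrite ring could "serve as a reversible electromagnet," the toy Python sketch below (not from the book; the CorePlane class and its method names are invented for illustration) models the two behaviors that made core memory work: each ring holds a bit as one of two stable magnetization polarities, and reading is destructive, so every memory cycle must end by rewriting what it just sensed.

# A toy model of magnetic-core memory, the storage technology the AN/FSQ-7
# pioneered. Illustrative only: real core planes addressed a ring by driving
# half-currents on crossed X and Y wires; this sketch keeps just the two
# essentials -- bistable magnetization and destructive reads.

class CorePlane:
    """A plane of 'ferrite rings', each magnetized one way (1) or the other (0)."""

    def __init__(self, size: int):
        self.cores = [0] * size  # every core starts in the 0 polarity

    def write(self, addr: int, bit: int) -> None:
        # Driving current through the ring sets its magnetization polarity.
        self.cores[addr] = bit

    def read(self, addr: int) -> int:
        # Destructive read: drive the ring toward 0 and watch the sense wire.
        # A pulse appears only if the ring flips, i.e. it was holding a 1.
        sensed = self.cores[addr]
        self.cores[addr] = 0      # the read itself erased the bit...
        self.write(addr, sensed)  # ...so the cycle ends by rewriting it
        return sensed

plane = CorePlane(4096)
plane.write(42, 1)
assert plane.read(42) == 1  # the read-then-rewrite cycle preserves the bit

The rewrite-after-read step is the point of the sketch: because the magnetization is reversible, a destructive sensing operation can still behave, from the outside, like ordinary random-access memory.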
How differing assessments of risk by physicists and computer scientists have influenced public debate over nuclear defense. In a rapidly changing world, we rely upon experts to assess the promise and risks of new technology. But how do these experts make sense of a highly uncertain future? In Arguments that Count, Rebecca Slayton offers an important new perspective. Drawing on new historical documents and interviews as well as perspectives in science and technology studies, she provides an original account of how scientists came to terms with the unprecedented threat of nuclear-armed intercontinental ballistic missiles (ICBMs). She compares how two different professional communities—physicists and computer scientists—constructed arguments about the risks of missile defense, and how these arguments changed over time. Slayton shows that our understanding of technological risks is shaped by disciplinary repertoires—the codified knowledge and mathematical rules that experts use to frame new challenges. And, significantly, a new repertoire can bring long-neglected risks into clear view. In the 1950s, scientists recognized that high-speed computers would be needed to cope with the unprecedented speed of ICBMs. But the nation's elite science advisors had no way to analyze the risks of computers, so they used physics to assess what they could: radar and missile performance. Only decades later, after establishing computing as a science, were advisors able to authoritatively analyze the risks associated with complex software—most notably, the risk of a catastrophic failure. As we continue to confront new threats, including that of cyber attack, Slayton offers valuable insight into how different kinds of expertise can limit or expand our capacity to address novel technological risks.
"Both ERA and EMCC had their roots in World War II, and in postwar years both firms received major funding from the United States government. Norberg analyzes the interaction between the two companies and the government and examines the impact of this institutional context on technological innovation. He looks at the two firms' operations after 1951 as independent subsidiaries of Remington Rand, and documents the management problems that began after Remington Rand merged with Sperry Gyroscope to form Sperry Rand in 1955"--Jacket.
Praise for the previous edition: "Entries are written with enough clarity and simplicity to appeal to general audiences. The additional readings that end each profile give excellent pointers for more detailed information...Recommended."—Choice "This well-written collection of biographies of the most important contributors to the computer world...is a valuable resource for those interested in the men and women who were instrumental in making the world we live in today. This is a recommended purchase for reference collections."—American Reference Books Annual "...this one is recommended for high-school, public, and undergraduate libraries."—Booklist The significant role that the computer plays in the business world, schools, and homes speaks to the impact it has on our daily lives. While many people are familiar with the Internet, online shopping, and basic computer technology, the scientists who pioneered this digital age are generally less well known. A to Z of Computer Scientists, Updated Edition features 136 computer pioneers and shows the ways in which these individuals developed their ideas, overcame technical and institutional challenges, collaborated with colleagues, and created products or institutions of lasting importance. The cutting-edge, contemporary entries explore a diverse group of inventors, scientists, entrepreneurs, and visionaries in the computer science field. People covered include Grace Hopper (1906–1992), Dennis Ritchie (1941–2011), Brian Kernighan (1942–present), Howard Rheingold (1947–present), Bjarne Stroustrup (1950–present), Esther Dyson (1951–present), Silvio Micali (1954–present), Jeff Bezos (1964–present), Pierre Omidyar (1967–present), and Jerry Yang (1968–present).
Are nuclear arsenals safe from cyber-attack? Could terrorists launch a nuclear weapon through hacking? Are we standing at the edge of a major technological challenge to global nuclear order? These are among the many pressing security questions addressed in Andrew Futter’s ground-breaking study of the cyber threat to nuclear weapons. Hacking the Bomb provides the first ever comprehensive assessment of this worrying and little-understood strategic development, and it explains how myriad new cyber challenges will impact the way that the world thinks about and manages the ultimate weapon. The book cuts through the hype surrounding the cyber phenomenon and provides a framework through which to understand and proactively address the implications of the emerging cyber-nuclear nexus. It does this by tracing the cyber challenge right across the nuclear weapons enterprise, explaining the important differences between types of cyber threats, and unpacking how cyber capabilities will impact strategic thinking, nuclear balances, deterrence thinking, and crisis management. The book makes the case for restraint in the cyber realm when it comes to nuclear weapons, given the considerable risks of commingling weapons of mass disruption with weapons of mass destruction, and argues against establishing a dangerous norm of “hacking the bomb.” This timely book provides a starting point for an essential discussion about the challenges associated with the cyber-nuclear nexus, and will be of great interest to scholars and students of security studies as well as defense practitioners and policy makers.
Accounts of the early events of the computing industry—the Turing machine, the massive Colossus, the ENIAC computer—are well-told tales, and equally well known is the later emergence of Silicon Valley and the rise of the personal computer. Yet there is an extraordinary untold middle history—with deep roots in Minnesota. From the end of World War II through the 1970s, Minnesota was home to the first computing-centered industrial district in the world. Drawing on rare archival documents, photographs, and a wealth of oral histories, Digital State unveils the remarkable story of computer development in the heartland after World War II. These decades found corporations—concentrated in large part in Minnesota—designing state-of-the-art mainframe technologies, revolutionizing new methods of magnetic data storage, and, for the first time, truly integrating software and hardware into valuable products for the American government and public. Minnesota-based companies such as Engineering Research Associates, Univac, Control Data, Cray Research, Honeywell, and IBM Rochester were major international players and together formed an unrivaled epicenter advancing digital technologies. These companies not only brought vibrant economic growth to Minnesota, they nurtured the state’s present-day medical device and software industries and possibly even tomorrow’s nanotechnology. Thomas J. Misa’s groundbreaking history shows how Minnesota recognized and embraced the coming information age through its leading-edge companies, its workforce, and its prominent institutions. Digital State reveals the inner workings of the birth of the digital age in Minnesota and what we can learn from this era of sustained innovation.
A cultural theory of the interface as a relation that is both ubiquitous and elusive, drawing on disciplines from cultural theory to architecture. In this book, Branden Hookway considers the interface not as technology but as a form of relationship with technology. The interface, Hookway proposes, is at once ubiquitous and hidden from view. It is both the bottleneck through which our relationship to technology must pass and a productive encounter embedded within the use of technology. It is a site of contestation—between human and machine, between the material and the social, between the political and the technological—that both defines and elides differences. A virtuoso in multiple disciplines, Hookway offers a theory of the interface that draws on cultural theory, political theory, philosophy, art, architecture, new media, and the history of science and technology. He argues that the theoretical mechanism of the interface offers a powerful approach to questions of the human relationship to technology. Hookway finds the origin of the term interface in nineteenth-century fluid dynamics and traces its migration to thermodynamics, information theory, and cybernetics. He discusses issues of subject formation, agency, power, and control, within contexts that include technology, politics, and the social role of games. He considers the technological augmentation of humans and the human-machine system, discussing notions of embodied intelligence. Hookway views the figure of the subject as both receiver and active producer in processes of subjectification. The interface, he argues, stands in a relation both alien and intimate, vertiginous and orienting to those who cross its threshold.
Silicon Valley gets all the credit for digital creativity, but this account of the pre-PC world, when computing meant more than using mature consumer technology, challenges that triumphalism. The invention of the personal computer liberated users from corporate mainframes and brought computing into homes. But throughout the 1960s and 1970s a diverse group of teachers and students working together on academic computing systems conducted many of the activities we now recognize as personal and social computing. Their networks were centered in New Hampshire, Minnesota, and Illinois, but they connected far-flung users. Joy Rankin draws on detailed records to explore how users exchanged messages, programmed music and poems, fostered communities, and developed computer games like The Oregon Trail. These unsung pioneers helped shape our digital world, just as much as the inventors, garage hobbyists, and eccentric billionaires of Palo Alto. By imagining computing as an interactive commons, the early denizens of the digital realm seeded today’s debate about whether the internet should be a public utility and laid the groundwork for the concept of net neutrality. Rankin offers a radical precedent for a more democratic digital culture, and new models for the next generation of activists, educators, coders, and makers.
Today, we associate the relationship between feedback, control, and computing with Norbert Wiener's 1948 formulation of cybernetics. But the theoretical and practical foundations for cybernetics, control engineering, and digital computing were laid earlier, between the two world wars. In Between Human and Machine: Feedback, Control, and Computing before Cybernetics, David A. Mindell shows how the modern sciences of systems emerged from disparate engineering cultures and their convergence during World War II. Mindell examines four different arenas of control systems research in the United States between the world wars: naval fire control, the Sperry Gyroscope Company, the Bell Telephone Laboratories, and Vannevar Bush's laboratory at MIT. Each of these institutional sites had unique technical problems, organizational imperatives, and working environments, and each fostered a distinct engineering culture. Each also developed technologies to represent the world in a machine. At the beginning of World War II, President Roosevelt established the National Defense Research Committee, one division of which was devoted to control systems. Mindell shows how the NDRC brought together representatives from the four prewar engineering cultures, and how its projects synthesized conceptions of control, communications, and computing. By the time Wiener articulated his vision, these ideas were already suffusing engineering, and they would profoundly influence the digital world. As a new way to conceptualize the history of computing, this book will be of great interest to historians of science, technology, and culture, as well as computer scientists and theorists.
In the third volume of The Digital Hand, James W. Cortada completes his sweeping survey of the effect of computers on American industry, turning finally to the public sector and examining how computers have fundamentally changed the nature of work in government and education. This book goes far beyond generalizations about the Information Age to the specifics of how industries have functioned, now function, and will function in the years to come. Cortada combines detailed analysis with narrative history to provide a broad overview of the role of computing and telecommunications in the entire public sector, including federal, state, and local governments, and in K-12 and higher education. Beginning in 1950, when commercial applications of digital technology began to appear, Cortada examines the unique ways different public sector industries adopted new technologies, showcasing the manner in which their innovative applications influenced other industries as well as the U.S. economy as a whole. He builds on the surveys presented in the first volume of the series, which examined sixteen manufacturing, process, transportation, wholesale, and retail industries, and the second volume, which examined over a dozen financial, telecommunications, media, and entertainment industries. With this third volume, The Digital Hand trilogy is complete, forming the most comprehensive and rigorously researched history of computing in business since 1950 and providing a detailed picture of what the infrastructure of the Information Age really looks like and how we got there. Managers, historians, economists, and those working in the public sector will appreciate Cortada's analysis of digital technology's many roles and future possibilities.