Faced with a full-blown crisis, a growing number of journalists are engaging in seemingly unjournalistic practices such as creating and maintaining databases, handling algorithms, or designing online applications. “Data journalists” claim that these approaches help the profession demonstrate greater objectivity and fulfill its democratic mission. In their view, computational methods enable journalists to better inform their readers, more closely monitor those in power, and offer deeper analysis. In Computing the News, Sylvain Parasie examines how data journalists and news organizations have navigated the tensions between traditional journalistic values and new technologies. He traces the history of journalistic hopes for computing technology and contextualizes the surge of data journalism in the twenty-first century. By importing computational techniques and ways of knowing new to journalism, news organizations have come to depend on a broader array of human and nonhuman actors. Parasie draws on extensive fieldwork in the United States and France, including interviews with journalists and data scientists as well as a behind-the-scenes look at several acclaimed projects in both countries. Ultimately, he argues, fulfilling the promise of data journalism requires the renewal of journalistic standards and ethics. Offering an in-depth analysis of how computing has become part of the daily practices of journalists, this book proposes ways for journalism to evolve in order to serve democratic societies.
This book examines the growing importance of algorithms and automation—including emerging forms of artificial intelligence—in the gathering, composition, and distribution of news. In it the authors connect a long line of research on journalism and computation with scholarly and professional terrain yet to be explored. Taken as a whole, these chapters share some of the noble ambitions of the pioneering publications on ‘reporting algorithms’, such as a desire to see computing help journalists in their watchdog role by holding power to account. However, they also go further, firstly by addressing the fuller range of technologies that computational journalism now comprises, from chatbots and recommender systems to artificial intelligence and atomised journalism. Secondly, they advance the literature by demonstrating the increased variety of uses for these technologies, including engaging underserved audiences, selling subscriptions, and recombining and re-using content. Thirdly, they problematise computational journalism by, for example, pointing out some of the challenges inherent in applying artificial intelligence to investigative journalism and in trying to preserve public service values. Fourthly, they offer suggestions for future research and practice, including by presenting a framework for developing democratic news recommenders and another that may help us think about computational journalism in a more integrated, structured manner. The chapters in this book were originally published as a special issue of Digital Journalism.
From hidden connections in big data to bots spreading fake news, journalism is increasingly computer-generated. Nicholas Diakopoulos explains the present and future of a world in which algorithms have changed how the news is created, disseminated, and received, and he shows why journalists—and their values—are at little risk of being replaced.
This easy-to-follow introduction to computer science reveals how familiar stories like Hansel and Gretel, Sherlock Holmes, and Harry Potter illustrate the concepts and everyday relevance of computing. Picture a computer scientist, staring at a screen and clicking away frantically on a keyboard, hacking into a system, or perhaps developing an app. Now delete that picture. In Once Upon an Algorithm, Martin Erwig explains computation as something that takes place beyond electronic computers, and computer science as the study of systematic problem solving. Erwig points out that many daily activities involve problem solving. Getting up in the morning, for example: You get up, take a shower, get dressed, eat breakfast. This simple daily routine solves a recurring problem through a series of well-defined steps. In computer science, such a routine is called an algorithm. Erwig illustrates a series of concepts in computing with examples from daily life and familiar stories. Hansel and Gretel, for example, execute an algorithm to get home from the forest. The movie Groundhog Day illustrates the problem of unsolvability; Sherlock Holmes manipulates data structures when solving a crime; the magic in Harry Potter’s world is understood through types and abstraction; and Indiana Jones demonstrates the complexity of searching. Along the way, Erwig also discusses representations and different ways to organize data; “intractable” problems; language, syntax, and ambiguity; control structures, loops, and the halting problem; different forms of recursion; and rules for finding errors in algorithms. This engaging book explains computation accessibly and shows its relevance to daily life. Something to think about next time we execute the algorithm of getting up in the morning.
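To make the blurb's central example concrete, here is a minimal Python sketch (an illustration of the idea, not code from the book) of the morning routine written out as an algorithm: a recurring problem solved by a fixed sequence of well-defined steps.

```python
# Illustrative only: the morning routine from the blurb, expressed as an
# algorithm -- a fixed sequence of well-defined steps that solves the
# recurring problem "get ready for the day".

def morning_routine():
    steps = ["get up", "take a shower", "get dressed", "eat breakfast"]
    for step in steps:  # execute each well-defined step in order
        print(f"Step: {step}")
    print("Problem solved: ready for the day.")

if __name__ == "__main__":
    morning_routine()
```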
Reporters in the newsroom are more involved in computer-assisted reporting and online news research than ever before. This edition introduces readers to computer-assisted reporting and describes how leading journalists use personal computers for news gathering in modern print, broadcast, and online newsrooms. It provides a thorough discussion of the technology and its applications to news reporting. Computer Assisted Reporting focuses on the computerization of newsgathering, highlighting how the computer assists journalists by making writing easier and by making the gathering and organizing of information more efficient. The book begins by demonstrating methods for journalists to get more from their computers, such as data retrieval, data analysis, information storage, and the dissemination of that information in both processed and unprocessed forms. It concludes by refining a proposal first made in the first edition: five stages in the development of computer literacy in the newsroom.
Discover the history of computing through four major threads of development in this compact, accessible history covering punch cards, Silicon Valley, smartphones, and much more. In an accessible style, computer historian Paul Ceruzzi offers a broad yet detailed history of computing, from the first use of the word “digital” in 1942 to the development of punch cards and the first general-purpose computer, to the internet, Silicon Valley, smartphones, and social networking. Ceruzzi identifies four major threads that run throughout all of computing’s technological development:
• Digitization: the coding of information, computation, and control in binary form
• The convergence of multiple streams of techniques, devices, and machines
• The steady advance of electronic technology, as characterized famously by “Moore's Law”
• The human-machine interface
The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices. In this concise and accessible account of the invention and development of digital technology, Ceruzzi offers a more general and more useful perspective for students of computer science and history.
An exploration of a new division of labor between machines and humans, in which people provide value to the economy with little or no compensation. The computerization of the economy—and everyday life—has transformed the division of labor between humans and machines, shifting many people into work that is hidden, poorly compensated, or accepted as part of being a “user” of digital technology. Through our clicks and swipes, logins and profiles, emails and posts, we are, more or less willingly, participating in digital activities that yield economic value to others but little or no return to us. Hamid Ekbia and Bonnie Nardi call this kind of participation—the extraction of economic value from low-cost or free labor in computer-mediated networks—“heteromation.” In this book, they explore the social and technological processes through which economic value is extracted from digitally mediated work, the nature of the value created, and what prompts people to participate in the process. Arguing that heteromation is a new logic of capital accumulation, Ekbia and Nardi consider different kinds of heteromated labor: communicative labor, seen in user-generated content on social media; cognitive labor, including microwork and self-service; creative labor, from gaming environments to literary productions; emotional labor, often hidden within paid jobs; and organizing labor, made up of collaborative groups such as citizen scientists. Ekbia and Nardi then offer a utopian vision: heteromation refigured to bring end users more fully into the prosperity of capitalism.
Why do computers use so much energy? What are the fundamental physical laws governing the relationship between the precise computation run by a system, whether artificial or natural, and how much energy that computation requires? This volume integrates concepts from diverse fields, cultivating a modern, nonequilibrium thermodynamics of computation.
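As background for the question the blurb poses, one standard reference point in the thermodynamics of computation is Landauer's bound, sketched below as an illustration rather than as this volume's own framing: erasing a single bit of information at temperature T dissipates at least k_B T ln 2 of heat.

```latex
% Landauer's bound: a standard touchstone in the thermodynamics of
% computation (given as background, not as this volume's own result).
% Erasing one bit of information at temperature T dissipates at least
% k_B T ln 2 of heat, where k_B is Boltzmann's constant.
\[
  Q_{\min} \;=\; k_B T \ln 2
  \;\approx\; 2.9 \times 10^{-21}\ \text{J} \quad \text{at } T = 300\ \text{K}
\]
```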