Book descriptions related to Translation, Brains and the Computer.

This book is about machine translation (MT) and the classic problems associated with this language technology. It examines the causes of these problems: for linguistic, rule-based systems, it attributes them to language’s ambiguity and complexity and their interplay in logic-driven processes; for non-linguistic, data-driven systems, it attributes translation shortcomings to the very lack of linguistics. It then proposes a demonstrable way to relieve these drawbacks in the shape of a working translation model, the Logos Model, which takes its inspiration from key assumptions about psycholinguistic and neurolinguistic function. The book suggests that this brain-based mechanism is effective precisely because it bridges linguistically driven and data-driven methodologies. It shows how simulation of this cerebral mechanism has freed this one MT model from the all-important, classic problem of complexity when coping with the ambiguities of language. The Logos Model accomplishes this through a data-driven process that does not sacrifice linguistic knowledge but, like the brain, integrates linguistics within that process. The book therefore suggests that the brain-like mechanism embedded in this model has the potential to contribute to further advances in machine translation in all its technological instantiations.
This book represents the views of one of the greatest mathematicians of the twentieth century on the analogies between computing machines and the living human brain. John von Neumann concludes that the brain operates in part digitally, in part analogically, but uses a peculiar statistical language unlike that employed in the operation of man-made computers. This edition includes a new foreword by two eminent figures in the fields of philosophy, neuroscience, and consciousness.
This book assembles fifteen original, interdisciplinary research chapters that explore methodological and conceptual considerations as well as user and usage studies to elucidate the relation between the translation product and translation/post-editing processes. It introduces numerous innovative empirical/data-driven measures as well as novel classification schemes and taxonomies to investigate and quantify the relation between translation quality and translation effort in from-scratch translation, machine translation post-editing and computer-assisted audiovisual translation. The volume addresses questions in the translation of cognates, neologisms, metaphors, and idioms, as well as figurative and culture-specific expressions. It re-assesses the notion of translation universals and translation literality, elaborates on the definition of translation units and syntactic equivalence, and investigates the impact of translation ambiguity and translation entropy. The results and findings are interpreted in the context of psycho-linguistic models of bilingualism and re-frame empirical translation process research within the context of modern dynamic cognitive theories of the mind. The volume bridges the gap between translation process research and machine translation research. It will appeal to students and researchers in these fields.
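The notion of translation entropy mentioned above can be made concrete: given several observed translations of the same source word, Shannon entropy measures how unpredictable the translators' choice is. A minimal sketch, with invented data (the source word, the candidate translations, and their frequencies are illustrative, not taken from the book):

```python
from collections import Counter
from math import log2

def translation_entropy(translations):
    """Shannon entropy (in bits) over the observed translation choices
    for one source token; higher values mean more translation ambiguity."""
    counts = Counter(translations)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical data: ten translators rendering one German source word.
choices = ["cosiness"] * 5 + ["comfort"] * 3 + ["warmth"] * 2
print(round(translation_entropy(choices), 3))  # ~1.485 bits
```

A word rendered identically by every translator would score 0 bits; the more evenly the choices spread over alternatives, the higher the entropy, which is why the measure correlates with translation effort in process studies.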
The dream of automatic language translation is now closer thanks to recent advances in the techniques that underpin statistical machine translation. This class-tested textbook from an active researcher in the field provides a clear and careful introduction to the latest methods and explains how to build machine translation systems for any two languages. It introduces the subject's building blocks from linguistics and probability, then covers the major models for machine translation: word-based, phrase-based, and tree-based, as well as machine translation evaluation, language modeling, discriminative training and advanced methods to integrate linguistic annotation. The book also reports the latest research, presents the major outstanding challenges, and enables novices as well as experienced researchers to make novel contributions to this exciting area. Ideal for students at undergraduate and graduate level, or for anyone interested in the latest developments in machine translation.
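One of the probabilistic building blocks the blurb mentions, the language model, assigns a probability to a target-language sentence so that a translation system can prefer fluent output. A toy bigram model with add-one (Laplace) smoothing, using an invented three-sentence corpus purely for illustration:

```python
from collections import Counter

# Tiny illustrative training corpus (not from the book).
corpus = [
    "the house is small".split(),
    "the house is big".split(),
    "the home is small".split(),
]

# Collect unigram and bigram counts, padding with sentence boundaries.
unigrams, bigrams = Counter(), Counter()
for sent in corpus:
    padded = ["<s>"] + sent + ["</s>"]
    unigrams.update(padded)
    bigrams.update(zip(padded, padded[1:]))

vocab_size = len(unigrams)

def bigram_prob(sentence):
    """P(sentence) under a bigram model with add-one smoothing."""
    padded = ["<s>"] + sentence.split() + ["</s>"]
    p = 1.0
    for w1, w2 in zip(padded, padded[1:]):
        p *= (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)
    return p

# A fluent candidate outscores a disfluent reordering of the same words.
print(bigram_prob("the house is small") > bigram_prob("small the is house"))  # True
```

Real systems use far larger n-gram or neural models, but the principle is the same: among candidate translations with similar adequacy, the language model steers decoding toward the more probable word order.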
This is the first volume that brings together research and practice from academic and industry settings and a combination of human and machine translation evaluation. Its comprehensive collection of papers, written by leading experts who situate current developments and chart future trends in human and machine translation quality and evaluation, fills a clear gap in the literature. Such work is critical to the successful integration of translation technologies in today's industry, where the lines between human and machine are becoming increasingly blurred by technology. This shift affects the whole translation landscape: students and trainers, project managers, in-house and freelance professionals, and, of course, translation scholars and researchers. The editors have broad experience in translation quality evaluation research, including qualitative and quantitative studies of professional practice. The contributors, leading experts in their respective fields, provide a unique set of complementary perspectives on human and machine translation quality and evaluation, combining theoretical and applied approaches.
What Is BCI2000?
BCI2000 is a general-purpose software platform for brain–computer interface (BCI) research. It can also be used for a wide variety of data acquisition, stimulus presentation, and brain monitoring applications. BCI2000 has been in development since 2000 in a project led by the Brain–Computer Interface R&D Program at the Wadsworth Center of the New York State Department of Health in Albany, New York, USA, with substantial contributions by the Institute of Medical Psychology and Behavioral Neurobiology at the University of Tübingen, Germany. In addition, many laboratories around the world, most notably the BrainLab at Georgia State University in Atlanta, Georgia, and Fondazione Santa Lucia in Rome, Italy, have also played an important role in the project’s development.

Mission
The mission of the BCI2000 project is to facilitate research and the development of applications in all areas that depend on real-time acquisition, processing, and feedback of biosignals.

Vision
Our vision is that BCI2000 will become a widely used software tool for diverse areas of research and development.
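The real-time acquire-process-feedback loop at the heart of such platforms can be sketched in a few lines. This is a generic illustration, not BCI2000's actual module API: the signal is synthetic, and the feature (smoothed signal power) and threshold are invented for the example.

```python
import math

def acquire(t, fs=250):
    """Stand-in for an amplifier read: one synthetic EEG-like sample
    (a 10 Hz alpha-band sine plus a slow drift). Real systems read hardware."""
    return math.sin(2 * math.pi * 10 * t / fs) + 0.1 * math.sin(2 * math.pi * 0.5 * t / fs)

def process(sample, state, alpha=0.05):
    """Toy feature extraction: exponentially smoothed signal power."""
    return (1 - alpha) * state + alpha * sample * sample

def feedback(power, threshold=0.2):
    """Map the extracted feature to a binary cursor command."""
    return "up" if power > threshold else "down"

power = 0.0
commands = []
for t in range(500):                # roughly 2 s of samples at 250 Hz
    sample = acquire(t)             # acquisition stage
    power = process(sample, power)  # processing stage
    commands.append(feedback(power))  # feedback stage

print(commands[-1])  # "up": sustained alpha power exceeds the toy threshold
```

In a real BCI the three stages typically run as separate modules connected by a shared protocol, so that amplifiers, signal-processing routines, and feedback applications can be swapped independently; that decoupling is exactly what a general-purpose platform provides.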
This volume extends and deepens our understanding of Translator Studies by charting new territory in terms of theory, methods and concepts. The focus is on literary translators, their roles, identities, and personalities. The book introduces pertinent translator-centered approaches in four sections, covering historical-biographical studies, social-scientific and process-oriented methods, and approaches that use paratexts or translations to study literary translators. Drawing on a variety of concepts, such as identity, role, self, posture, habitus, and voice, the various chapters showcase forgotten literary translators and shed new light on some well-known figures; they examine literary translators not as functioning units but as human beings in their uniqueness. Literary Translator Studies as a subdiscipline of Translation Studies demonstrates how exploring the cultural, social, psychological, and cognitive facets of translatorial subjects contributes to a holistic understanding of translation.
In the world of academia, scholars and researchers are confronted with a rapidly expanding knowledge base in artificial intelligence (AI) and nanotechnology. The integration of these two groundbreaking fields presents an intricate web of concepts, innovations, and interdisciplinary applications that can overwhelm even the most astute academic minds. Staying up to date with the latest developments and effectively navigating this complex terrain has become a pressing challenge for those striving to contribute meaningfully to these fields. Artificial Intelligence in the Age of Nanotechnology is a solution meticulously crafted to address the academic community's knowledge gaps and challenges. This comprehensive book serves as a guide for scholars, researchers, and students grappling with the dynamic synergy between AI and nanotechnology. It offers a structured and authoritative exploration of the core principles and transformative applications of these domains across diverse fields. By providing clarity and depth, it empowers academics to stay at the forefront of innovation and make informed contributions.
The Routledge Handbook of Translation and Technology provides a comprehensive and accessible overview of the dynamically evolving relationship between translation and technology. Divided into five parts, with an editor's introduction, this volume presents the perspectives of users of translation technologies, and of researchers concerned with issues arising from the increasing interdependency between translation and technology. The chapters in this Handbook tackle the advent of technologization at both a technical and a philosophical level, based on industry practice and academic research. Containing over 30 authoritative, cutting-edge chapters, this is an essential reference and resource for those studying and researching translation and technology. The volume will also be valuable for translators, computational linguists and developers of translation tools.
The Routledge Encyclopedia of Translation Technology provides a state-of-the-art survey of the field of computer-assisted translation. It is the first definitive reference to provide a comprehensive overview of the general, regional and topical aspects of this increasingly significant area of study. The Encyclopedia is divided into three parts: Part One presents general issues in translation technology, such as its history and development, translator training and various aspects of machine translation, including a valuable case study of its teaching at a major university; Part Two discusses national and regional developments in translation technology, offering contributions covering the crucial territories of China, Canada, France, Hong Kong, Japan, South Africa, Taiwan, the Netherlands and Belgium, the United Kingdom and the United States; Part Three evaluates specific matters in translation technology, with entries focused on subjects such as alignment, bitext, computational lexicography, corpus, editing, online translation, subtitling, and translation management systems. The Routledge Encyclopedia of Translation Technology draws on the expertise of over fifty contributors from around the world and an international panel of consultant editors to provide a selection of articles on the most pertinent topics in the discipline. All the articles are self-contained, extensively cross-referenced, and include useful and up-to-date references and information for further reading. It will be an invaluable reference work for anyone with a professional or academic interest in the subject.