
Traditionally, computation - the rule-driven manipulation of symbols - rather than (lexical) storage has been the main focus of research on the language faculty. There is, however, increasing evidence that storage plays a prominent role: constructions that could be computed are not necessarily always computed. In this volume, the relative roles of computation and storage are discussed, both theoretically and on the basis of linguistic, psycholinguistic, and brain-imaging evidence, with respect to a wide range of language phenomena, such as morphological processing, syntactic processing, limitations of parsing mechanisms, neural substrates of short-term storage versus computation, and the processing of discourse. Each chapter has been written by one or more outstanding experts in the field. The contributions are thorough but free from unnecessary technical detail, so that the volume is accessible to experienced readers as well as students in linguistics, psychology, and other cognitive sciences.
Every now and again I receive a lengthy manuscript from a kind of theoretician known to psychiatrists as the "triangle people" - kooks who have independently discovered that everything in the universe comes in threes (solid, liquid, gas; protons, neutrons, electrons; the Father, the Son, the Holy Ghost; Moe, Larry, Curly; and so on). At the risk of sounding like a triangle person, let me explain why I think that the topic of this volume - storage and computation in the language faculty - though having just two sides rather than three, is the key to understanding every interesting issue in the study of language. I will begin with the fundamental scientific problem in linguistics: explaining the vast expressive power of language. What is the trick behind our ability to fill each others' heads with so many different ideas? I submit there is not one trick but two, and they have been emphasized by different thinkers throughout the history of linguistics.
A proposal for a formal model, Fragment Grammars, that treats productivity and reuse as the target of inference in a probabilistic framework. Language allows us to express and comprehend an unbounded number of thoughts. This fundamental and much-celebrated property is made possible by a division of labor between a large inventory of stored items (e.g., affixes, words, idioms) and a computational system that productively combines these stored units on the fly to create a potentially unlimited array of new expressions. A language learner must discover a language's productive, reusable units and determine which computational processes can give rise to new expressions. But how does the learner differentiate between the reusable, generalizable units (for example, the affix -ness, as in coolness, orderliness, cheapness) and apparent units that do not actually generalize in practice (for example, -th, as in warmth but not coolth)? In this book, Timothy O'Donnell proposes a formal computational model, Fragment Grammars, to answer these questions. This model treats productivity and reuse as the target of inference in a probabilistic framework, asking how an optimal agent can make use of the distribution of forms in the linguistic input to learn the distribution of productive word-formation processes and reusable units in a given language. O'Donnell compares this model to a number of other theoretical and mathematical models, applying them to the English past tense and English derivational morphology, and showing that Fragment Grammars unifies a number of superficially distinct empirical phenomena in these domains and justifies certain seemingly ad hoc assumptions in earlier theories.
How does human language work? How do we put ideas into words that others can understand? Can linguistics shed light on the way the brain operates? Foundations of Language puts linguistics back at the centre of the search to understand human consciousness. Ray Jackendoff begins by surveying the developments in linguistics over the years since Noam Chomsky's Aspects of the Theory of Syntax. He goes on to propose a radical re-conception of how the brain processes language. This opens up vivid new perspectives on every major aspect of language and communication, including grammar, vocabulary, learning, the origins of human language, and how language relates to the real world. Foundations of Language makes important connections with other disciplines which have been isolated from linguistics for many years. It sets a new agenda for close cooperation between the study of language, mind, the brain, behaviour, and evolution.
This book represents the state of the art on rightward movement in one thematically coherent volume. It documents the growing importance of the combination of empirical and theoretical work in linguistic analysis. Several contributions argue that rightward movement is a means of reducing phonological or structural complexity. The inclusion of corpus data and psycholinguistic results confirms the Right Roof Constraint as a characteristic property of extraposition and argues for a reduced role of subsentential bounding nodes. The contributions also show that the phenomenon cannot be looked at from one module of grammar alone, but calls for an interaction of syntax, semantics, phonology, and discourse. The discussion of different languages such as English, German, Dutch, Italian, Italian Sign Language, Modern Greek, Uyghur, and Khalkha enhances our understanding of the complexity of the phenomenon. Finally, the analytic options of different frameworks are explored. The volume is of interest to students and researchers of syntax, semantics, psycholinguistics, and corpus linguistics.
How do infants and young children coordinate information in real time to arrive at sentence meaning from the words and structure of the sentence and from the nonlinguistic context? This volume introduces readers to an emerging field of research, experimental developmental psycholinguistics, and to the four predominant methodologies used to study on-line language processing in children. Authored by key figures in psycholinguistics, neuroscience and developmental psychology, the chapters cover event-related brain potentials, free-viewing eyetracking, looking-while-listening, and reaction-time techniques, also providing a historical backdrop for this line of research. Multiple aspects of experimental design, data collection and data analysis are addressed in detail, alongside surveys of recent important findings about how infants and children process sounds, words, and sentences. Indispensable for students and researchers working in the areas of language acquisition, developmental psychology and developmental neuroscience of language, this volume will also appeal to speech language pathologists and early childhood educators.
This volume offers a major reconceptualization of linguistic theory through the lens of morphology, crucially collapsing the distinction between the lexicon and the grammar. This approach accounts for both productive and non-productive morphological phenomena, and moreover integrates linguistic theory into psycholinguistics and human cognition.
Speakers and learners, based on memory and experience, implicitly know that certain language elements naturally pair together. However, they also understand, through abstract and frequency-independent categories, why some combinations are possible and others are not. The frequency-grammar interface (FGI) bridges these two types of information in human cognition. Due to this interface, the sediment of statistical calculations over the order, distribution, and associations of items (the regularities) and the computation over the abstract principles that allow these items to join together (the rules) are brought together in a speaker’s competence, feeding into one another and eventually becoming superposed. In this volume, it is argued that a specific subset of both first and second language grammar (termed ‘combinatorial grammar’) is both innate and learned. While not derived from language usage, combinatorial grammar is continuously recalibrated by usage throughout a speaker’s life. In the domain of combinatorial grammar, both generative and usage-based theories are correct, each shedding light on just one component of the two that are necessary for any language to function: rules and regularities.
The Handbook of Phonological Theory, second edition, offers an innovative and detailed examination of recent developments in phonology, and the implications of these within linguistic theory and related disciplines. Revised from the ground up, the second edition is comprised almost entirely of newly written and previously unpublished chapters. It addresses the important questions in the field, including learnability, phonological interfaces, tone, and variation, and assesses the findings and accomplishments in these domains. It brings together a renowned and international contributor team, and offers new and unique reflections on the advances in phonological theory since the publication of the first edition in 1995. Along with the first edition, still in publication, it forms the most complete and current overview of the subject in print.