Headedness and/or Grammatical Anarchy

In most grammatical models, hierarchical structuring and dependencies are considered central features of grammatical structures, an idea usually captured by the notion of “head” or “headedness”. While most models take this notion more or less for granted, there is still much disagreement as to the precise properties of grammatical heads and the theoretical implications that arise from these properties. Moreover, quite a few linguistic structures pose considerable challenges to the notion of “headedness”. Linking to the seminal discussions in Zwicky (1985) and Corbett, Fraser, & McGlashan (1993), this volume looks more closely at phenomena that are considered problematic for an analysis in terms of grammatical heads. The aim of the book is to approach the concept of “headedness” from its margins. Central questions of the volume thus relate to the nature of heads and the distinction between headed and non-headed structures, to the process of gaining and losing head status, and to the thought-provoking question whether grammar theory could do without heads at all. The contributions provide new empirical findings bearing on phenomena that challenge the conception of grammatical heads and/or discuss the notion of head/headedness and its consequences for grammatical theory in a more abstract way. The collected papers view the topic from diverse theoretical perspectives (among others HPSG, Generative Syntax, Optimality Theory) and from different empirical angles, covering typological and corpus-linguistic accounts, with a focus on data from German.
Head-Driven Phrase Structure Grammar (HPSG) is a constraint-based or declarative approach to linguistic knowledge, which analyses all descriptive levels (phonology, morphology, syntax, semantics, pragmatics) with feature-value pairs, structure sharing, and relational constraints. In syntax it assumes that expressions have a single, relatively simple constituent structure. This volume provides a state-of-the-art introduction to the framework. Various chapters discuss basic assumptions and formal foundations, describe the evolution of the framework, and go into the details of the main syntactic phenomena. Further chapters are devoted to non-syntactic levels of description. The book also considers related fields and research areas (gesture, sign languages, computational linguistics) and includes chapters comparing HPSG with other frameworks (Lexical Functional Grammar, Categorial Grammar, Construction Grammar, Dependency Grammar, and Minimalism).
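The core mechanism behind feature-value pairs and structure sharing is unification: two partial descriptions are merged into one structure that satisfies both, and analysis fails if they clash. The toy sketch below illustrates this idea only; real HPSG feature structures are typed and use reentrancy tags for structure sharing, which plain nested dictionaries (my own simplification here) do not capture.

```python
def unify(fs1, fs2):
    """Unify two feature structures (nested dicts of feature-value
    pairs); return the merged structure, or None on a feature clash."""
    # Atomic values unify only if they are identical.
    if not isinstance(fs1, dict) or not isinstance(fs2, dict):
        return fs1 if fs1 == fs2 else None
    result = dict(fs1)
    for attr, val in fs2.items():
        if attr in result:
            sub = unify(result[attr], val)
            if sub is None:
                return None  # clash somewhere below this attribute
            result[attr] = sub
        else:
            result[attr] = val  # new information is simply added
    return result

# A noun's agreement constraints unify with a determiner's:
noun = {"HEAD": {"AGR": {"NUM": "sg", "PER": "3"}}}
det = {"HEAD": {"AGR": {"NUM": "sg"}}}
print(unify(noun, det))   # merged: NUM sg, PER 3
print(unify(noun, {"HEAD": {"AGR": {"NUM": "pl"}}}))  # None: sg/pl clash
```

The attribute names (HEAD, AGR, NUM, PER) are chosen for illustration; the point is only that compatible descriptions merge monotonically while incompatible ones fail.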
This book introduces formal grammar theories that play a role in current linguistic theorizing (Phrase Structure Grammar, Transformational Grammar/Government & Binding, Generalized Phrase Structure Grammar, Lexical Functional Grammar, Categorial Grammar, Head-Driven Phrase Structure Grammar, Construction Grammar, Tree Adjoining Grammar). The key assumptions of each theory are explained, and it is shown how the respective theory treats arguments and adjuncts, the active/passive alternation, local reorderings, verb placement, and fronting of constituents over long distances. The analyses are explained with German as the object language. The second part of the book compares these approaches with respect to their predictions regarding language acquisition and psycholinguistic plausibility. The nativism hypothesis, which assumes that humans possess genetically determined, innate, language-specific knowledge, is critically examined, and alternative models of language acquisition are discussed. The second part then addresses controversial issues of current theory building, such as the question whether flat or binary-branching structures are more appropriate, whether constructions should be treated on the phrasal or the lexical level, and whether abstract, non-visible entities should play a role in syntactic analyses. It is shown that the analyses suggested in the respective frameworks are often translatable into one another. The book closes with a chapter showing how properties common to all languages, or to certain classes of languages, can be captured.
This book is an introduction to the syntactic structures that can be found in the Germanic languages. The analyses are couched in the framework of HPSG light, a simplified version of HPSG that uses trees to depict analyses rather than complicated attribute-value matrices. The book is written for students with basic knowledge of case, constituent tests, and simple phrase structure grammars (advanced BA or MA level) and for researchers with an interest in the Germanic languages and/or in Head-Driven Phrase Structure Grammar/Sign-Based Construction Grammar who do not have the time to deal with all the details of these theories.
Lexical Functional Grammar (LFG) is a nontransformational theory of linguistic structure, first developed in the 1970s by Joan Bresnan and Ronald M. Kaplan, which assumes that language is best described and modeled by parallel structures representing different facets of linguistic organization and information, related by means of functional correspondences. This volume has seven parts. Part I, Overview and Introduction, provides an introduction to core syntactic concepts and representations. Part II, Grammatical Phenomena, reviews LFG work on a range of grammatical phenomena and constructions. Part III, Grammatical Modules and Interfaces, provides an overview of LFG work on semantics, argument structure, prosody, information structure, and morphology. Part IV, Linguistic Disciplines, reviews LFG work in the disciplines of historical linguistics, learnability, psycholinguistics, and second language learning. Part V, Formal and Computational Issues and Applications, provides an overview of computational and formal properties of the theory, implementations, and computational work on parsing, translation, grammar induction, and treebanks. Part VI, Language Families and Regions, reviews LFG work on languages spoken in particular geographical areas or belonging to particular language families. The final part, Comparing LFG with Other Linguistic Theories, discusses LFG work in relation to other theoretical approaches.
Dan Everett is a renowned linguist with an unparalleled breadth of contributions, ranging from fieldwork to linguistic theory, including phonology, morphology, syntax, semantics, sociolinguistics, psycholinguistics, historical linguistics, philosophy of language, and philosophy of linguistics. Born on the U.S.–Mexico border, Daniel Everett faced much adversity growing up and was sent as a missionary to convert the Pirahã in the Amazonian jungle, a group of people who speak a language that no outsider had been able to become proficient in. Although no Pirahã person was successfully converted, Everett successfully learned and studied Pirahã, as well as multiple other languages of the Americas. Ever steadfast in pursuing data-driven language science, Everett debunked generativist claims about syntactic recursion, for which he was repeatedly attacked. In addition to conducting fieldwork on many understudied languages and revolutionizing linguistics, Everett has published multiple works for the general public: "Don’t Sleep, There Are Snakes", "Language: The Cultural Tool", and "How Language Began". This book is a collection of 15 articles related to Everett’s work over the years, released after a tribute event for Dan Everett held at MIT on June 8th, 2023.
This book examines extractions out of the subject, which is traditionally considered an island for extraction. There is a debate among linguists as to whether the “subject island constraint” is a syntactic phenomenon or an illusion caused by cognitive or pragmatic factors. The book focusses on French, which provides an interesting case study because it allows certain extractions out of the subject despite not being a typical null-subject language. The book takes a discourse-based approach and introduces the “Focus-Background Conflict” constraint, which posits that a focused element cannot be part of a backgrounded constituent due to a pragmatic contradiction. The major novelty of this proposal is that it predicts a distinction between extractions out of the subject in focalizing and non-focalizing constructions. The central contribution of the book is the detailed results of a series of empirical studies (corpus studies and experiments) on extractions out of the subject in French. These studies offer evidence for the possibility of extraction out of the subject in French, but they also reveal a clear distinction between constructions: while extractions out of the subject are common and highly acceptable in relative clauses, this is not the case for interrogatives and clefts. Finally, the book proposes a Head-Driven Phrase Structure Grammar (HPSG) analysis of subject islands. It demonstrates the interaction between information structure and syntax using a representation of information structure based on Minimal Recursion Semantics (MRS).
Gisbert Fanselow’s work has been invaluable and inspiring to many researchers working on syntax, morphology, and information structure, both from a theoretical and from an experimental perspective. This volume comprises a collection of articles dedicated to Gisbert on the occasion of his 60th birthday, covering a range of topics from these areas and beyond. The contributions have in common that, in a broad sense, they have to do with language structures (and thus trees), and that, in a more specific sense, they have to do with birds. They thus cover two of Gisbert’s major interests in- and outside of the linguistic world (and perhaps even at the interface).
It is well-known that derivational affixes can be highly polysemous, producing a range of different, often related, meanings. For example, English deverbal nouns with the suffix -er can denote instruments (opener), agents (writer), locations (diner), or patients (loaner). It is commonly assumed that this polysemy arises through a compositional process in which the affix interacts with the semantics of the base. Yet, despite intensive research in recent years, a workable model for this interaction is still under debate. In order to study and model the semantic contributions of the base and of the affix, a framework is needed in which meanings can be composed and decomposed. In this book, I formalize the semantic input and output of derivation by means of frames, that is, recursive attribute-value structures that serve to model mental representations of concepts. In my approach, the input frame offers an array of semantic elements from which an affix may select to construct the derivative's meaning. The relationship between base and derivative is made explicit by integrating their respective frame-semantic representations into lexical rules and inheritance hierarchies. I apply this approach to a qualitative corpus study of the productive relationship between the English nominalizing suffix -ment and a semantically delimited set of verbal bases. My data set consists of 40 neologisms with base verbs from two semantic classes, namely change-of-state verbs and verbs of psychological state. I analyze 369 attestations which were elicited from various corpora with a purposeful sampling approach, and which were hand-coded using common semantic categories such as event, state, patient and stimulus. My results show that -ment can target a systematically restricted set of elements in the frame of a given base verb. It thereby produces a range of possible readings in each derivative, which becomes ultimately interpretable only within a specific context. 
The derivational process is governed by an interaction of the semantic elements provided by the base on the one hand with properties of the affix (e.g. -ment's aversion to [+animate] readings) on the other. For instance, a shift from the verb annoy to a result-state reading in annoyment is possible because the input frame of verbs of psychological state offers a RESULT-STATE attribute, which, as is fixed in the inheritance hierarchy, is compatible with -ment. Meanwhile, a shift from annoy to an experiencer reading in annoyment fails because the value range of the attribute EXPERIENCER is fixed to [+animate] entities, so that -ment's animacy constraint blocks the inheritance mechanism. Furthermore, a quantitative exploration of my data set reveals a likely blocking effect for some -ment readings. Thus, while I have found most expected combinations of nominalization and reading attested, there are pronounced gaps for readings like instrument or stimulus. Such readings are likely to be produced by standardly subject-denoting suffixes such as -er or -ant, which may reduce the probability of -ment derivation. The quantitative analysis furthermore shows that, within the subset of attested combinations, ambiguity is widespread, with 43% of all combinations of nominalization and reading attested only ambiguously. This book shows how a derivational process acts on the semantics of a given verbal base by reporting on an in-depth qualitative study of the semantic contributions of both the base and the affix. Furthermore, it demonstrates that an explicit semantic decomposition of the base is essential for the analysis of the resulting derivative's semantics.
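The selection-and-blocking mechanism described above can be sketched in miniature. This is a hypothetical illustration, not the book's frame formalism: the frame for annoy, its attribute labels, and the animacy flags are my own simplified stand-ins for the recursive attribute-value structures and inheritance hierarchies used there.

```python
# Toy frame for a psych verb like "annoy": each attribute maps to
# (illustrative value, is_animate). The [+animate] flag on EXPERIENCER
# mirrors the fixed value range described in the text.
ANNOY_FRAME = {
    "EVENT": ("annoying-event", False),
    "RESULT-STATE": ("state-of-being-annoyed", False),
    "EXPERIENCER": ("annoyed-person", True),
    "STIMULUS": ("annoying-thing", False),
}

def ment_readings(frame):
    """Return the frame attributes -ment can target, assuming its
    animacy constraint blocks any [+animate] attribute."""
    return [attr for attr, (_, animate) in frame.items() if not animate]

print(ment_readings(ANNOY_FRAME))
# The experiencer reading is blocked; event, result-state, and
# stimulus survive as candidate readings for "annoyment".
```

Note that this captures only the animacy constraint; the separate blocking effect by subject-denoting suffixes such as -er or -ant (which thins out instrument and stimulus readings in the actual data) is not modeled here.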