
For the most part, the papers collected in this volume stem from presentations given at a conference held in Tucson over the weekend of May 31 through June 2, 1985. We wish to record our gratitude to the participants in that conference, as well as to the National Science Foundation (Grant No. BNS-8418916) and the University of Arizona SBS Research Institute for their financial support. The advice we received from Susan Steele on organizational matters proved invaluable and had many felicitous consequences for the success of the conference. We also would like to thank the staff of the Departments of Linguistics of the University of Arizona and the University of Massachusetts at Amherst for their help, as well as a number of individuals, including Lin Hall, Kathy Todd, Jiazhen Hu, Sandra Fulmer, Maria Sandoval, Natsuko Tsujimura, Stuart Davis, Mark Lewis, Robin Schafer, Shi Zhang, Olivia Oehrle-Steele, and Paul Saka. Finally, we would like to express our gratitude to Martin Scrivener, our editor, for his patience and his encouragement. INTRODUCTION The term 'categorial grammar' was introduced by Bar-Hillel (1964, page 99) as a handy way of grouping together some of his own earlier work (1953) and the work of the Polish logicians and philosophers Leśniewski (1929) and Ajdukiewicz (1935), in contrast to approaches to linguistic analysis based on phrase structure grammars.
This book is devoted to the mathematical foundations of categorial grammar, including type-theoretic foundations of mathematics, grammatical categories, and other topics related to categorial grammar, as well as to philosophical and linguistic applications of this framework. The volume consists of three parts. The first, introductory part contains the editor's addresses and two survey chapters concerning the history (W. Marciszewski) and current trends of the discipline (J. van Benthem). The second part consists of 10 chapters devoted to categorial grammar proper, and the third part of 7 chapters devoted to areas close to categorial grammar. Most of the contributions are original papers, but five of them are reprints of classics (M.J. Cresswell, P.T. Geach, H. Hiz, J. Lambek, T. Potts).
Ever since Chomsky laid the framework for a mathematically formal theory of syntax, two classes of formal models have held wide appeal. The finite state model offered simplicity. At the opposite extreme, numerous very powerful models, most notably transformational grammar, offered generality. As soon as this mathematical framework was laid, devastating arguments were given by Chomsky and others indicating that the finite state model was woefully inadequate for the syntax of natural language. In response, the completely general transformational grammar model was advanced as a suitable vehicle for capturing the description of natural language syntax. While transformational grammar seems likely to be adequate to the task, many researchers have advanced the argument that it is "too adequate." A now classic result of Peters and Ritchie shows that the model of transformational grammar given in Chomsky's Aspects [1] is powerful indeed: so powerful as to allow it to describe any recursively enumerable set. In other words, it can describe the syntax of any language that is describable by any algorithmic process whatsoever. This situation led many researchers to reassess the claim that natural languages are included in the class of transformational grammar languages. The conclusion that many reached is that the claim is void of content, since, in their view, it says little more than that natural language syntax is doable algorithmically and, in the framework of modern linguistics, psychology or neuroscience, that is axiomatic.
This book covers topics in formal linguistics, intonational phonology, computational linguistics, and experimental psycholinguistics, presenting them as an integrated theory of the language faculty in a form accessible to readers from any of those fields. In this book Mark Steedman argues that the surface syntax of natural languages maps spoken and written forms directly to a compositional semantic representation that includes predicate-argument structure, quantification, and information structure, without constructing any intervening structural representation. His purpose is to construct a principled theory of natural grammar that is directly compatible with both explanatory linguistic accounts of a number of problematic syntactic phenomena and a straightforward computational account of the way sentences are mapped onto representations of meaning. The radical nature of Steedman's proposal stems from his claim that much of the apparent complexity of syntax, prosody, and processing follows from the lexical specification of the grammar and from the involvement of a small number of universal rule types for combining predicates and arguments. These syntactic operations are related to the combinators of Combinatory Logic, engendering a much freer definition of derivational constituency than is traditionally assumed. This property allows Combinatory Categorial Grammar to capture elegantly the structure and interpretation of coordination and intonation contour in English, as well as some well-known interactions between word order, coordination, and relativization across a number of other languages. It also allows more direct compatibility with incremental semantic interpretation during parsing.
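To make the idea of combinatory rule types slightly more concrete, here is a minimal, hypothetical sketch (in Python, not Steedman's own notation or code) of forward and backward application, forward composition, and forward type-raising. The category names NP and S, the lexical category assigned to 'likes', and all function names are illustrative assumptions; the point is only that composition together with type-raising licenses a non-standard constituent such as "John likes", of the sort CCG exploits in its account of coordination.

```python
# A toy illustration (hypothetical; not Steedman's formalism or code) of a few
# CCG rule types and of the "non-standard" constituents they license.

from dataclasses import dataclass
from typing import Union

Cat = Union[str, "Complex"]      # a category is atomic (e.g. "NP") or complex

@dataclass(frozen=True)
class Complex:
    result: Cat
    slash: str                   # '/' seeks its argument to the right, '\\' to the left
    arg: Cat
    def __str__(self) -> str:
        return f"({self.result}{self.slash}{self.arg})"

def fapp(x: Cat, y: Cat) -> Cat:
    """Forward application:  X/Y  Y  =>  X"""
    assert isinstance(x, Complex) and x.slash == '/' and x.arg == y
    return x.result

def bapp(y: Cat, x: Cat) -> Cat:
    """Backward application:  Y  X\\Y  =>  X"""
    assert isinstance(x, Complex) and x.slash == '\\' and x.arg == y
    return x.result

def fcomp(x: Cat, y: Cat) -> Cat:
    """Forward composition:  X/Y  Y/Z  =>  X/Z  (related to the B combinator)"""
    assert isinstance(x, Complex) and isinstance(y, Complex)
    assert x.slash == '/' and y.slash == '/' and x.arg == y.result
    return Complex(x.result, '/', y.arg)

def traise(x: Cat, t: Cat) -> Cat:
    """Forward type-raising:  X  =>  T/(T\\X)  (related to the T combinator)"""
    return Complex(t, '/', Complex(t, '\\', x))

NP, S = "NP", "S"
likes = Complex(Complex(S, '\\', NP), '/', NP)   # transitive verb: (S\NP)/NP

# Canonical derivation of "John likes Mary":
vp = fapp(likes, NP)                 # likes + Mary   =>  S\NP
print(bapp(NP, vp))                  # John + [S\NP]  =>  S

# Non-standard constituent "John likes", of the kind CCG uses for coordination:
john_raised = traise(NP, S)          # NP  =>  S/(S\NP)
print(fcomp(john_raised, likes))     # "John likes"  =>  (S/NP)
```

In such a system the string "John likes" receives a category of its own, which is what allows coordinations like "John likes and Mary dislikes beans" to be derived without any movement or deletion machinery.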
presupposition fails, we now give a short introduction to Unification Grammar. Since all implementations discussed in this volume use PROLOG (with the exception of Block/Haugeneder), we felt that it would also be useful to explain the difference between unification in PROLOG and in UG. After the introduction to UG we briefly summarize the main arguments for using linguistic theories in natural language processing. We conclude with a short summary of the contributions to this volume. UNIFICATION GRAMMAR: Feature Structures or Complex Categories. Unification Grammar was developed by Martin Kay (Kay 1979), who wanted to give a precise definition (and implementation) of the notion of 'feature'. Linguists use features at nearly all levels of linguistic description. In phonetics, for instance, the phoneme b is usually described with the features 'bilabial', 'voiced' and 'nasal'. In the case of b the first two features get the value +, the third (nasal) gets the value -. Feature-value pairs in phonology are normally represented as a matrix: [bilabial: +, voiced: +, nasal: -] (the feature matrix for b). In syntax, features are used, for example, to distinguish different noun classes. The Latin noun 'murus' would be characterized by the following feature-value pairs: gender: masculine, number: singular, case: nominative, pred: murus. Besides a matrix representation one frequently finds a graph representation for feature-value pairs. The edges of the graph are labelled by features, and the leaves denote the values of the features.
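As an informal illustration of what unification over such feature structures does, here is a minimal sketch under simplifying assumptions (feature structures as plain nested dictionaries, no reentrancy or structure sharing; this is not code from the volume). The function merges two structures feature by feature and fails on conflicting atomic values. Unlike PROLOG's term unification, which matches fixed-arity terms by argument position, this matching is by feature name, and either structure may freely mention features the other lacks.

```python
# A minimal sketch of unification over feature structures represented as
# nested attribute-value dictionaries (simplified: no structure sharing).

def unify(fs1, fs2):
    """Return the unification of two feature structures, or None on a clash."""
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        result = dict(fs1)
        for feat, val in fs2.items():
            if feat in result:
                merged = unify(result[feat], val)
                if merged is None:
                    return None           # conflicting values: unification fails
                result[feat] = merged
            else:
                result[feat] = val        # feature only in fs2: just add it
        return result
    return fs1 if fs1 == fs2 else None    # atomic values must be identical

# The feature matrix for the phoneme b:
b = {"bilabial": "+", "voiced": "+", "nasal": "-"}

# The Latin noun 'murus':
murus = {"gender": "masculine", "number": "singular",
         "case": "nominative", "pred": "murus"}

# A nominative-singular requirement unifies with 'murus' ...
print(unify(murus, {"case": "nominative", "number": "singular"}))
# ... but an accusative requirement clashes on 'case' and returns None:
print(unify(murus, {"case": "accusative"}))
```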
Even though the range of phenomena syntactic theories intend to account for is basically the same, the large number of current approaches to syntax shows how differently these phenomena can be interpreted, described, and explained. The goal of the volume is to probe into the question of how exactly these frameworks differ and what, if anything, they have in common. Descriptions of a sample of current approaches to syntax are presented by their major practitioners (Part I), followed by their metatheoretical underpinnings (Part II). Given that the goal is to facilitate a systematic comparison among the approaches, a checklist of issues was given to the contributors to address. The main headings are Data, Goals, Descriptive Tools, and Criteria for Evaluation. The chapters are structured uniformly, allowing an item-by-item survey across the frameworks. The introduction lays out the parameters along which syntactic frameworks must be the same and how they may differ, and a final paper draws some conclusions about similarities and differences. The volume is of interest to descriptive linguists, theoreticians of grammar, philosophers of science, and students of the cognitive science of science.
The book examines to what extent the mediating relation between constituents and their semantics can arise from combinatory knowledge of words. It traces the roots of Combinatory Categorial Grammar, and uses the theory to promote a Humean question in linguistics and cognitive science: Why do we see limited constituency and dependency in natural languages, despite their diversity and potential infinity? A potential answer is that constituents and dependencies might have arisen from a single resource: adjacency. The combinatory formulation of adjacency constrains possible grammars.
This book sets out the foundations, methodology, and practice of a formal framework for the description of language. The approach embraces the trends of lexicalism and compositional semantics in computational linguistics, and in theoretical linguistics more broadly, by developing categorial grammar into a powerful and extendable logic of signs. Taking Montague Grammar as its point of departure, the book explains how the integration of methods from philosophy (logical semantics), computer science (type theory), linguistics (categorial grammar), and meta-mathematics (mathematical logic) provides a categorial foundation with coverage including intensionality, quantification, featural polymorphism, domains, and constraints. For the first time, the book systematises categorial thinking into a unified program which is at once logically secured and a practical tool for pure lexical grammar development with type-theoretic semantics. It should be of interest to all those active in computational linguistics and formal grammar, and is suitable for use at advanced undergraduate, postgraduate, and research levels.