Unbounded Dependencies and Subjacency in a Tree Adjoining Grammar

1987 ◽  
pp. 143 ◽  
Author(s):  
Anthony S. Kroch


2017 ◽  
Author(s):  
Antonin Delpeuch ◽  
Anne Preller

We define an algorithm translating natural language sentences to the formal syntax of RDF, an existential conjunctive logic widely used on the Semantic Web. Our translation is based on pregroup grammars, an efficient type-logical grammatical framework with a transparent syntax-semantics interface. To that end, we introduce a restricted notion of side effects in the semantic category of finitely generated free semimodules over {0,1}. The translation gives an intensional counterpart to previous extensional models. We establish a one-to-one correspondence between extensional models and RDF models such that satisfaction is preserved. Our translation encompasses the expressivity of the target language and supports complex linguistic constructions like relative clauses and unbounded dependencies.
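The RDF target described in this abstract can be illustrated with a small sketch: a simple subject-verb-object sentence maps to a conjunction of triples in which blank nodes play the role of existentially quantified variables (RDF's "existential conjunctive" semantics). This is a hand-coded illustration of the output form only, not the paper's pregroup-based algorithm, and all identifiers and function names below are hypothetical.

```python
# Hypothetical sketch of the RDF output form for a sentence like
# "a chef burns a pizza". Blank nodes (prefixed "_:") act as
# existential variables; the sentence denotes the conjunction of
# the three triples. Not the paper's pregroup-grammar algorithm.

from typing import List, Tuple

Triple = Tuple[str, str, str]  # (subject, predicate, object)

def translate_svo(subject_noun: str, verb: str, object_noun: str) -> List[Triple]:
    """Render a subject-verb-object sentence as RDF-style triples.

    Each noun introduces a fresh blank node typed by the noun's class;
    the verb relates the two blank nodes.
    """
    s_node, o_node = "_:x0", "_:x1"
    return [
        (s_node, "rdf:type", f":{subject_noun}"),
        (o_node, "rdf:type", f":{object_noun}"),
        (s_node, f":{verb}", o_node),
    ]

# "a chef burns a pizza" -> exists x0, x1: Chef(x0) & Pizza(x1) & burn(x0, x1)
for triple in translate_svo("Chef", "burn", "Pizza"):
    print(triple)
```

Relative clauses and unbounded dependencies, the constructions the paper highlights, would reuse an already-introduced blank node rather than minting a fresh one, which is where the side-effect bookkeeping in the semantic category comes in.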


2021 ◽  
Vol 118 (41) ◽  
pp. e2026469118
Author(s):  
Laurel Perkins ◽  
Jeffrey Lidz

The human ability to produce and understand an indefinite number of sentences is driven by syntax, a cognitive system that can combine a finite number of primitive linguistic elements to build arbitrarily complex expressions. The expressive power of syntax comes in part from its ability to encode potentially unbounded dependencies over abstract structural configurations. How does such a system develop in human minds? We show that 18-mo-old infants are capable of representing abstract nonlocal dependencies, suggesting that a core property of syntax emerges early in development. Our test case is English wh-questions, in which a fronted wh-phrase can act as the argument of a verb at a distance (e.g., What did the chef burn?). Whereas prior work has focused on infants’ interpretations of these questions, we introduce a test to probe their underlying syntactic representations, independent of meaning. We ask when infants know that an object wh-phrase and a local object of a verb cannot co-occur because they both express the same argument relation (e.g., *What did the chef burn the pizza?). We find that 1) 18-mo-olds demonstrate awareness of this complementary distribution pattern and thus represent the nonlocal grammatical dependency between the wh-phrase and the verb, but 2) younger infants do not. These results suggest that the second year of life is a period of active syntactic development, during which the computational capacities for representing nonlocal syntactic dependencies become evident.


Author(s):  
Rui P. Chaves ◽  
Michael T. Putnam

This book is about one of the most intriguing features of human communication systems: the fact that words which go together in meaning can occur arbitrarily far away from each other. The kind of long-distance dependency that this volume is concerned with has been the subject of intense linguistic and psycholinguistic research for the last half century, and offers a unique insight into the nature of grammatical structures and their interaction with cognition. The constructions in which these unbounded dependencies arise are difficult to model and come with a rather puzzling array of constraints which have defied characterization and proper explanation. For example, there are filler-gap dependencies in which the filler phrase is a plural phrase formed from the combination of each of the extracted phrases, and there are filler-gap constructions in which the filler phrase itself contains a gap that is linked to another filler phrase. What is more, different types of filler-gap dependency can compound in the same sentence. Conversely, not all kinds of filler-gap dependencies are equally licit; some are robustly ruled out by the grammar, whereas others have a less clear status because they have graded acceptability and can be made to improve in ideal contexts and conditions. This work provides a detailed survey of these linguistic phenomena and extant accounts, while also incorporating new experimental evidence to shed light on why the phenomena are the way they are and what important research on this topic lies ahead.


2020 ◽  
Vol 6 (1) ◽  
pp. 111-130
Author(s):  
Coppe van Urk

Every major theoretical approach to syntactic structure incorporates a mechanism for generating unbounded dependencies. In this article, I distinguish between some of the most commonly entertained mechanisms by looking in detail at one of the most fundamental discoveries about long-distance dependencies, the fact that they are successive cyclic. Most of the mechanisms posited in order to generate long-distance dependencies capture this property, but make different predictions about what reflexes of successive cyclicity should be attested across languages. In particular, theories of long-distance dependencies can be distinguished according to whether they propose intermediate occurrences of the moving phrases (movement theories) or whether intermediate heads carry features relevant to displacement (featural theories). I show that a full consideration of the typology of successive cyclicity provides clear evidence that both components are part of the syntax of long-distance dependencies. In addition, reflexes of successive cyclicity are equally distributed across the CP and vP edge, suggesting that these are parallel domains.

