Optimizing over subsequences generates context-sensitive languages

2021 ◽  
Vol 9 ◽  
pp. 528-537
Author(s):  
Andrew Lamont

Abstract: Phonological generalizations are finite-state. While Optimality Theory is a popular framework for modeling phonology, it is known to generate non-finite-state mappings and languages. This paper demonstrates that Optimality Theory is capable of generating non-context-free languages, contributing to the characterization of its generative capacity. This is achieved with minimal modification to the theory as it is standardly employed.
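As an illustrative aside (not the paper's construction), the core evaluation step of Optimality Theory can be sketched as lexicographic minimization over ranked constraints; the constraint names and candidates below are hypothetical toy examples.

```python
# Illustrative sketch of standard Optimality Theory evaluation: the winning
# candidate minimizes its violation vector under a lexicographic comparison
# of ranked constraints (highest-ranked first).

def evaluate(candidates, constraints):
    """Return the candidate whose violation vector is lexicographically
    minimal under the given constraint ranking."""
    return min(candidates, key=lambda c: tuple(con(c) for con in constraints))

# Hypothetical toy competition in the style of final devoicing:
# *VoicedCoda penalizes a voiced final obstruent; Ident(voice) penalizes
# deviating from the underlying form.
def voiced_coda(cand):
    return 1 if cand.endswith(("b", "d", "g")) else 0

def ident_voice(cand, underlying="bad"):
    return sum(1 for u, c in zip(underlying, cand) if u != c)

winner = evaluate(["bad", "bat"], [voiced_coda, ident_voice])
print(winner)  # "bat": satisfies the higher-ranked *VoicedCoda
```

Reranking the constraints (Ident(voice) above *VoicedCoda) would instead select the faithful candidate, which is the usual way OT models cross-linguistic variation.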

1996 ◽  
Vol 23 (1) ◽  
pp. 57-79 ◽  
Author(s):  
Daniel A. Dinnsen

ABSTRACT: Several competing proposals for the (under)specification of phonological representations are evaluated against the facts of phonemic acquisition. Longitudinal evidence relating to the emergence of a voice contrast in the well-documented study of Amahl (from age 2;2 to 3;11) is reconsidered. Neither contrastive specification nor context-free radical underspecification is capable of accounting for the facts. The problem is in the characterization of the change in the status of a feature from being noncontrastive and conditioned by context at one stage to being contrastive with phonetic effects that diffuse gradually through the lexicon. Both frameworks must treat as accidental the persistence of the early substitution pattern and require the postulation of wholesale changes in underlying representations, where these changes do not accord well with the observed phonetic changes or with the facts available to the learner. Context-sensitive radical underspecification provides a plausible account of each stage and the transition between stages with minimal grammar change.


2013 ◽  
Vol 24 (06) ◽  
pp. 747-763 ◽  
Author(s):  
STEFANO CRESPI REGHIZZI ◽  
PIERLUIGI SAN PIETRO

A recently introduced language-definition device, termed consensual, is based on agreement between similar words. Considering a language over a bipartite alphabet made of pairs of unmarked/marked letters, the match relation specifies when such words agree. A set (the “base”) over the bipartite alphabet thus consensually specifies another language, which includes any terminal word for which a set of corresponding matching words lies in the base. We show that all and only the regular languages are consensually generated by a strictly locally testable base; the result rests on a generalization of Medvedev's homomorphic characterization of the regular languages. The consensually context-free languages strictly include the base family. The consensual and base families coincide if the base is context-sensitive.
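As a hedged illustration (assuming the usual match relation of the consensual-language literature, in which all words carry the same underlying letter at each position and exactly one of them is marked there), the agreement check can be sketched as:

```python
# Illustrative sketch of the consensual match relation: equal-length words
# over a bipartite alphabet, encoded as (letter, marked) pairs, agree when
# every position shows one underlying letter and exactly one mark.

def matches(words):
    """Check whether a set of words over pairs (letter, marked_flag)
    match consensually."""
    if len({len(w) for w in words}) != 1:
        return False
    for column in zip(*words):
        letters = {letter for letter, _ in column}
        marks = sum(1 for _, marked in column if marked)
        if len(letters) != 1 or marks != 1:
            return False
    return True

# Two words placing their marks on complementary positions of "ab":
w1 = [("a", True), ("b", False)]
w2 = [("a", False), ("b", True)]
print(matches([w1, w2]))  # True
```

The terminal word specified by such an agreeing set is its common projection onto the unmarked alphabet, here "ab".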


2018 ◽  
Vol 29 (04) ◽  
pp. 663-685 ◽  
Author(s):  
Kent Kwee ◽  
Friedrich Otto

While (stateless) deterministic ordered restarting automata accept exactly the regular languages, it has been observed that nondeterministic ordered restarting automata (ORWW-automata for short) are more expressive. Here we show that the class of languages accepted by the latter automata is an abstract family of languages that is incomparable to the linear, the context-free, and the growing context-sensitive languages with respect to inclusion, and that the emptiness problem is decidable for these languages. In addition, we give a construction that turns a stateless ORWW-automaton into a nondeterministic finite-state acceptor for the same language.


2001 ◽  
Vol 13 (9) ◽  
pp. 2093-2118 ◽  
Author(s):  
Paul Rodriguez

It has been shown that if a recurrent neural network (RNN) learns to process a regular language, one can extract a finite-state machine (FSM) by treating regions of phase-space as FSM states. However, it has also been shown that one can construct an RNN to implement Turing machines by using RNN dynamics as counters. But how does a network learn languages that require counting? Rodriguez, Wiles, and Elman (1999) showed that a simple recurrent network (SRN) can learn to process a simple context-free language (CFL) by counting up and down. This article extends that to show a range of language tasks in which an SRN develops solutions that not only count but also copy and store counting information. In one case, the network stores information like an explicit storage mechanism. In other cases, the network stores information more indirectly in trajectories that are sensitive to slight displacements that depend on context. In this sense, an SRN can learn analog computation as a set of interdependent counters. This demonstrates how SRNs may be an alternative psychological model of language or sequence processing.
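The counting-by-dynamics idea can be sketched with a single analog state variable that contracts on one symbol and expands on the other; this is a hand-built caricature of the learned solution for aⁿbⁿ, not the trained network itself.

```python
# Minimal sketch of analog counting for a^n b^n: the state contracts on 'a'
# and expands on 'b', so a balanced string returns it to its starting value.

def accepts(string, rate=0.5, tol=1e-9):
    state = 1.0
    seen_b = False
    for symbol in string:
        if symbol == "a":
            if seen_b:            # an 'a' after a 'b' breaks the a^n b^n shape
                return False
            state *= rate         # count up by contracting
        elif symbol == "b":
            seen_b = True
            state /= rate         # count down by expanding
        else:
            return False
    return seen_b and abs(state - 1.0) < tol

print(accepts("aaabbb"))  # True
print(accepts("aabbb"))   # False
```

A trained SRN realizes something like this implicitly: its phase-space trajectory spirals inward while reading a's and back outward while reading b's, with acceptance corresponding to returning near the start region.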


1996 ◽  
Vol 2 (4) ◽  
pp. 287-290 ◽  
Author(s):  
ANDRÁS KORNAI

In spite of the wide availability of more powerful (context-free, mildly context-sensitive, and even Turing-equivalent) formalisms, the bulk of applied work on language and sublanguage modeling, especially for recognition and topic search, is still performed with various finite-state methods. In fact, the use of such methods in research labs as well as in applied work actually increased in the past five years. To bring together those developing and applying extended finite-state methods to text analysis, speech/OCR language modeling, and related CL and NLP tasks with those in AI and CS interested in analyzing and possibly extending the domain of finite-state algorithms, a workshop was held in August 1996 in Budapest as part of the European Conference on Artificial Intelligence (ECAI'96).


2016 ◽  
Vol 26 (03) ◽  
pp. 1650012 ◽  
Author(s):  
Stefan D. Bruda ◽  
Mary Sarah Ruth Wilkin

Coverability trees offer a finite characterization of all the derivations of a context-free parallel grammar system (CF-PCGS). Their finite nature implies that they necessarily omit some information about these derivations. We demonstrate that the omitted information is, most if not all of the time, too much, so that coverability trees are not useful as an analysis tool beyond the limited use already considered in the paper that introduced them (namely, determining the decidability of certain decision problems over PCGS). We establish this result by invalidating an existing proof that synchronized CF-PCGS are less expressive than context-sensitive grammars: that proof relies on coverability trees for CF-PCGS, but such coverability trees do not in fact contain enough information to support it.


Computability ◽  
2021 ◽  
pp. 1-27
Author(s):  
Martin Vu ◽  
Henning Fernau

In this paper, we discuss the addition of substitutions as a further type of operation to (in particular, context-free) insertion-deletion systems; i.e., in addition to insertions and deletions, we allow single-letter replacements to occur. We investigate the effect of adding substitution rules on the context dependency of such systems, thereby also obtaining new characterizations of, and even normal forms for, the context-sensitive (CS) and recursively enumerable (RE) languages and their phrase-structure grammars. More specifically, we prove that for each RE language there is a system generating it that only inserts and deletes strings of length two without considering the context of the insertion or deletion site, but which may change symbols (by a substitution operation) by checking a single symbol to the left of the substitution site. When we allow checking both left and right single-letter context in substitutions, even context-free insertions and deletions of single letters suffice to reach computational completeness. When allowing context-free insertions only, checking left and right single-letter context in substitutions gives a new characterization of CS. This clearly shows the power of this new type of rule.
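As a hedged sketch (with hypothetical toy rules, not the paper's normal forms), one derivation step of such a system, combining context-free insertion and deletion with substitutions that check a single symbol of left context, might look like:

```python
# Illustrative sketch of one derivation step of an insertion-deletion system
# extended with left-context-checking substitutions. Rules are tuples:
#   ("ins", s):          insert string s anywhere (context-free)
#   ("del", s):          delete one occurrence of s (context-free)
#   ("sub", left, a, b): rewrite letter a as b when preceded by left

def step(word, rules):
    """Yield all words reachable from `word` in one rule application."""
    for kind, *args in rules:
        if kind == "ins":
            (s,) = args
            for i in range(len(word) + 1):
                yield word[:i] + s + word[i:]
        elif kind == "del":
            (s,) = args
            for i in range(len(word) - len(s) + 1):
                if word[i:i + len(s)] == s:
                    yield word[:i] + word[i + len(s):]
        elif kind == "sub":
            left, a, b = args
            for i in range(1, len(word)):
                if word[i] == a and word[i - 1] == left:
                    yield word[:i] + b + word[i + 1:]

rules = [("ins", "ab"), ("del", "ba"), ("sub", "a", "b", "c")]
print(sorted(set(step("ab", rules))))  # ['aabb', 'abab', 'ac']
```

A full derivation iterates this step to a fixed point or target word; the language generated is the set of terminal words reachable from the axiom.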


2002 ◽  
Vol 14 (9) ◽  
pp. 2039-2041 ◽  
Author(s):  
J. Schmidhuber ◽  
F. Gers ◽  
D. Eck

In response to Rodriguez's recent article (2001), we compare the performance of simple recurrent nets and long short-term memory recurrent nets on context-free and context-sensitive languages.


1996 ◽  
Vol 25 (4) ◽  
pp. 587-612 ◽  
Author(s):  
Maria M. Egbert

ABSTRACT: Just as turn-taking has been found to be both context-free and context-sensitive (Sacks, Schegloff & Jefferson 1974), the organization of repair is also shown here to be both context-free and context-sensitive. In a comparison of American and German conversation, repair can be shown to be context-free in that, basically, the same mechanism can be found across these two languages. However, repair is also sensitive to the linguistic inventory of a given language; in German, morphological marking, syntactic constraints, and grammatical congruity across turns are used as interactional resources. In addition, repair is sensitive to certain characteristics of social situations. The selection of a particular repair initiator, German bitte? ‘pardon?’, indexes that there is no mutual gaze between interlocutors; i.e., there is no common course of action. The selection of bitte? not only initiates repair; it also spurs establishment of mutual gaze, and thus displays that there is attention to a common focus. (Conversation analysis, context, cross-linguistic analysis, repair, gaze, telephone conversation, co-present interaction, grammar and interaction)

