Modelling Grammaticality-grading in Natural Language Systems Using a Vector Space Approach

2017 · Vol 23 (3) · pp. 1-15
Author(s): Moses Aregbesola, Rafiu Ganiyu, Stephen Olabiyisi, Elijah Omidiora, Oluwaseun Alo

1984 · Vol 21 (2) · p. 236
Author(s): Thomas J. Page, Morris L. Eaton

2019 · Vol 29 (06) · pp. 783-809
Author(s): Jules Hedges, Mehrnoosh Sadrzadeh

Abstract: Categorical compositional distributional semantics is a model of natural language; it combines the statistical vector space models of words with the compositional models of grammar. We formalise in this model the generalised quantifier theory of natural language, due to Barwise and Cooper. The underlying setting is a compact closed category with bialgebras. We start from a generative grammar formalisation and develop an abstract categorical compositional semantics for it, then instantiate the abstract setting to sets and relations and to finite-dimensional vector spaces and linear maps. We prove the equivalence of the relational instantiation to the truth-theoretic semantics of generalised quantifiers. The vector space instantiation formalises the statistical usages of words and enables us, for the first time, to reason about quantified phrases and sentences compositionally in distributional semantics.
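The core idea of the abstract above can be illustrated with a toy sketch: in categorical compositional distributional semantics, nouns live in a vector space N, a transitive verb is a tensor in N ⊗ S ⊗ N (with S the sentence space), and the grammatical reduction becomes tensor contraction. The dimensions, vectors, and the word choices below are all made-up illustrations, not data from the paper.

```python
import numpy as np

# Toy sketch of the vector space instantiation:
# noun vectors in N, a transitive verb as an order-3 tensor in N (x) S (x) N.
dim_n, dim_s = 4, 2  # hypothetical noun- and sentence-space dimensions
rng = np.random.default_rng(0)

dogs = rng.random(dim_n)                   # noun vector (made-up statistics)
cats = rng.random(dim_n)                   # noun vector (made-up statistics)
chase = rng.random((dim_n, dim_s, dim_n))  # transitive-verb tensor

# "dogs chase cats": the grammatical reduction contracts the subject and
# object vectors into the verb tensor, leaving a vector in sentence space S.
sentence = np.einsum('i,isj,j->s', dogs, chase, cats)
print(sentence.shape)  # → (2,)
```

The contraction here plays the role of the compact closed structure's cups; the bialgebra structure the paper uses for quantifiers (copying and comparing noun vectors) would add further maps on N, which this sketch omits.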


2015 · Vol 23 (3) · pp. 461-471
Author(s): Ruiji Fu, Jiang Guo, Bing Qin, Wanxiang Che, Haifeng Wang, ...

1984 · Vol 39 (3) · pp. 507-515
Author(s): U. B. Desai, H. L. Weinert

2015
Author(s): Abdulaziz Alghunaim, Mitra Mohtarami, Scott Cyphers, Jim Glass
