Machine Learning and the foundations of inductive inference

1993 ◽  
Vol 3 (1) ◽  
pp. 31-51 ◽  
Author(s):  
Francesco Bergadano

2002 ◽  
Vol 13 (03) ◽  
pp. 445-458 ◽  
Author(s):  
HANS ZANTEMA ◽  
HANS L. BODLAENDER

Decision tables provide a natural framework for knowledge acquisition and representation in the area of knowledge-based information systems. Decision trees provide a standard method for inductive inference in the area of machine learning. In this paper we show how decision tables can be viewed as ordered decision trees: decision trees satisfying an ordering restriction on their nodes. Every decision tree can be represented by an equivalent ordered decision tree, but we show that doing so may cause an exponential blow-up in size, even if the choice of the variable order is left free. Our main result states that finding an ordered decision tree of minimal size that represents the same function as a given ordered decision tree is NP-hard; in earlier work we obtained a similar result for unordered decision trees.
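The notion of an ordered decision tree can be made concrete with a minimal sketch (the tuple encoding and variable names here are illustrative assumptions, not the paper's formalism): in an ordered tree every root-to-leaf path tests the variables in one fixed order, which is what lets the tree be read row by row as a decision table.

```python
from itertools import product

def eval_tree(tree, assignment):
    """Evaluate a decision tree on a truth assignment.
    A tree is either a boolean leaf or a triple
    (variable, subtree_if_false, subtree_if_true)."""
    while not isinstance(tree, bool):
        var, low, high = tree
        tree = high if assignment[var] else low
    return tree

# A tree for f(x1, x2) = x1 AND x2 that tests x2 before x1,
# so it violates the fixed order x1 < x2.
unordered = ("x2", False, ("x1", False, True))

# An equivalent *ordered* tree: every root-to-leaf path tests the
# variables in the order x1 < x2, i.e. it reads as a decision table.
ordered = ("x1", False, ("x2", False, True))

# Exhaustively check that the two trees compute the same function.
for bits in product([False, True], repeat=2):
    a = {"x1": bits[0], "x2": bits[1]}
    assert eval_tree(unordered, a) == eval_tree(ordered, a) == (bits[0] and bits[1])
```

For two variables the reordering is free; the abstract's point is that in general forcing one global order can blow up the tree exponentially, whichever order is chosen.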


2011 ◽  
pp. 210-219
Author(s):  
Thomas R. Shultz ◽  
Scott E. Fahlman ◽  
Susan Craw ◽  
Periklis Andritsos ◽  
Panayiotis Tsaparas ◽  
...  

2020 ◽  
Vol 43 ◽  
Author(s):  
Myrthe Faber

Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides the variability in mental content and context necessary for abstraction.
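The data-augmentation side of the analogy can be illustrated with a toy sketch (the `augment` helper is hypothetical, not from the commentary): augmentation leaves the underlying content intact while varying its surface form and context, which is the kind of variability the argument appeals to.

```python
import random

def augment(sequence, n_variants, seed=0):
    """Toy data augmentation: return n_variants reshuffled copies of a
    sequence. Each variant keeps the same underlying content but varies
    its surface order -- loosely analogous to the varied contexts that
    mind wandering is argued to supply."""
    rng = random.Random(seed)
    variants = []
    for _ in range(n_variants):
        v = list(sequence)
        rng.shuffle(v)  # vary order/context while preserving content
        variants.append(v)
    return variants

print(augment(["cat", "on", "mat"], 2))
```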

