Rademacher Complexity
Recently Published Documents


TOTAL DOCUMENTS: 49 (five years: 4)

H-INDEX: 7 (five years: 0)

Author(s): Alane M. de Lima, Murilo V.G. da Silva, André L. Vignatti

In this paper we present, from a communication channel coding perspective, both a theoretical and a practical discussion of AI's uncertainty, capacity, and evolution for pattern classification, based on the classical Rademacher complexity and Shannon entropy. First, AI capacity is defined in the same way as for communication channels. It is shown qualitatively that the classical Rademacher complexity and the Shannon rate of communication theory are closely related by their definitions. Second, based on Shannon's mathematical theory of communication coding, we derive several sufficient and necessary conditions for an AI's error rate to approach zero in classification problems. A 1/2 criterion on the Shannon entropy is derived under which the error rate approaches zero, or is exactly zero, for AI pattern classification problems. Last but not least, we illustrate the analysis and theory with examples of AI pattern classification whose error rate approaches zero or is zero. Impact Statement: Controlling the error rate of AI pattern classification is crucial in many AI applications where lives are at stake. AI uncertainty, capacity, and evolution are investigated in this paper. Sufficient/necessary conditions for an AI's error rate to approach zero are derived from Shannon's communication coding theory. Design methodologies for AI pattern classification with zero, or asymptotically zero, error rate are illustrated using Shannon's coding theory. Our method shows how to control the error rate of AI, how to measure the capacity of AI, and how to evolve AI to higher levels. Index Terms: Rademacher Complexity, Shannon Theory, Shannon Entropy, Vapnik-Chervonenkis (VC) dimension.
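For reference, the classical definitions the abstract invokes are the standard textbook forms (not the paper's own constructions): for a function class $\mathcal{F}$, a sample $S=(x_1,\dots,x_n)$, and i.i.d. Rademacher variables $\sigma_i \in \{-1,+1\}$,

```latex
% Empirical Rademacher complexity of F on the sample S
\hat{\mathfrak{R}}_S(\mathcal{F})
  = \mathbb{E}_{\sigma}\left[ \sup_{f \in \mathcal{F}}
      \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i) \right]

% Shannon entropy of a discrete source X, and channel capacity as the
% maximal mutual information over input distributions (Shannon, 1948)
H(X) = -\sum_{x} p(x)\,\log_2 p(x),
\qquad
C = \max_{p(x)} I(X;Y)
```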


Algorithms, 2020, Vol. 13 (5), pp. 123
Author(s): Diego Santoro, Andrea Tonon, Fabio Vandin

Sequential pattern mining is a fundamental data mining task with applications in several domains. We study two variants of this task: the first is the extraction of frequent sequential patterns, whose frequency in a dataset of sequential transactions is higher than a user-provided threshold; the second is the mining of true frequent sequential patterns, which appear with probability above a user-defined threshold in transactions drawn from the generative process underlying the data. We present the first sampling-based algorithm to mine, with high confidence, a rigorous approximation of the frequent sequential patterns from massive datasets. We also present the first algorithms to mine approximations of the true frequent sequential patterns with rigorous guarantees on the quality of the output. Our algorithms are based on novel applications of Vapnik-Chervonenkis dimension and Rademacher complexity, advanced tools from statistical learning theory, to sequential pattern mining. Our extensive experimental evaluation shows that our algorithms provide high-quality approximations for both problems we consider.
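The sampling idea lends itself to a compact illustration. The Python sketch below is a hypothetical toy version under strong simplifying assumptions: it mines only length-1 patterns and uses a fixed, illustrative accuracy parameter eps, whereas the algorithms in the paper derive eps and the required sample size rigorously from VC-dimension and Rademacher-complexity bounds.

```python
import random
from collections import Counter

def sample_frequent_patterns(dataset, theta, sample_size, eps=0.05, seed=0):
    """Approximate the theta-frequent items by mining a random sample.

    dataset: list of transactions, each a list of items.
    theta:   user-provided frequency threshold in (0, 1].
    eps:     illustrative accuracy parameter (an assumption here; the paper
             computes it from VC-dimension / Rademacher-complexity bounds).
    Returns items whose sample frequency is at least theta - eps / 2, the usual
    lowered threshold used so that truly frequent patterns survive sampling
    with high probability.
    """
    rng = random.Random(seed)
    sample = rng.sample(dataset, min(sample_size, len(dataset)))
    counts = Counter()
    for transaction in sample:
        for item in set(transaction):   # count each item once per transaction
            counts[item] += 1
    n = len(sample)
    return {item for item, c in counts.items() if c / n >= theta - eps / 2}

# Example usage with toy transactions.
data = [["a", "b"], ["a", "c"], ["a"], ["b", "c"], ["a", "b", "c"]]
print(sample_frequent_patterns(data, theta=0.6, sample_size=4))
```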


2020, Vol. 9 (2), pp. 473-504
Author(s): Noah Golowich, Alexander Rakhlin, Ohad Shamir

We study the sample complexity of learning neural networks by providing new bounds on their Rademacher complexity, assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth and, under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
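For context, a standard uniform convergence result from statistical learning theory (not a result specific to this paper) explains why Rademacher complexity bounds yield sample complexity bounds: for functions $f:\mathcal{Z}\to[0,1]$ and an i.i.d. sample $z_1,\dots,z_n$, with probability at least $1-\delta$,

```latex
% Standard Rademacher generalization bound (e.g., Bartlett & Mendelson, 2002):
% expected risk is bounded by empirical risk plus the Rademacher complexity.
\mathbb{E}[f(z)]
  \;\le\; \frac{1}{n}\sum_{i=1}^{n} f(z_i)
  \;+\; 2\,\mathfrak{R}_n(\mathcal{F})
  \;+\; \sqrt{\frac{\ln(1/\delta)}{2n}}
  \qquad \text{for all } f \in \mathcal{F}.
```

Through this bound, Rademacher complexity estimates that are independent of network size translate, under the paper's additional assumptions, into generalization guarantees that do not grow with depth or width.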


Bernoulli, 2019, Vol. 25 (4B), pp. 3912-3938
Author(s): Patrice Bertail, François Portier
