Analysis of voice imitation by professional/non-professional impersonators based on Kullback–Leibler divergence between acoustic models

2016 ◽ Vol 140 (4) ◽ pp. 3115-3115
Author(s):  
Koji Iwano ◽  
Takuto Horihata

Author(s):
Ryan Ka Yau Lai ◽  
Youngah Do

This article explores a method of constructing confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback–Leibler divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics demonstrate how to apply the method and examine its robustness against common violations of its assumptions in linguistic data, such as insufficient sample size and non-independence of data points.
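The approach the abstract describes, plugging the MLE's asymptotic normality into the delta method, is easy to illustrate for entropy, where it reduces to the well-known result Var(Ĥ) ≈ Var_p(log p(X))/N. The sketch below is not the authors' implementation; it is a minimal illustration for the plug-in entropy of a categorical distribution under an i.i.d. assumption, and the function name `entropy_ci` is hypothetical.

```python
import numpy as np
from scipy.stats import norm

def entropy_ci(counts, alpha=0.05):
    """Delta-method confidence interval for plug-in (MLE) entropy, in nats.

    Assumes the counts come from N i.i.d. draws of a categorical variable,
    so the MLE p_hat is asymptotically normal with covariance
    (diag(p) - p p^T) / N.  With gradient dH/dp_i = -(1 + log p_i),
    the delta method gives Var(H_hat) ~= Var_p(log p(X)) / N.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n                    # MLE of the category probabilities
    p = p[p > 0]                      # 0 * log 0 = 0 by convention
    log_p = np.log(p)
    h = -(p * log_p).sum()            # plug-in entropy estimate
    # Var_p(log p(X)) = E[(log p)^2] - (E[log p])^2, and E[log p] = -H
    var_h = (p * log_p**2).sum() - h**2
    se = np.sqrt(var_h / n)
    z = norm.ppf(1 - alpha / 2)       # two-sided normal critical value
    return h, (h - z * se, h + z * se)

# Example: entropy of a phoneme-like frequency distribution
h, (lo, hi) = entropy_ci([512, 301, 128, 44, 15])
print(f"H = {h:.3f} nats, 95% CI = ({lo:.3f}, {hi:.3f})")
```

Under the conditions the abstract flags, dependent tokens or very small samples, this normal-approximation interval can be too narrow, which is what the article's robustness case studies examine.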


Author(s):  
Masayuki Suzuki ◽  
Ryuki Tachibana ◽  
Samuel Thomas ◽  
Bhuvana Ramabhadran ◽  
George Saon

2017
Author(s):  
Michael Heck ◽  
Masayuki Suzuki ◽  
Takashi Fukuda ◽  
Gakuto Kurata ◽  
Satoshi Nakamura
