Hick’s Law
Recently Published Documents


TOTAL DOCUMENTS: 32 (five years: 1)
H-INDEX: 11 (five years: 0)

2020 · Vol 117 (41) · pp. 25505–25516
Author(s): Birgit Kriener, Rishidev Chaudhuri, Ila R. Fiete

An elemental computation in the brain is to identify the best in a set of options and report its value. It is required for inference, decision-making, optimization, action selection, consensus, and foraging. Neural computing is considered powerful because of its parallelism; however, it is unclear whether neurons can perform this max-finding operation in a way that improves upon the prohibitively slow optimal serial max-finding computation (which takes ∼N log(N) time for N noisy candidate options) by a factor of N, the benchmark for parallel computation. Biologically plausible architectures for this task are winner-take-all (WTA) networks, where individual neurons inhibit each other so only those with the largest input remain active. We show that conventional WTA networks fail the parallelism benchmark and, worse, in the presence of noise, altogether fail to produce a winner when N is large. We introduce the nWTA network, in which neurons are equipped with a second nonlinearity that prevents weakly active neurons from contributing inhibition. Without parameter fine-tuning or rescaling as N varies, the nWTA network achieves the parallelism benchmark. The network reproduces experimentally observed phenomena like Hick’s law without needing an additional readout stage or adaptive N-dependent thresholds. Our work bridges scales by linking cellular nonlinearities to circuit-level decision-making, establishes that distributed computation saturating the parallelism benchmark is possible in networks of noisy, finite-memory neurons, and shows that Hick’s law may be a symptom of near-optimal parallel decision-making with noisy input.
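The nWTA mechanism described in the abstract can be sketched in a few lines of rate-model dynamics: units inhibit one another, but a second nonlinearity silences the inhibitory contribution of weakly active units, so a single winner emerges without N-dependent tuning. This is a toy sketch under assumed dynamics, not the authors' published model; the inhibition strength `beta` and activity threshold `theta` are illustrative choices.

```python
import numpy as np

def nwta(inputs, beta=2.0, theta=0.05, dt=0.1, steps=2000):
    """Toy nWTA rate dynamics (Euler integration).

    Each unit is driven by its input and inhibited by the other
    units, but — the 'n' in nWTA — only units whose rate exceeds
    the threshold theta contribute inhibition.
    """
    r = np.zeros_like(inputs, dtype=float)
    for _ in range(steps):
        active = r * (r > theta)          # weak units are silenced
        inhib = active.sum() - active     # inhibition from the others
        drive = np.maximum(inputs - beta * inhib, 0.0)  # rectified
        r += dt * (-r + drive)
    return r

rates = nwta(np.array([0.9, 1.0, 0.8]))
winner = int(np.argmax(rates))  # the largest-input unit stays active
```

Because the difference between units is amplified by mutual inhibition while the threshold cuts losers out of the inhibitory pool, the unit with the largest input converges to its input value and all others decay to zero.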


Author(s): Wanyu Liu, Julien Gori, Olivier Rioul, Michel Beaudouin-Lafon, Yves Guiard

2018 · Vol 71 (6) · pp. 1281–1299
Author(s): Robert W Proctor, Darryl W Schneider

In 1952, W. E. Hick published an article in the Quarterly Journal of Experimental Psychology, “On the rate of gain of information.” It played a seminal role in the cognitive revolution and established one of the few widely acknowledged laws in psychology, relating choice reaction time to the number of stimulus–response alternatives (or amount of uncertainty) in a task. We review the historical context in which Hick conducted his study and describe his experiments and theoretical analyses. We discuss the article’s immediate impact on researchers, as well as challenges to and shortcomings of Hick’s law and his analysis, including effects of stimulus–response compatibility, practice, very large set sizes, and sequential dependencies. Contemporary modeling developments are also described in detail. Perhaps most impressive about Hick’s law is that it continues to spawn research efforts to the present and that it is regarded as a fundamental law of interface design for human–computer interaction using technologies that did not exist at the time of Hick’s research.
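The law the review above discusses has a compact functional form: mean choice reaction time grows with the logarithm of the number of equally likely alternatives, RT = a + b·log2(n + 1). A minimal illustration follows; the intercept a and slope b are hypothetical placeholder values, since real values are fitted per task and participant.

```python
import math

def hick_rt(n, a=0.2, b=0.15):
    """Mean choice reaction time (seconds) predicted by Hick's law
    for n equally likely stimulus-response alternatives.

    a (base time) and b (seconds per bit) are hypothetical constants.
    The n + 1 follows Hick's formulation, which adds one unit of
    uncertainty for whether any stimulus will appear at all.
    """
    return a + b * math.log2(n + 1)

# Doubling the effective alternatives adds a constant increment:
times = [hick_rt(n) for n in (1, 3, 7)]  # log2(n+1) = 1, 2, 3 bits
```

The logarithmic form is why adding menu items to an interface slows selection only gradually: each extra bit of uncertainty, not each extra item, costs a fixed amount of time.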


2016 · Vol 6 (1)
Author(s): Rodrigo Pavão, Joice P. Savietto, João R. Sato, Gilberto F. Xavier, André F. Helene

2011 · Vol 62 (3) · pp. 193–222
Author(s): Darryl W. Schneider, John R. Anderson

2010 · Vol 73 (3) · pp. 854–871
Author(s): Charles E. Wright, Valerie F. Marino, Charles Chubb, Kelsey A. Rose
