Network structure formation in thermally-induced gelation of glycinin

1984, Vol. 32 (2), pp. 349-352
Author(s): Takashi Nakamura, Shigeru Utsumi, Tomohiko Mori
Langmuir, 2001, Vol. 17 (14), pp. 4189-4195
Author(s): Makoto Harada, Shintaro Itakura, Akihisa Shioi, Motonari Adachi

Author(s): L. Aerts, H. Berghmans, P. Moldenaers, J. Mewis, M. Kunz

Author(s): Hiroshi Shiratsuchi, Hiromu Gotanda, Katsuhiro Inoue, Kousuke Kumamaru, ...

In this paper, we apply our proposed initialization for multilayer neural networks (NNs) to structural learning with forgetting. The initialization consists of two steps: the weights of the hidden units are initialized so that their hyperplanes pass through the center of gravity of the input pattern set, and the weights of the output units are initialized to zero. Several simulations were performed to study how this initialization affects the structure formation of the NN. The results confirmed that the initialization yields a better network structure and higher generalization ability.
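The two-step scheme above can be sketched in NumPy. This is a hypothetical illustration, not the authors' code: the function name, the random draw for hidden weights, and the single-hidden-layer shape are assumptions; only the two stated rules come from the abstract. For a hidden unit with weights w and bias b, the hyperplane w·x + b = 0 passes through the input centroid c exactly when b = -w·c.

```python
import numpy as np

def init_two_step(X, n_hidden, n_out, seed=None):
    """Two-step initialization (sketch):
    1) each hidden unit's hyperplane passes through the center of
       gravity (centroid) of the input pattern set X;
    2) output-layer weights and biases start at zero.
    X has shape (n_patterns, n_inputs)."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    centroid = X.mean(axis=0)                    # center of gravity of inputs

    # Step 1: random hidden directions, biases chosen so that
    # W_hidden @ centroid + b_hidden == 0 for every hidden unit.
    W_hidden = rng.standard_normal((n_hidden, n_in))
    b_hidden = -W_hidden @ centroid

    # Step 2: output weights initialized to zero.
    W_out = np.zeros((n_out, n_hidden))
    b_out = np.zeros(n_out)
    return W_hidden, b_hidden, W_out, b_out
```

With this start, every hidden unit's decision boundary cuts through the middle of the data rather than far from it, which is one plausible reading of why the scheme helps structure formation during learning with forgetting.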

