ImpSuic: A quality updating rule in mixing coins with maximum utilities

Author(s): Xinying Yu, Zhaojie Wang, Yilei Wang, Fengyin Li, Tao Li, ...
2019, Vol. 113, pp. 770-780
Author(s): Lianjie Jiang, Jiabin Wu

2011, Vol. 52-54, pp. 926-931
Author(s): Qing Hua Zhou, Feng Xia Xu, Yan Geng, Ya Rui Zhang

A wedge trust-region method, built on the traditional trust-region framework, is designed for derivative-free optimization problems. The method adds an extra constraint to the trust-region subproblem, which is why it is called the "wedge method". One drawback is that the existing strategy for updating the wedge trust-region radius is rather simple. In this paper, we develop a new radius-updating rule and combine it with this method. On most test problems, the number of function evaluations is reduced significantly, and the experiments demonstrate the effectiveness of our algorithm's improvement.
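For context, the classical radius-updating rule that trust-region methods build on can be sketched as follows. The thresholds and scaling factors below are conventional illustrative values, not the specific rule proposed in the paper:

```python
# Sketch of a classical trust-region radius update. The radius grows when
# the model predicts the objective well and shrinks when it does not.
# Thresholds (eta1, eta2) and factors (gamma_dec, gamma_inc) are
# illustrative defaults, not the paper's wedge-specific rule.

def update_radius(radius, rho, eta1=0.25, eta2=0.75,
                  gamma_dec=0.5, gamma_inc=2.0, radius_max=10.0):
    """Update the trust-region radius from the reduction ratio
    rho = (actual reduction) / (predicted reduction)."""
    if rho < eta1:
        # Poor agreement between model and objective: shrink the region.
        return gamma_dec * radius
    if rho > eta2:
        # Good agreement: expand the region, capped at radius_max.
        return min(gamma_inc * radius, radius_max)
    # Acceptable agreement: keep the radius unchanged.
    return radius
```

A more refined rule (such as the one the paper develops) would typically use additional information, e.g. the step length or the history of rho, rather than only its current value.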


2020, Vol. 9 (4), pp. 1
Author(s): Arman I. Mohammed, Ahmed AK. Tahir

A new optimization algorithm, Adam Merged with AMSgrad (AMAMSgrad), is modified and used to train a convolutional neural network of the Wide Residual Network type, Wide ResNet (WRN), for image classification. The modification includes the use of the second moment as in AMSgrad and the use of the Adam updating rule but with and (2) as the power of the denominator. The main aim is to improve the performance of the AMAMSgrad optimizer through a proper selection of and the power of the denominator. Implementing AMAMSgrad and the two known methods (Adam and AMSgrad) on the Wide ResNet with the CIFAR-10 dataset for image classification reveals that the WRN performs better with the AMAMSgrad optimizer than with Adam or AMSgrad. Training, validation, and testing accuracies all improve with AMAMSgrad over Adam and AMSgrad, and AMAMSgrad needs fewer epochs to reach maximum performance. With AMAMSgrad, the training accuracies are 90.45%, 97.79%, 99.98%, and 99.99% at epochs 60, 120, 160, and 200 respectively, while the validation accuracies at the same epochs are 84.89%, 91.53%, 95.05%, and 95.23%. For testing, the WRN with AMAMSgrad provides an overall accuracy of 94.8%. All of these accuracies exceed those obtained by the WRN with Adam and AMSgrad. The classification metrics indicate that the given WRN architecture performs well and with high confidence under all three optimizers, especially AMAMSgrad.
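The AMSgrad component that AMAMSgrad adopts can be sketched as a single update step. This is a minimal sketch of the standard AMSgrad rule (Adam's moment estimates, with the denominator using the running maximum of the second moment); the specific denominator-power modification of AMAMSgrad is not reproduced here, and the hyperparameter defaults are conventional illustrative values:

```python
import numpy as np

def amsgrad_step(param, grad, m, v, v_hat, t,
                 lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSgrad update step.

    Uses Adam's exponential moving averages of the gradient (m) and its
    square (v), but the denominator uses the running maximum v_hat of the
    second moment, which is the AMSgrad modification that AMAMSgrad keeps.
    """
    m = beta1 * m + (1 - beta1) * grad        # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    v_hat = np.maximum(v_hat, v)              # AMSgrad: keep the maximum
    m_hat = m / (1 - beta1 ** t)              # bias correction (as in Adam)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v, v_hat
```

Example usage on a one-dimensional quadratic f(p) = p², whose gradient is 2p; the parameter moves toward the minimum at 0:

```python
p, m, v, v_hat = 5.0, 0.0, 0.0, 0.0
for t in range(1, 101):
    grad = 2 * p
    p, m, v, v_hat = amsgrad_step(p, grad, m, v, v_hat, t)
```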

