updating rule
Recently Published Documents


TOTAL DOCUMENTS: 61 (five years: 3)
H-INDEX: 9 (five years: 0)

Author(s): Xinying Yu, Zhaojie Wang, Yilei Wang, Fengyin Li, Tao Li, ...
Keyword(s):

2020 · Vol 2020 · pp. 1-13
Author(s): Avinesh Prasad, Bibhya Sharma, Jito Vanualailai, Sandeep Kumar

This paper presents a new solution to the landmark-based navigation problem for planar robots amid randomly placed fixed obstacles, through a new dynamic updating rule involving the robot's orientation and steering-angle parameters. The dynamic updating rule uses a first-order nonlinear ordinary differential equation to change landmarks, so that whenever a landmark is updated the path followed by the robot remains continuous and smooth. Waypoint guidance proceeds via specific landmarks selected under a new set of rules governing the robot's field of view. The governing control laws guarantee asymptotic stability of the 2D point-robot system. As an application, landmark-based motion planning and control of a car-like mobile robot navigating among fixed elliptic obstacles is considered. The proposed control laws account for the geometric constraints imposed on the steering angle and guarantee eventual uniform stability of the car-like system. Computer simulations in MATLAB illustrate the effectiveness of the proposed technique and its stabilizing algorithm.
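The idea of switching landmarks while keeping the path smooth can be sketched in a few lines. The following is a minimal illustration only, with hypothetical gains, a point-robot model, and a distance-based switching rule; it is not the authors' control laws. The heading evolves through a first-order ODE, so when the active landmark is updated only the *desired* heading jumps, while the heading and path stay continuous:

```python
import math

def simulate(landmarks, start, k_heading=4.0, speed=1.0,
             switch_radius=0.3, dt=0.01, t_max=60.0):
    """Drive a 2D point robot through a list of landmarks.

    The heading theta evolves via the first-order ODE
        d(theta)/dt = k_heading * wrap(theta_desired - theta),
    so the trajectory remains continuous and smooth even at the
    instant the active landmark is updated.
    """
    x, y, theta = start
    active = 0
    path = [(x, y)]
    t = 0.0
    while active < len(landmarks) and t < t_max:
        lx, ly = landmarks[active]
        # Desired heading points at the currently active landmark.
        theta_d = math.atan2(ly - y, lx - x)
        # Wrap the heading error into (-pi, pi].
        err = math.atan2(math.sin(theta_d - theta),
                         math.cos(theta_d - theta))
        theta += k_heading * err * dt        # first-order heading ODE
        x += speed * math.cos(theta) * dt    # unit-speed kinematics
        y += speed * math.sin(theta) * dt
        path.append((x, y))
        # Updating rule (simplified): switch landmarks once close enough.
        if math.hypot(lx - x, ly - y) < switch_radius:
            active += 1
        t += dt
    return path

path = simulate([(2.0, 0.0), (2.0, 2.0), (0.0, 2.0)],
                start=(0.0, 0.0, 0.0))
```

Because the heading is integrated rather than reset, the robot carves a smooth curve between successive landmarks instead of producing corners at each switch.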


2020 · Vol 139 · pp. 110067
Author(s): Jun Zhang, Bin Hu, Yi Jie Huang, Zheng Hong Deng, Tao Wu

2020 · Vol 9 (4) · pp. 1
Author(s): Arman I. Mohammed, Ahmed AK. Tahir

A new optimization algorithm called Adam Merged with AMSgrad (AMAMSgrad) is modified and used to train a convolutional neural network of the Wide Residual Network type, Wide ResNet (WRN), for image classification. The modification combines the second moment as in AMSgrad with the Adam updating rule, using (2) as the power of the denominator. The main aim is to improve the performance of the AMAMSgrad optimizer through a proper selection of the power of the denominator. Applying AMAMSgrad and the two established methods (Adam and AMSgrad) to the Wide ResNet on the CIFAR-10 dataset for image classification reveals that WRN performs better with the AMAMSgrad optimizer than with Adam or AMSgrad. The training, validation and testing accuracies all improve with AMAMSgrad over Adam and AMSgrad, and AMAMSgrad needs fewer epochs to reach maximum performance. With AMAMSgrad, the training accuracies are 90.45%, 97.79%, 99.98% and 99.99% at epochs 60, 120, 160 and 200 respectively, while the validation accuracies at the same epochs are 84.89%, 91.53%, 95.05% and 95.23%. For testing, the WRN with AMAMSgrad achieves an overall accuracy of 94.8%. All of these accuracies exceed those obtained by WRN with Adam and AMSgrad. The classification metrics indicate that the given WRN architecture performs well with all three optimizers, and with particularly high confidence with the AMAMSgrad optimizer.
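The Adam/AMSgrad family of updating rules referenced above can be sketched as follows. This is a generic AMSgrad-style step with the denominator exponent exposed as a parameter; `denom_power=0.5` gives the standard square-root denominator, and the specific exponent used by AMAMSgrad is not reproduced here, so treat `denom_power` as a hypothetical knob:

```python
import numpy as np

def amsgrad_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
                 eps=1e-8, denom_power=0.5):
    """One AMSgrad-style parameter update.

    m and v are the Adam first- and second-moment estimates; v_hat
    keeps the running maximum of v (the AMSgrad correction).
    denom_power controls the exponent of the denominator; 0.5 is the
    standard sqrt. The value AMAMSgrad actually tunes is an assumption
    here, not taken from the paper.
    """
    m, v, v_hat = state
    m = beta1 * m + (1.0 - beta1) * grad           # first moment (EMA)
    v = beta2 * v + (1.0 - beta2) * grad ** 2      # second moment (EMA)
    v_hat = np.maximum(v_hat, v)                   # non-decreasing max
    theta = theta - lr * m / (v_hat ** denom_power + eps)
    return theta, (m, v, v_hat)

# Usage: minimize f(x) = x^2 starting from x = 3.
theta = np.array([3.0])
state = (np.zeros(1), np.zeros(1), np.zeros(1))
for _ in range(5000):
    grad = 2.0 * theta
    theta, state = amsgrad_step(theta, grad, state, lr=0.05)
```

Keeping the running maximum `v_hat` instead of the raw EMA `v` is what distinguishes AMSgrad from Adam: the effective step size can only shrink, which restores the convergence guarantee that plain Adam lacks on some problems.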


2019 · Vol 128 · pp. 313-317
Author(s): Yi Jie Huang, Zheng Hong Deng, Qun Song, Tao Wu, Zhi Long Deng, ...
