A Novel Hardware Systolic Architecture of a Self-Organizing Map Neural Network

2019 ◽  
Vol 2019 ◽  
pp. 1-14 ◽  
Author(s):  
Khaled Ben Khalifa ◽  
Ahmed Ghazi Blaiech ◽  
Mohamed Hédi Bedoui

In this article, we propose a new modular architecture for a self-organizing map (SOM) neural network. The proposed approach, called systolic-SOM (SSOM), is based on a generic model inspired by systolic movement. This model is formed by two levels of nested parallelism of neurons and connections. The solution thus provides a distributed set of independent computations between the processing units, called neuroprocessors (NPs), which define the SSOM architecture. The NP modules have an innovative architecture compared to those proposed in the literature: each NP performs three different tasks without requiring additional external modules. To validate our approach, we evaluate the performance of several SOM network architectures after their integration on an FPGA platform. This architecture achieves performance almost twice as fast as that reported in the recent literature.
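The abstract attributes three tasks to each neuroprocessor. As a software-level sketch only (not the hardware design; the learning rate and Gaussian neighborhood are assumed parameters, not taken from the paper), the three per-neuron tasks of one SOM iteration can be written as:

```python
import numpy as np

def som_step(weights, x, lr=0.5, sigma=1.0):
    """One SOM iteration over a 2D grid of neurons.

    Each neuron performs the three tasks the article attributes to a
    neuroprocessor: distance computation, winner search, weight update.
    """
    rows, cols, _ = weights.shape
    # Task 1: every neuron computes its distance to the input vector.
    dists = np.linalg.norm(weights - x, axis=2)
    # Task 2: locate the best-matching unit (the winner).
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # Task 3: update weights with a Gaussian neighborhood around the BMU.
    r, c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    grid_dist2 = (r - bmu[0]) ** 2 + (c - bmu[1]) ** 2
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))[:, :, None]
    weights += lr * h * (x - weights)
    return bmu
```

In the hardware version all of `dists` would be computed in parallel across NPs; here the vectorized NumPy expression plays that role.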

2019 ◽  
Vol 29 (01) ◽  
pp. 2050002
Author(s):  
Khaled Ben Khalifa ◽  
Ahmed Ghazi Blaiech ◽  
Mehdi Abadi ◽  
Mohamed Hedi Bedoui

In this paper, we present a new generic architectural approach to a Self-Organizing Map (SOM). The proposed architecture, called the Diagonal-SOM (D-SOM), is described in a Hardware Description Language as an intellectual-property kernel with easily adjustable parameters. The D-SOM architecture is based on a generic formalism that exploits two levels of nested parallelism of neurons and connections. This solution is therefore considered a system based on the cooperation of a distributed set of independent computations. The organization and structure of these computations process an oriented data flow in order to find a better distribution of processing among the different neuroprocessors. To validate the D-SOM architecture, we evaluate the performance of several SOM network architectures after their integration on a Xilinx Virtex-7 Field-Programmable Gate Array platform. The proposed solution allows learning to be adapted easily to a large number of SOM topologies without any considerable design effort. [Formula: see text] SOM hardware is validated through FPGA implementation, where temporal performance is almost twice as fast as that obtained in the recent literature. The suggested D-SOM architecture is also validated through simulation on variable-sized SOM networks applied to color vector quantization.


2014 ◽  
Vol 563 ◽  
pp. 308-311 ◽  
Author(s):  
Yu Lian Jiang

In a water polo ball game, each team faces multiple water polo balls and fields multiple robotic fish, so finding a reasonable task-allocation plan is the key to winning the game. To solve this problem, this paper proposes a multi-target task-allocation method based on the self-organizing map (SOM) neural network. The method takes the positions of the water polo balls as input vectors, runs a competition comparing the positions of the balls and the robotic fish, and outputs the robotic fish assigned to each ball. As the weights are adjusted, each robotic fish moves toward its target ball and finally reaches it. Simulations show that the score of the team using this method is higher than that of the opposing team, demonstrating the correctness and reliability of the method.
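The competition-and-update loop the abstract describes can be illustrated with a minimal SOM-style allocation sketch. Positions, learning rate, and iteration count below are invented for the example, and the update is reduced to a winner-take-all rule rather than the paper's full method:

```python
import math

def allocate_targets(fish_positions, ball_positions, lr=0.3, epochs=50):
    """SOM-style allocation sketch: ball positions are the input vectors,
    each fish holds a weight vector (its position), the nearest fish wins
    the competition for a ball, and its weight is pulled toward that ball,
    so the fish converges onto its target."""
    weights = [list(p) for p in fish_positions]
    for _ in range(epochs):
        for ball in ball_positions:
            # Competition: find the fish currently closest to this ball.
            winner = min(range(len(weights)),
                         key=lambda i: math.dist(weights[i], ball))
            # Weight update: move the winning fish toward the ball.
            weights[winner] = [w + lr * (b - w)
                               for w, b in zip(weights[winner], ball)]
    # Output: the robotic fish assigned to each ball.
    return [min(range(len(weights)),
                key=lambda i: math.dist(weights[i], ball))
            for ball in ball_positions]
```

With two fish at opposite corners and two balls, each fish converges onto, and is assigned to, the ball nearest its starting position.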


2015 ◽  
Vol 2015 ◽  
pp. 1-8 ◽  
Author(s):  
Xiuhui Tan ◽  
Hongping Hu ◽  
Rong Cheng ◽  
Yanping Bai

An effective two-level self-organizing map (SOM) neural network for direction-of-arrival (DOA) estimation of sound signals is proposed. The approach is based on the distance difference of arrival (DDOA) and a uniform linear sensor array in a 2D plane; it performs a nonlinear mapping between DDOA vectors and angles of arrival (AOA). We found that the topological order of the DDOA vectors and the AOAs of the same signals is consistent; thus, the topology-preserving property of the SOM network makes it valid to estimate the AOA from the DDOA. Results from simulations and lake experiments show that the network is accurate and robust, can be trained in advance, and is easy to implement.
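The paper's two-level network is not reproduced here, but the core idea, that topology preservation lets a SOM trained on DDOA vectors serve as a lookup from DDOA to AOA, can be sketched in software. The array geometry, node count, decay schedule, and the post-hoc node-labeling scheme below are all assumptions for illustration:

```python
import numpy as np

def ddoa(angle_deg, n_sensors=4, spacing=0.5):
    """Far-field distance differences of arrival, relative to sensor 0,
    for a uniform linear array."""
    return np.arange(1, n_sensors) * spacing * np.cos(np.radians(angle_deg))

def train_som_doa(angles, nodes=30, epochs=200, lr=0.4, sigma=3.0):
    """Train a 1-D SOM on DDOA vectors, then tag each node with the mean
    AOA of the training samples it wins (an assumed labeling scheme)."""
    X = np.array([ddoa(a) for a in angles])
    rng = np.random.default_rng(0)
    W = X[rng.integers(0, len(X), nodes)].astype(float)
    idx = np.arange(nodes)
    for t in range(epochs):
        decay = np.exp(-t / epochs)  # shrink rate and neighborhood over time
        for x in X:
            bmu = np.argmin(np.linalg.norm(W - x, axis=1))
            h = np.exp(-((idx - bmu) ** 2) / (2 * (sigma * decay) ** 2))
            W += (lr * decay) * h[:, None] * (x - W)
    # Label each node with the average angle of the samples it now wins.
    labels = np.zeros(nodes)
    counts = np.zeros(nodes)
    for a, x in zip(angles, X):
        bmu = np.argmin(np.linalg.norm(W - x, axis=1))
        labels[bmu] += a
        counts[bmu] += 1
    won = counts > 0
    labels[won] /= counts[won]
    return W[won], labels[won]

def estimate_aoa(W, labels, angle_deg):
    """Estimate AOA of an unseen signal from its DDOA via the nearest node."""
    x = ddoa(angle_deg)
    return labels[np.argmin(np.linalg.norm(W - x, axis=1))]
```

Because the mapping from angle to DDOA is monotone over (0°, 180°), nearby DDOA vectors correspond to nearby angles, which is exactly the topological consistency the abstract relies on.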


2021 ◽  
Author(s):  
Gamal Alusta ◽  
Hossein Algdamsi ◽  
Ahmed Amtereg ◽  
Ammar Agnia ◽  
Ahmed Alkouh ◽  
...  

Abstract In this paper we introduce for the first time an innovative approach for deriving the Oil Formation Volume Factor (Bo) by means of an artificial intelligence method. In the newly proposed application, Self-Organizing Map (SOM) technology is merged with statistical prediction methods, integrating in a single step dimensionality reduction, extraction of the structure of the input data, and prediction of the formation volume factor Bo. The SOM neural network method applies an unsupervised training algorithm combined with a back-propagation neural network (BPNN) to subdivide the entire set of PVT inputs into different patterns, identifying sets of data that have something in common, and runs an individual MLFF ANN model for each specific PVT cluster to compute Bo. PVT data from more than two hundred oil samples (804 data points in total), collected from the North African region, representing different basins and covering a wide geographical area, were used in this study. To establish clear bounds on the accuracy of Bo determination, several statistical parameters and terms are included in the presentation of the results from the SOM-neural-network solution. The main outcome is that the newly proposed competitive learning structure integrating SOM and MLFF ANN reduces the error to less than 1% compared to other methods.
This work also investigates five independent model-driven and data-driven approaches for estimating Bo: 1) optimal transformations for multiple regression, as introduced by McCain (1998), using alternating conditional expectations (ACE) to select the regression transformations; 2) genetic programming and heuristic modeling using symbolic regression (SR) with cross-validation for automatic model tuning; and 3) machine-learning predictive models (nearest neighbor regression, kernel ridge regression, Gaussian process regression (GPR), random forest regression (RF), support vector regression (SVM), decision tree regression (DT), gradient boosting machine regression (GBM), and group method of data handling (GMDH)). Regression model accuracy metrics (average absolute relative error and R-square) and diagnostic plots were used to identify the most adequate techniques and models for predicting Bo.
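The two accuracy metrics named above have standard definitions; the following is a minimal sketch using the common formulas, which may differ in detail (e.g., percentage scaling) from the paper's exact usage:

```python
def aare(y_true, y_pred):
    """Average absolute relative error, in percent:
    mean of |(true - pred) / true| * 100."""
    return 100.0 * sum(abs((t - p) / t)
                       for t, p in zip(y_true, y_pred)) / len(y_true)

def r_square(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

A perfect predictor gives AARE of 0% and R-square of 1; the paper's sub-1% error claim refers to an error figure of this kind.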


2018 ◽  
Vol 31 (4) ◽  
pp. 571-583
Author(s):  
Mahdi Farhadi

It is of vital importance to use proper training data to perform accurate short-term load forecasting (STLF) based on artificial neural networks. The pattern of the loads used to train the Kohonen Self-Organizing Map (SOM) neural network in STLF models should be as similar as possible to the load pattern of the forecasting day. In this paper, an electric load classifier model is proposed that relies on the pattern-recognition capability of the SOM. The performance of the proposed electric load classifier is evaluated on data from the Iranian electric grid. The proposed method requires very few training samples to train the Kohonen neural network of the STLF model and can accurately predict the electric load in the network.


2010 ◽  
Vol 19 (01) ◽  
pp. 191-202 ◽  
Author(s):  
VITOANTONIO BEVILACQUA ◽  
GIUSEPPE MASTRONARDI ◽  
VITO SANTARCANGELO ◽  
ROCCO SCARAMUZZI

In this paper, two different methodologies, based respectively on an unsupervised self-organizing map (SOM) neural network and on graph matching, are presented and discussed to validate the performance of a new 3D facial feature identification and localization algorithm. Experiments are performed on a dataset of 23 3D faces acquired by a 3D laser camera at the eBIS lab, with pose and expression variations. In particular, results for five nose landmarks are encouraging and reveal the validity of this approach, which, despite its low computational complexity and the small number of landmarks, guarantees an average face recognition performance greater than 80%.

