A GRU-Based Method for Predicting Intention of Aerial Targets

2021, Vol. 2021, pp. 1-13
Author(s):  
Fei Teng, Yafei Song, Gang Wang, Peng Zhang, Liuxing Wang, ...

Since a target’s operational intention in air combat is realized through a series of tactical maneuvers, its state exhibits temporal and dynamic characteristics. Traditional combat intention recognition methods, which infer from only a single moment, are neither rigorous nor sufficiently effective. This paper proposes an aerial target combat intention recognition method based on a gated recurrent unit (GRU) augmented with a bidirectional propagation mechanism and an attention mechanism. The proposed method constructs an air combat intention characteristic set through a hierarchical approach, encodes it into numeric time-series characteristics, and encapsulates domain expert knowledge and experience in labels. It uses a bidirectional gated recurrent unit (BiGRU) network for deep learning of air combat characteristics and adaptively assigns characteristic weights using an attention mechanism to improve the accuracy of aerial target combat intention recognition. To further shorten recognition time and add a predictive capability, an air combat characteristic prediction module is introduced before intention recognition to establish a mapping between predicted characteristics and combat intention types. Simulation experiments show that the proposed model can predict an enemy aerial target’s combat intention one sampling point ahead of time while maintaining 89.7% intention recognition accuracy, which has reference value and theoretical significance for assisting decision-making in real-time intention recognition.
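As a rough sketch of the mechanism described above (not the authors' implementation), the following NumPy code runs a forward and a backward GRU over a toy characteristic sequence, concatenates the hidden states, and pools them with softmax attention weights. All dimensions, weight initializations, and parameter names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, p):
    """One GRU time step; p holds the gate weight matrices."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h)             # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h)             # reset gate
    h_cand = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h))  # candidate state
    return (1.0 - z) * h + z * h_cand                  # new hidden state

def bigru_attention(seq, p_fwd, p_bwd, v):
    """Run a forward and a backward GRU over seq, concatenate the
    hidden states per time step, then pool with softmax attention."""
    d_h = p_fwd["Uz"].shape[0]
    h_f, h_b = np.zeros(d_h), np.zeros(d_h)
    fwd, bwd = [], []
    for x in seq:                       # forward pass
        h_f = gru_step(x, h_f, p_fwd)
        fwd.append(h_f)
    for x in reversed(seq):             # backward pass
        h_b = gru_step(x, h_b, p_bwd)
        bwd.append(h_b)
    # Align backward states with their time steps, then concatenate.
    H = np.stack([np.concatenate([f, b])
                  for f, b in zip(fwd, reversed(bwd))])
    scores = H @ v                      # one attention score per time step
    w = np.exp(scores - scores.max())
    w = w / w.sum()                     # softmax attention weights
    return w @ H                        # attention-weighted context vector

def make_params(rng, d_in, d_h):
    """Small random gate matrices (hypothetical initialization)."""
    p = {n: rng.standard_normal((d_h, d_in)) * 0.1
         for n in ["Wz", "Wr", "Wh"]}
    p.update({n: rng.standard_normal((d_h, d_h)) * 0.1
              for n in ["Uz", "Ur", "Uh"]})
    return p

# Toy sizes: 4-dim characteristic vectors, 8-dim hidden state, 6 time steps.
rng = np.random.default_rng(0)
d_in, d_h, T = 4, 8, 6
seq = rng.standard_normal((T, d_in))   # stand-in for encoded characteristics
ctx = bigru_attention(seq, make_params(rng, d_in, d_h),
                      make_params(rng, d_in, d_h),
                      rng.standard_normal(2 * d_h))
print(ctx.shape)  # (16,)
```

In a full model, the context vector `ctx` would feed a classifier over the intention types; here it only illustrates how the bidirectional states and attention weights combine.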

Author(s):  
HaoJie Ma, Wenzhong Li, Xiao Zhang, Songcheng Gao, Sanglu Lu

Sensor-based human activity recognition is a fundamental research problem in ubiquitous computing: it uses the rich sensing data from multimodal embedded sensors such as accelerometers and gyroscopes to infer human activities. Existing activity recognition approaches either rely on domain knowledge or fail to address the spatial-temporal dependencies of the sensing signals. In this paper, we propose a novel attention-based multimodal neural network model called AttnSense for multimodal human activity recognition. AttnSense introduces a framework that combines an attention mechanism with a convolutional neural network (CNN) and a gated recurrent unit (GRU) network to capture the dependencies of sensing signals in both the spatial and temporal domains, which helps prioritize sensor selection and improves interpretability. Extensive experiments on three public datasets show that AttnSense achieves competitive performance in activity recognition compared with several state-of-the-art methods.
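A minimal sketch of the attention-based fusion idea, assuming each sensor modality has already been encoded into a fixed-length feature vector; the modality count, feature size, and scoring vector below are illustrative assumptions, not AttnSense's actual architecture.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_fuse(feats, v):
    """Fuse per-modality feature vectors into one vector using softmax
    attention weights; a higher weight means a more relevant sensor."""
    scores = feats @ v        # one relevance score per modality
    w = softmax(scores)       # attention weights summing to 1
    return w @ feats, w       # weighted fusion + the weights themselves

# Hypothetical setup: 3 modalities (e.g. accelerometer, gyroscope,
# magnetometer), each already encoded into an 8-dim feature vector.
rng = np.random.default_rng(1)
feats = rng.standard_normal((3, 8))
fused, w = attention_fuse(feats, rng.standard_normal(8))
```

In the full pipeline, a fused vector like `fused` per time step would then pass through the GRU to capture temporal dependencies, and the weights `w` provide the interpretability the abstract mentions.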


Electronics, 2020, Vol. 9 (12), pp. 2176
Author(s):  
Lu Zhu, Zhuo Wang, Zhigang Ning, Yu Zhang, Yida Liu, ...

To address the complexity of traditional motion intention recognition methods that use multimodal sensor signals, and the lag of the recognition process, this paper proposes an inertial sensor-based motion intention recognition method for a soft exoskeleton. Compared with traditional motion recognition, in addition to the classic five terrain types, recognition of terrain transitions is also added. During mode acquisition, sensor data from the thigh and calf are collected in the different motion modes. After a series of preprocessing steps, such as filtering and normalization, a sliding window is used to augment the data so that each frame of inertial measurement unit (IMU) data retains the last half of the previous frame's history. Finally, we designed a deep convolutional neural network that learns to extract discriminative features from the temporal gait cycle to classify different terrains. Experimental results show that the proposed method can recognize the pose of the soft exoskeleton on different terrains, including walking on flat ground, going up and down stairs, and going up and down slopes, with a recognition accuracy of 97.64%. In addition, the recognition delay for transitions between the five modes accounts for only 23.97% of a gait cycle. Finally, oxygen consumption was measured with a wearable metabolic system (COSMED K5, The Metabolic Company, Rome, Italy); compared with operation without the identification method, net metabolism was reduced by 5.79%. The method in this paper can greatly improve the control performance of the flexible lower-limb exoskeleton system and realize natural, seamless switching of the exoskeleton between multiple motion modes according to human motion intention.
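The half-overlapping sliding-window step can be sketched as follows; the window length, stride, and channel count are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def sliding_windows(samples, win, stride):
    """Segment a continuous IMU stream into overlapping frames.
    With stride = win // 2, each frame keeps the last half of the
    previous frame's samples, preserving cross-frame history."""
    return np.stack([samples[i:i + win]
                     for i in range(0, len(samples) - win + 1, stride)])

# Toy stream: 100 samples of 6 IMU channels (3-axis accel + 3-axis gyro).
stream = np.arange(100 * 6, dtype=float).reshape(100, 6)
frames = sliding_windows(stream, win=20, stride=10)
print(frames.shape)  # (9, 20, 6)
```

Each resulting frame would then be fed to the CNN classifier; the second half of frame `i` is identical to the first half of frame `i + 1`, which is the overlap property described in the abstract.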

