Cleaning Sensor Data in Intelligent Heating Control System

Author(s):  
Sunghoon Kim ◽  
H. Kazerooni

A networked control system (NCS) is a control architecture in which sensors, actuators, and controllers are distributed and interconnected. It is advantageous in terms of interoperability, expandability, installation, volume of wiring, maintenance, and cost-effectiveness. Many distributed network systems of various topologies and network types have been developed, but NCSs tend to suffer from issues such as nondeterminism, long network delays, large overheads, and unfairness. This paper presents a ring-based protocol, called ExoNet, and its network architecture, which are built to achieve better performance as a distributed networked system. A Cypress CY7C924ADX transceiver serves as the communication unit of the network. The protocol is built on this transceiver and developed to achieve fast communication with latency low enough for high-frequency control loops. Compared with standard network types such as Ethernet, ControlNet, or DeviceNet, the network is characterized by its ring-based architecture, simple message and packet formats, one-shot distribution of control data and collection of sensor data, multi-node transmission, message echoing, and other features. The network also guarantees determinism, collision-free transmission, relatively small overhead, fairness between nodes, and flexibility in configuration. An analysis and a comparison with these network types are provided, and the protocol's application on the Berkeley Lower Extremity Exoskeleton (BLEEX) is described.
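The one-shot exchange described above can be sketched in a few lines: a packet injected by the master visits every node on the ring exactly once, delivering each node's control command and collecting its sensor reading on the way. This is a minimal illustrative simulation, not the ExoNet implementation; the node structure and packet layout are assumptions.

```python
from dataclasses import dataclass

@dataclass
class RingNode:
    node_id: int
    control_value: float = 0.0          # last control command received
    sensor_value: float = 0.0           # local sensor reading to publish

def one_shot_cycle(master_controls, nodes):
    """One deterministic ring traversal: deliver every node's command and
    gather every node's sensor reading in a single packet pass."""
    packet = {"controls": dict(master_controls), "sensors": {}}
    for node in nodes:                  # fixed visiting order: collision-free
        node.control_value = packet["controls"].get(node.node_id, 0.0)
        packet["sensors"][node.node_id] = node.sensor_value
    return packet["sensors"]            # echoed back to the master

nodes = [RingNode(i, sensor_value=i * 0.5) for i in range(1, 4)]
readings = one_shot_cycle({1: 1.0, 2: -1.0, 3: 0.25}, nodes)
```

Because every node is visited exactly once per cycle in a fixed order, the worst-case latency is bounded by the ring length, which is the source of the determinism and fairness claimed in the abstract.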


2006 ◽  
Vol 3 (1) ◽  
pp. 29-41 ◽  
Author(s):  
J. J. Gu ◽  
M. Meng ◽  
A. Cook ◽  
P. X. Liu

Loss of an eye is a tragedy for a person, who may suffer psychologically and physically. This paper is concerned with the design, sensing, and control of a robotic prosthetic eye that moves horizontally in synchronization with the movement of the natural eye. Two generations of robotic prosthetic eye models have been developed. The first-generation model uses an external infrared sensor array mounted on the frame of a pair of eyeglasses to detect natural eye movement and feed the control system that drives the artificial eye to move with the natural eye. The second-generation model removes the impractical eyeglass frame and instead uses the EOG (electro-oculography) signal, picked up by electrodes placed at the temples, to carry out the same eye-movement detection and control tasks. Theoretical issues in sensor failure detection and recovery, and signal processing techniques used in sensor data fusion, are studied using statistical methods and artificial neural network based techniques. In addition, practical control system design and implementation using microcontrollers are studied and implemented to carry out natural eye-movement detection and artificial robotic eye control. Simulation and experimental studies are performed, and the results are included to demonstrate the effectiveness of the research project reported in this paper.
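The sensor failure detection and recovery mentioned above can be illustrated with a simple residual test: each new reading is compared against a prediction from recent history, and readings that deviate too far are flagged as failures and replaced by the prediction. This is a generic sketch of the idea, not the paper's statistical or neural-network method; the window and threshold values are illustrative.

```python
def detect_failures(samples, window=3, threshold=5.0):
    """Flag indices whose reading deviates from the trailing-window mean by
    more than `threshold`; failed samples are replaced by the prediction."""
    cleaned, failures = [], []
    for i, x in enumerate(samples):
        history = cleaned[-window:] if cleaned else [x]
        predicted = sum(history) / len(history)   # moving-average predictor
        if abs(x - predicted) > threshold:
            failures.append(i)
            cleaned.append(predicted)             # recovery: substitute prediction
        else:
            cleaned.append(x)
    return cleaned, failures

# An EOG-like signal with one spurious spike at index 3:
cleaned, failures = detect_failures([1.0, 1.2, 0.9, 30.0, 1.1])
```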


2021 ◽  
Author(s):  
Mingda Miao ◽  
Xueshan Gao ◽  
Jun Zhao ◽  
Peng Zhao

Abstract Background: In response to the low intelligence of current mobile lower-limb motor rehabilitation aids, this article proposes an intelligent control scheme based on human movement behavior to make the rehabilitation robot follow the patient's movement. Methods: First, a multi-sensor data acquisition system is designed according to the motion characteristics of the human body. By analyzing and processing the motion data, the trajectory of the human center of gravity and the behavioral intention are obtained, and the behavioral intention is used as the control command for the robot's following motion. To achieve the goal of following human motion, an adaptive radial basis function neural network (ARBFNN) sliding-mode controller is designed based on the robot's dynamic model. The controller reduces the impact of fluctuations in the human center of gravity on the parameters of the robot control system, enhances the system's adaptability to other disturbances, and improves the accuracy of following human motion. Finally, a motion-following experiment with the rehabilitation robot is carried out. Results: The experimental results show that the robot can recognize human motion intention and achieve the training goal of following different subjects along straight-line and curved paths. Conclusions: The experimental results verify the accuracy of the multi-sensor data acquisition system and the control algorithm design, demonstrating the feasibility of the proposed intelligent control scheme.
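The ARBFNN sliding-mode idea can be sketched for a single degree of freedom: a sliding surface combines position and velocity error, an RBF network with online-adapted weights estimates the unknown disturbance (e.g. center-of-gravity sway), and a smoothed switching term handles the residual. The unit-mass plant, all gain values, and the Gaussian-center placement below are illustrative assumptions, not the controller reported in the paper.

```python
import math

def rbf_features(s, centers, width=1.0):
    """Gaussian basis functions evaluated at the sliding variable s."""
    return [math.exp(-((s - c) / width) ** 2) for c in centers]

def simulate(steps=2000, dt=0.005, lam=4.0, k=2.0, gamma=0.5):
    x, dx, t = 1.0, 0.0, 0.0            # start 1 m from the reference (0)
    centers = [-2.0, -1.0, 0.0, 1.0, 2.0]
    w = [0.0] * len(centers)            # adaptive RBF weights
    for _ in range(steps):
        e, de = x, dx                   # tracking error w.r.t. reference 0
        s = de + lam * e                # sliding surface
        phi = rbf_features(s, centers)
        d_hat = sum(wi * p for wi, p in zip(w, phi))    # disturbance estimate
        u = -lam * de - d_hat - k * math.tanh(s / 0.1)  # smoothed switching
        w = [wi + gamma * s * p * dt for wi, p in zip(w, phi)]  # adaptation law
        d = 0.5 * math.sin(t)           # unknown disturbance (e.g. CoG sway)
        ddx = u + d                     # unit-mass plant
        dx += ddx * dt
        x += dx * dt
        t += dt
    return x

final_error = simulate()
```

The `tanh` boundary layer trades a small steady-state error for chatter-free control, while the weight update `w += gamma * s * phi * dt` is the standard Lyapunov-derived adaptation law for this surface.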


Sensors ◽  
2020 ◽  
Vol 20 (16) ◽  
pp. 4411 ◽  
Author(s):  
Hongyan Tang ◽  
Dan Zhang ◽  
Zhongxue Gan

Vertical take-off and landing unmanned aerial vehicles (VTOL UAVs) are widely used in various fields because of their stable flight, easy operation, and low requirements on take-off and landing environments. To further extend the UAV's take-off and landing envelope to unstructured, complex terrain, this study developed a landing gear robot for VTOL vehicles. This article mainly introduces the adaptive landing control of the landing gear robot in an unstructured environment. Based on a depth (time-of-flight, TOF) camera, an IMU, and an optical flow sensor, the control system achieves multi-sensor data fusion and uses a robot kinematic model to achieve adaptive landing. Finally, this study verifies the feasibility and effectiveness of the adaptive landing approach through experiments.
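One common way to fuse the sensors named above is a complementary filter: the IMU provides a smooth high-rate prediction of vertical motion, while the TOF range reading, noisy but drift-free, corrects the estimate each step. This is a generic fusion sketch under assumed sensor models and blend constant, not the paper's fusion scheme.

```python
def fuse_altitude(tof_samples, accel_samples, dt=0.02, alpha=0.98):
    """Complementary filter: integrate IMU vertical acceleration for
    short-term motion, pull the estimate toward the TOF range each step."""
    z, vz = tof_samples[0], 0.0
    estimates = []
    for z_tof, az in zip(tof_samples, accel_samples):
        vz += az * dt                    # high-rate prediction from the IMU
        z_pred = z + vz * dt
        z = alpha * z_pred + (1 - alpha) * z_tof   # correct with TOF range
        estimates.append(z)
    return estimates

# Hovering at a true altitude of 1.0 m with noisy TOF readings and
# zero net vertical acceleration:
est = fuse_altitude([1.0, 1.1, 0.9, 1.05, 0.95], [0.0] * 5)
```

The fused estimate stays much closer to the true 1.0 m than the raw TOF samples, which is exactly what the adaptive landing logic needs before committing the legs to uneven ground.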


2017 ◽  
Vol 50 (5) ◽  
pp. 660-680 ◽  
Author(s):  
F Tan ◽  
D Caicedo ◽  
A Pandharipande ◽  
M Zuniga

Smart indoor lighting systems use occupancy and light sensor data to adapt artificial lighting in accordance with changing occupancy and daylight conditions. Such systems can be designed to reduce lighting energy consumption significantly. However, these systems cannot account for individual user preferences at the workplace in real time. We propose a sensor-driven, human-in-the-loop lighting system that incorporates user feedback in addition to occupancy and light sensor inputs. In this system, luminaires transmit unique visible light communication identifier signals. By processing the image captured by a smartphone camera, a user obtains two pieces of information: visible light communication identifiers of luminaires in the vicinity and average image pixel value. A control algorithm is designed that incorporates these user inputs along with occupancy and light sensor inputs to determine the dimming levels of the luminaires to achieve illumination levels acceptable to users. We compare the performance of the proposed lighting control system with a sensor-driven lighting control system in an office test bed.
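The human-in-the-loop step described above can be sketched as one proportional update: the luminaires whose visible light communication identifiers appear in the smartphone image are nudged until the measured average pixel value reaches the user's setpoint. The gain, identifiers, and single-step linear model are illustrative assumptions, not the paper's algorithm.

```python
def adjust_dimming(dim_levels, visible_ids, measured, setpoint, gain=0.1):
    """One proportional control step: brighten or dim only the luminaires
    the user's camera actually sees, clamped to the [0, 1] dimming range."""
    error = setpoint - measured          # user wants brighter (+) or dimmer (-)
    updated = dict(dim_levels)
    for lum_id in visible_ids:
        level = updated[lum_id] + gain * error
        updated[lum_id] = min(1.0, max(0.0, level))
    return updated

levels = {"L1": 0.5, "L2": 0.5, "L3": 0.8}
# The user near L1 and L2 reports the scene darker (0.3) than preferred (0.6);
# L3 is out of view and is left to the occupancy/daylight loop:
levels = adjust_dimming(levels, ["L1", "L2"], measured=0.3, setpoint=0.6)
```

Restricting the update to the luminaires identified in the image is what localizes the user's preference without disturbing neighboring workplaces.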


Author(s):  
Mikhail Zymbler ◽  
Yana Kraeva ◽  
Elizaveta Latypova ◽  
Sachin Kumar ◽  
Dmitry Shnayder ◽  
...  

2014 ◽  
Vol 609-610 ◽  
pp. 801-806
Author(s):  
Hao Wang ◽  
Meng Nie ◽  
Qing An Huang

An intelligent weather station system based on MEMS sensors is designed. The automatic meteorological system comprises a MEMS temperature sensor, a MEMS humidity sensor, a MEMS pressure sensor, a MEMS wind speed sensor, and an intelligent sensor control system. The control system provides precise timing, automatic acquisition of data from multiple sensors, and storage and uploading, realizing intelligent control of the weather station.
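The acquire/store/upload pattern the abstract outlines can be sketched as a timed polling loop over the registered sensors. The sensor callables below are stand-ins; a real station would read the MEMS devices over a bus such as I2C or SPI, and `store` would be flash storage or the upload link.

```python
def acquire_cycle(sensors):
    """Read every registered sensor once and return one record."""
    return {name: read() for name, read in sensors.items()}

def run_station(sensors, cycles, store):
    """Run a fixed number of acquisition cycles, tagging each record with
    its cycle index and appending it to the store."""
    for tick in range(cycles):
        record = acquire_cycle(sensors)
        record["tick"] = tick            # stand-in for a real timestamp
        store.append(record)
    return store

log = run_station(
    {"temperature_c": lambda: 21.5, "humidity_pct": lambda: 40.0,
     "pressure_hpa": lambda: 1013.2, "wind_mps": lambda: 3.4},
    cycles=3, store=[])
```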

