On-line processing of position and force measures for contour identification and robot control

Author(s):  
A. Fedele ◽  
A. Fioretti ◽  
C. Manes ◽  
G. Ulivi
Sensors ◽  
2020 ◽  
Vol 20 (21) ◽  
pp. 6358 ◽  
Author(s):  
Wojciech Kaczmarek ◽  
Jarosław Panasiuk ◽  
Szymon Borys ◽  
Patryk Banach

The paper presents the possibility of using the Kinect v2 module to control an industrial robot by means of gestures and voice commands. It describes the elements of creating software for off-line and on-line robot control. The application for the Kinect module was developed in the C# language in the Visual Studio environment, while the industrial robot control program was developed in the RAPID language in the RobotStudio environment. The development of a two-threaded application in the RAPID language allowed separating two independent tasks for the IRB120 robot. The main task of the robot is performed in Thread No. 1 (responsible for movement). Simultaneously, Thread No. 2 ensures continuous communication with the Kinect system and provides information about the gesture and voice commands in real time without any interference in Thread No. 1. The applied solution allows the robot to work in industrial conditions without the negative impact of the communication task on the time of the robot’s work cycles. Thanks to the development of a digital twin of the real robot station, tests of proper application functioning in off-line mode (without using a real robot) were conducted. The obtained results were verified on-line (on the real test station). Tests of the correctness of gesture recognition were carried out, and the robot recognized all programmed gestures. Another test carried out was the recognition and execution of voice commands. A difference in the time of task completion between the actual and virtual station was noticed; the average difference was 0.67 s. The last test carried out was to examine the impact of interference on the recognition of voice commands. With a 10 dB difference between the command and noise, the recognition of voice commands was equal to 91.43%. The developed computer programs have a modular structure, which enables easy adaptation to process requirements.
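The two-task split described above — a motion task that is never blocked by the communication task — can be sketched as follows. This is a minimal illustration in Python using a thread-safe queue; the paper implements the equivalent structure as two RAPID tasks on the IRB120 controller, and all names here are hypothetical.

```python
import threading
import queue
import time

commands = queue.Queue()   # channel from the communication task to the motion task
executed = []              # log of commands acted on (illustration only)

def motion_task(stop_event):
    """Task 1: runs the movement program, polling for new commands without blocking."""
    while not stop_event.is_set():
        try:
            cmd = commands.get(timeout=0.01)   # short poll so motion never stalls
        except queue.Empty:
            continue
        executed.append(cmd)                   # stand-in for a motion instruction

def communication_task(gestures):
    """Task 2: forwards recognized gesture/voice commands as they arrive."""
    for g in gestures:
        commands.put(g)

stop = threading.Event()
t1 = threading.Thread(target=motion_task, args=(stop,))
t1.start()
communication_task(["open_gripper", "move_home", "stop"])
time.sleep(0.2)            # let the motion task drain the queue
stop.set()
t1.join()
```

The key design point mirrored from the paper is that the producer side only enqueues; the consumer polls with a timeout, so command arrival never interrupts the motion loop's timing.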


2012 ◽  
Vol 461 ◽  
pp. 109-112
Author(s):  
Du Kun Ding ◽  
Long Gen Li ◽  
Cun Xi Xie ◽  
Tie Zhang

In this paper, a DNA-PID controller is proposed for a 6-DOF robot. The experimental robot system was first set up, and PID controllers were then designed for the robot joints. Owing to the DNA algorithm's excellent computational characteristics, it is used to tune the PID parameters on line, namely the proportional, integral, and differential coefficients. To test the controllers, several experiments were performed. The computer simulation results show that the DNA-PID controllers have a faster response and less overshoot, which can meet the needs of robot control.
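The three tuned coefficients enter a standard discrete PID law. A minimal sketch, assuming a fixed sample time `dt` (the paper's DNA algorithm, which adjusts `kp`, `ki`, `kd` on line, is not reproduced here):

```python
class PID:
    """Discrete PID controller with rectangular integration and backward-difference derivative."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                  # accumulate integral term
        derivative = (error - self.prev_error) / self.dt  # finite-difference derivative
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Example gains are arbitrary placeholders, not the paper's tuned values.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
u = pid.step(setpoint=1.0, measurement=0.0)   # control effort for one joint
```

An on-line tuner such as the DNA algorithm would simply overwrite `pid.kp`, `pid.ki`, and `pid.kd` between calls to `step`.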


2021 ◽  
Author(s):  
Kazutaka Kanno ◽  
Atsushi Uchida

Abstract: Reinforcement learning has been intensively investigated and developed in artificial intelligence for settings without training data, such as autonomous driving vehicles, robot control, and internet advertising. However, the computational cost of reinforcement learning with deep neural networks is extremely high, and reducing the learning cost is a challenging issue. We propose a photonic on-line implementation of reinforcement learning using optoelectronic delay-based reservoir computing, both experimentally and numerically. In the proposed scheme, reinforcement learning is accelerated to a rate of several megahertz because no learning process is required for the internal connection weights in reservoir computing. We perform two benchmark tasks, CartPole-v0 and MountainCar-v0, to evaluate the proposed scheme. Our results represent the first hardware implementation of reinforcement learning based on photonic reservoir computing and pave the way for fast and efficient reinforcement learning as a novel photonic accelerator.
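The cost saving comes from the defining property of reservoir computing: only the linear readout is trained, while the internal connection weights stay fixed. A software analogue of that idea is sketched below with an echo-state-style matrix recurrence; this is purely illustrative, since the paper realizes the reservoir physically with an optoelectronic delay loop, and the toy supervised target stands in for whatever value the readout must learn.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 3, 50

# Fixed, untrained reservoir: random input and recurrent weights,
# scaled so the recurrent spectral radius is below 1 (echo-state condition).
W_in = rng.normal(scale=0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the fixed nonlinear dynamics and record the state sequence."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ u + W @ x)   # the only nonlinearity; weights never change
        states.append(x.copy())
    return np.array(states)

# Train ONLY the linear readout, by ridge regression, on a toy target.
U = rng.normal(size=(200, n_in))
y = 0.7 * U[:, 0] + 0.2 * U[:, 1]
X = run_reservoir(U)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

Because training reduces to one linear solve, updating the readout on line is cheap — which is what makes the megahertz-rate operation plausible once the reservoir itself runs in hardware.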


Robotica ◽  
1990 ◽  
Vol 8 (3) ◽  
pp. 231-243 ◽  
Author(s):  
Bruno Siciliano

SUMMARY: A computationally fast inverse kinematic scheme is derived which solves the robot's end-effector (EE) trajectories in terms of joint trajectories. The inverse kinematic problem (IKP) is cast as a control problem for a simple dynamic system, and the resulting closed-loop algorithms are shown to guarantee satisfactory tracking performance. Unlike previous first-order schemes, which only solve for joint positions and velocities, we propose new second-order tracking schemes which allow the on-line generation of joint position + velocity + acceleration (PVA) reference trajectories for any computed-torque-like controller in sensor-based robot applications. The algorithms explicitly solve the IKP for both EE position and orientation. Simulation results for a six-degree-of-freedom PUMA-like geometry demonstrate the effectiveness of the scheme, even near singularities.
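A second-order closed-loop scheme of this general kind can be sketched numerically. The toy below uses a hypothetical 2-link planar arm and position only (the paper handles a 6-DOF geometry with orientation as well), generating joint accelerations from the Cartesian error so that the closed-loop error obeys ë + K_d ė + K_p e = 0, then integrating them into the PVA references:

```python
import numpy as np

l1 = l2 = 1.0
Kp, Kd, dt = 100.0, 20.0, 1e-3   # Kd = 2*sqrt(Kp): critically damped error dynamics

def fk(q):
    """Forward kinematics: EE position of the 2-link planar arm."""
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

def jac(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def jac_dot(q, qd):
    w1, w12 = qd[0], qd[0] + qd[1]
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * c1 * w1 - l2 * c12 * w12, -l2 * c12 * w12],
                     [-l1 * s1 * w1 - l2 * s12 * w12, -l2 * s12 * w12]])

q, qd = np.array([0.5, 0.5]), np.zeros(2)
xd = fk(np.array([1.0, 0.8]))        # constant EE target, so xd_dot = xd_ddot = 0
for _ in range(5000):
    e = xd - fk(q)
    e_dot = -jac(q) @ qd
    # Resolved-acceleration step: closed loop gives e_ddot + Kd*e_dot + Kp*e = 0
    qdd = np.linalg.solve(jac(q), Kp * e + Kd * e_dot - jac_dot(q, qd) @ qd)
    qd += qdd * dt                   # the (q, qd, qdd) triple is the PVA reference
    q += qd * dt
```

Each loop iteration yields the joint PVA triple that a computed-torque controller would consume; a time-varying target would simply add `xd_dot` and `xd_ddot` back into `e_dot` and the acceleration command.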


Author(s):  
William Krakow

In the past few years on-line digital television frame store devices coupled to computers have been employed to attempt to measure the microscope parameters of defocus and astigmatism. The ultimate goal of such tasks is to fully adjust the operating parameters of the microscope and obtain an optimum image for viewing in terms of its information content. The initial approach to this problem, for high resolution TEM imaging, was to obtain the power spectrum from the Fourier transform of an image, find the contrast transfer function oscillation maxima, and subsequently correct the image. This technique requires a fast computer, a direct memory access device and even an array processor to accomplish these tasks on limited size arrays in a few seconds per image. It is not clear that the power spectrum could be used for more than defocus correction since the correction of astigmatism is a formidable problem of pattern recognition.
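The first step of that approach — the power spectrum from the Fourier transform of the image, from which the contrast-transfer-function oscillation maxima are read off — can be sketched in a few lines. This is a generic rotationally averaged power spectrum in NumPy (the random test image and all names are illustrative; the defocus-fitting step itself is not shown):

```python
import numpy as np

def radial_power_spectrum(img):
    """Rotationally averaged power spectrum: mean |FFT|^2 over rings of equal spatial frequency."""
    F = np.fft.fftshift(np.fft.fft2(img))      # zero frequency moved to the array centre
    P = np.abs(F) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)   # integer ring index per pixel
    sums = np.bincount(r.ravel(), weights=P.ravel())
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)        # guard against any empty ring

img = np.random.default_rng(1).normal(size=(64, 64))
spectrum = radial_power_spectrum(img)          # spectrum[0] is the DC power
```

CTF maxima would then appear as local peaks of `spectrum` away from the DC term; locating them at video rate is what demanded the fast hardware described in the passage.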


Author(s):  
A.M.H. Schepman ◽  
J.A.P. van der Voort ◽  
J.E. Mellema

A Scanning Transmission Electron Microscope (STEM) was coupled to a small computer. The system (see Fig. 1) was built using a Philips EM400, equipped with a scanning attachment, and a DEC PDP11/34 computer with 34K memory. The gun (Fig. 2) consists of a continuously renewed tip, of radius 0.2 to 0.4 μm, of a tungsten wire heated just below its melting point by a focused laser beam (1). On-line operating procedures were developed to reduce the radiation dose to the specimen area of interest while selecting the various imaging parameters and registering the information content. Whereas the theoretical limiting spot size is 0.75 nm (2), routine resolution checks showed minimum distances on the order of 1.2 to 1.5 nm between corresponding intensity maxima in successive scans. This value is sufficient for structural studies of regular biological material to test the performance of STEM against high-resolution CTEM.


Author(s):  
Neil Rowlands ◽  
Jeff Price ◽  
Michael Kersker ◽  
Seichi Suzuki ◽  
Steve Young ◽  
...  

Three-dimensional (3D) microstructure visualization on the electron microscope requires that the sample be tilted to different positions to collect a series of projections. This tilting should be performed rapidly for on-line stereo viewing and precisely for off-line tomographic reconstruction. Usually a projection series is collected using mechanical stage tilt alone. The stereo pairs must be viewed off-line, and the 60 to 120 tomographic projections must be aligned with fiducial markers or digital correlation methods. The delay in viewing stereo pairs and the alignment problems in tomographic reconstruction could be eliminated or improved by tilting the beam, if such tilt could be accomplished without image translation. A microscope capable of beam tilt with simultaneous image shift to eliminate tilt-induced translation has been investigated for 3D imaging of thick (1 μm) biological specimens. By tilting the beam above and through the specimen and bringing it back below the specimen, a bright-field image with a projection angle corresponding to the beam-tilt angle can be recorded (Fig. 1a).

