Bi-Objective Optimization for Industrial Robotics Workflow Resource Allocation in an Edge–Cloud Environment

2021 ◽  
Vol 11 (21) ◽  
pp. 10066
Author(s):  
Xingju Xie ◽  
Xiaojun Wu ◽  
Qiao Hu

The application scenarios and market share of industrial robots have expanded in recent years, bringing a large market and technical demand for industrial robot-monitoring systems (IRMSs). With the development of IoT and cloud computing technologies, industrial robot monitoring has entered the cloud computing era. However, industrial robot-monitoring tasks produce large data volumes with high information redundancy and occupy substantial communication bandwidth in a cloud computing architecture, so a purely cloud-based IRMS can no longer meet performance and cost requirements. This work therefore constructs an edge–cloud architecture for the IRMS. Each monitoring task is executed as a workflow, and a local monitor allocates computing resources to the workflow's subtasks by analyzing the current state of the edge–cloud network. The allocation problem is modeled as a latency–cost bi-objective optimization problem and solved with a heuristically improved NSGA-II evolutionary algorithm. The experimental results demonstrate that the proposed algorithm finds non-dominated solutions faster and converges closer to the problem's Pareto frontier. The monitor can then select a solution from the Pareto frontier that meets the needs of the monitoring task.
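To make the bi-objective setting concrete, the following minimal sketch enumerates edge/cloud assignments for a tiny workflow and extracts the non-dominated (latency, cost) points. All latency and cost figures are assumed for illustration, and brute-force enumeration stands in for the NSGA-II search used in the paper; only the Pareto-dominance logic carries over.

```python
import itertools

# Hypothetical per-subtask figures: edge execution (index 0) is fast but
# expensive, cloud execution (index 1) is slow but cheap. Illustrative only.
LATENCY = [5.0, 20.0]   # ms per subtask on edge vs. cloud
COST    = [4.0, 1.0]    # monetary units per subtask on edge vs. cloud

def evaluate(assignment):
    """Return (total_latency, total_cost) for one edge/cloud assignment."""
    latency = sum(LATENCY[a] for a in assignment)
    cost = sum(COST[a] for a in assignment)
    return latency, cost

def dominates(p, q):
    """p dominates q if p is no worse in both objectives and better in one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def pareto_front(points):
    """Keep only the non-dominated (latency, cost) points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Enumerate all assignments for a 4-subtask workflow (16 candidates, so
# brute force is feasible; NSGA-II is needed when the space is large).
points = {evaluate(a) for a in itertools.product([0, 1], repeat=4)}
front = sorted(pareto_front(list(points)))
print(front)
```

Here every mix of edge and cloud placements is Pareto-optimal, since moving a subtask to the edge strictly trades cost for latency; the monitor would pick one point from this frontier according to its current requirements.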

Robotica ◽  
2008 ◽  
Vol 26 (6) ◽  
pp. 753-765 ◽  
Author(s):  
R. Saravanan ◽  
S. Ramabalan ◽  
C. Balamurugan

SUMMARY: A new general methodology using evolutionary algorithms, namely the Elitist Non-dominated Sorting Genetic Algorithm (NSGA-II) and Multi-objective Differential Evolution (MODE), is presented for the optimal trajectory planning of an industrial robot manipulator (PUMA 560) in the presence of fixed and moving obstacles with a payload constraint. The problem has a multi-criterion character: six objective functions, 32 constraints and 288 variables are considered. A cubic NURBS curve is used to define the trajectory. The average fuzzy membership function method is used to select the best compromise solution from the Pareto optimal fronts. Two multi-objective performance measures, the solution spread measure and the ratio of non-dominated individuals, are used to evaluate the strength of the Pareto optimal fronts; two further measures, optimizer overhead and algorithm effort, are used to quantify the computational effort of the NSGA-II and MODE algorithms. The Pareto optimal fronts and the results obtained from the various techniques are compared and analysed. Both NSGA-II and MODE perform well on this problem.
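The average fuzzy membership selection mentioned above can be sketched in a few lines: for each minimized objective, a solution's membership is its normalized distance from the worst value on the front, and the best compromise maximizes the mean membership. The toy front below is assumed, not taken from the paper's six-objective results.

```python
def best_compromise(front):
    """Pick the solution with the highest average fuzzy membership.

    For each minimized objective k, the membership of solution i is
    (f_max_k - f_ik) / (f_max_k - f_min_k); the best compromise is the
    point that maximizes the mean membership across objectives.
    """
    n_obj = len(front[0])
    f_min = [min(p[k] for p in front) for k in range(n_obj)]
    f_max = [max(p[k] for p in front) for k in range(n_obj)]

    def mean_membership(p):
        mus = []
        for k in range(n_obj):
            span = f_max[k] - f_min[k]
            mus.append(1.0 if span == 0 else (f_max[k] - p[k]) / span)
        return sum(mus) / n_obj

    return max(front, key=mean_membership)

# Toy bi-objective front (e.g. travel time vs. actuator effort, both
# minimized); values are illustrative.
front = [(2.0, 9.0), (4.0, 5.0), (8.0, 2.0)]
print(best_compromise(front))
```

The extreme points each score a membership of 1 in one objective and 0 in the other, so the balanced middle solution wins, which matches the intuition behind picking a compromise from a Pareto front.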


Author(s):  
. Monika ◽  
Pardeep Kumar ◽  
Sanjay Tyagi

In a cloud computing environment, quality of service (QoS) and cost are the key elements to be taken care of. In today's era of big data, data must be handled properly while each request is satisfied. When handling requests involving large data volumes, or requests from scientific applications, the flow of information must be sustained. This paper gives a brief introduction to workflow scheduling and presents a detailed survey of various scheduling algorithms across various parameters.


Author(s):  
Marek Vagas

Urgency of the research. Automated workplaces are becoming more common, especially through the deployment of industrial robots in a variety of layouts, where safety and risk assessment are the most important issues. Target setting. The protection of workers must come first; safety and risk assessment at automated workplaces is therefore the central problem addressed in this article. Actual scientific researches and issues analysis. Current research focuses largely on standard workplaces without industrial robots, so the information this article provides on automated workplaces with various layouts can be considered its added value. Uninvestigated parts of general matters defining. Despite the many general safety instructions in this area, a clear view focused specifically on automated workplaces with industrial robots is still missing. The research objective. The aim of the article is to provide general instructions directly from the field of automated workplaces. The statement of basic materials. Successful realization of an automated workplace is easier with guidance on the requirements needed for risk assessment at the workplace. Conclusions. The results published in this article increase awareness and knowledge of such automated workplaces and the industrial robots within them. In addition, the general steps and requirements presented help practitioners to realize these types of workplaces, in which an industrial robot plays the major role. The proposed solution can be considered a relevant basis for the risk assessment of such workplaces with safety fences or light barriers.


2021 ◽  
Vol 21 (2) ◽  
pp. 1-22
Author(s):  
Chen Zhang ◽  
Zhuo Tang ◽  
Kenli Li ◽  
Jianzhong Yang ◽  
Li Yang

Installing a six-dimensional force/torque sensor on an industrial arm for force feedback is a common robotic force control strategy. However, because of the high price of force/torque sensors and the closed nature of industrial robot control systems, this method is not convenient for industrial mass production. The various types of data generated by industrial robots during the polishing process can be saved, transmitted, and applied, benefiting from the growth of the industrial internet of things (IIoT). We therefore propose a constant force control system for a polishing robot, based on IIoT time series data, that combines the industrial robot control system with offline programming software. The system consists of four modules, which together achieve constant-force polishing of industrial robots in mass production. (1) Data collection module: install a six-dimensional force/torque sensor on one manipulator and collect the robot data (current series data, etc.) and sensor data (force/torque series data). (2) Data analysis module: establish a relationship model, based on a variant of long short-term memory that we propose, between the current time series data of the polishing manipulator and the force sensor data. (3) Data prediction module: a large number of sensorless polishing robots of the same type can use this model to predict force time series. (4) Trajectory optimization module: the polishing trajectories are adjusted according to the predicted sequences. Experiments verified that the proposed relational model predicts accurately with small error, and that a manipulator using this method achieves a better polishing effect.
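The data pipeline behind modules (1)–(3) can be sketched as follows: slide a window over the current time series and regress each window onto the corresponding force reading. Everything here is synthetic and a plain least-squares fit stands in for the paper's variant LSTM; only the windowing-fit-predict structure is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: a joint-current series and a force series that
# depends linearly on the last W current samples (real IIoT data would be
# collected from the instrumented polishing robot).
T, W = 500, 5                                   # series length, window width
current = rng.normal(size=T)
true_w = np.array([0.5, -0.2, 0.1, 0.3, -0.4])  # assumed current->force weights

# Build (current window -> force value) training pairs, mimicking how a
# sequence model maps recent currents to a force reading.
X = np.stack([current[i:i + W] for i in range(T - W + 1)])
y = X @ true_w + 0.01 * rng.normal(size=len(X))  # force with sensor noise

# Linear least squares stands in for the variant-LSTM regressor; a sensorless
# robot of the same type would reuse the fitted model on its own currents.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ w
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(rmse)
```

On this linear toy data the fit recovers the generating weights almost exactly; the point of the LSTM variant in the paper is to capture the nonlinear, history-dependent relationship that a linear map cannot.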


Symmetry ◽  
2021 ◽  
Vol 13 (2) ◽  
pp. 226
Author(s):  
Xuyang Zhao ◽  
Cisheng Wu ◽  
Duanyong Liu

Within the context of the large-scale application of industrial robots, methods for analyzing the life-cycle cost (LCC) of industrial robot production have developed considerably, but methods that allow the examination of robot substitution are still lacking. Taking inspiration from the symmetry philosophy in manufacturing systems engineering, this article establishes a comparative LCC analysis model that compares the LCC of industrial robot production with that of traditional production over the same horizon. The model introduces intangible costs (covering idle loss, efficiency loss and defect loss) to supplement the actual costs, and comprehensively applies various methods of cost allocation and variable estimation to conduct total-cost and cost-efficiency analysis, together with hierarchical decomposition and dynamic comparison. To demonstrate the model, an investigation of a Chinese automobile manufacturer is provided that compares the LCC of welding-robot production with that of manual welding production; case analysis and simulation are combined, and a thorough comparison with related existing work shows the validity of the framework. Based on this study, a simple template is developed to support decision-making analysis for the application and cost management of industrial robots. In addition, the case analysis and simulations can serve as references for enterprises in emerging markets considering robot substitution.
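The core comparison reduces to a simple accounting identity: total LCC is acquisition plus the yearly sum of operating and intangible costs, minus any scrap value, and cost efficiency divides that by output. The sketch below uses entirely assumed figures, not numbers from the article's case study.

```python
# Illustrative life-cycle cost comparison; every figure below is an
# assumption for demonstration, not data from the automobile case study.
def lcc(acquisition, annual_operating, annual_intangible, years, scrap=0.0):
    """Total life-cycle cost = actual costs + intangible costs - scrap value.

    Intangible costs bundle idle loss, efficiency loss and defect loss,
    as in the comparative model described above.
    """
    return acquisition + years * (annual_operating + annual_intangible) - scrap

robot = lcc(acquisition=300_000, annual_operating=20_000,
            annual_intangible=5_000, years=10, scrap=15_000)
manual = lcc(acquisition=0, annual_operating=55_000,
             annual_intangible=12_000, years=10)

units = 10 * 50_000  # welded units over the horizon (assumed)
print(robot, manual, robot / units, manual / units)
```

With these assumed inputs the robot line's higher acquisition cost is outweighed by lower recurring costs, which is exactly the kind of substitution question the comparative model is built to answer.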


2021 ◽  
Vol 11 (3) ◽  
pp. 1287
Author(s):  
Tianyan Chen ◽  
Jinsong Lin ◽  
Deyu Wu ◽  
Haibin Wu

Given that industrial robots typically combine high precision (repeatability) with comparatively low absolute positioning accuracy (APA), a calibration method to enhance the APA of industrial robots is proposed. In view of the "hidden" character of the robot base coordinate system (RBCS) and the flange coordinate system (FCS) in the measurement process, a fairly general method for measuring and calibrating the RBCS and the FCS is proposed, and the sources of the robot terminal position error are classified into three aspects: the positioning error of the industrial RBCS, the kinematic parameter error of the manipulator, and the positioning error of the industrial robot end FCS. A robot position error model is established, and the equation relating the robot end position error to the model parameter error is derived. Solving this equation yields the identified parameter errors and the corresponding corrections, and the error is compensated through the robot joint angles. A Leica laser tracker is used to verify the calibration method on an ABB IRB120 industrial robot. The experimental results show that the calibration method effectively enhances the APA of the robot.
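The identification step described above is, in linearized form, a least-squares problem: measured end-effector position errors stack into a vector that equals an identification Jacobian times the unknown parameter errors. The sketch below uses a random synthetic Jacobian in place of the one derived from the robot's kinematic model, so it only illustrates the solve-and-compensate structure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linearized calibration model: position error dx ≈ J(q) @ dp, where dp
# stacks the model-parameter errors. J here is a synthetic stand-in for
# the identification Jacobian built from the kinematic error model.
n_params, n_meas = 6, 40
dp_true = rng.normal(scale=1e-3, size=n_params)          # unknown parameter errors

J = rng.normal(size=(n_meas * 3, n_params))              # 3 position residuals per pose
dx = J @ dp_true + rng.normal(scale=1e-6, size=n_meas * 3)  # tracker measurement noise

# Identify the parameter errors by linear least squares; compensation then
# corrects the nominal model (and hence the commanded joint angles).
dp_est, *_ = np.linalg.lstsq(J, dx, rcond=None)
residual = float(np.max(np.abs(dp_est - dp_true)))
print(residual)
```

Because the measurement noise is far smaller than the parameter errors, the estimate recovers them to well below the millimeter-scale effect they have on the end effector, which is the mechanism by which calibration improves APA.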


2021 ◽  
Vol 17 (2) ◽  
pp. 1-45
Author(s):  
Cheng Pan ◽  
Xiaolin Wang ◽  
Yingwei Luo ◽  
Zhenlin Wang

Due to large data volume and low latency requirements of modern web services, the use of an in-memory key-value (KV) cache often becomes an inevitable choice (e.g., Redis and Memcached). The in-memory cache holds hot data, reduces request latency, and alleviates the load on background databases. Inheriting from the traditional hardware cache design, many existing KV cache systems still use recency-based cache replacement algorithms, e.g., least recently used or its approximations. However, the diversity of miss penalty distinguishes a KV cache from a hardware cache. Inadequate consideration of penalty can substantially compromise space utilization and request service time. KV accesses also demonstrate locality, which needs to be coordinated with miss penalty to guide cache management. In this article, we first discuss how to enhance the existing cache model, the Average Eviction Time model, so that it can adapt to modeling a KV cache. After that, we apply the model to Redis and propose pRedis, Penalty- and Locality-aware Memory Allocation in Redis, which synthesizes data locality and miss penalty, in a quantitative manner, to guide memory allocation and replacement in Redis. At the same time, we also explore the diurnal behavior of a KV store and exploit long-term reuse. We replace the original passive eviction mechanism with an automatic dump/load mechanism, to smooth the transition between access peaks and valleys. Our evaluation shows that pRedis effectively reduces the average and tail access latency with minimal time and space overhead. For both real-world and synthetic workloads, our approach delivers an average of 14.0%∼52.3% latency reduction over a state-of-the-art penalty-aware cache management scheme, Hyperbolic Caching (HC), and shows more quantitative predictability of performance. Moreover, we can obtain even lower average latency (1.1%∼5.5%) when dynamically switching policies between pRedis and HC.
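The central idea, that eviction should weigh both how likely a key is to be reused (locality) and how expensive a miss on it would be (penalty), can be shown with a toy cache. The score formula below is a deliberately simplified stand-in for the AET-based model that pRedis actually uses; only the "evict the key with the lowest reuse-rate × penalty product" principle carries over.

```python
import time

class PenaltyAwareCache:
    """Toy KV cache that evicts the entry with the lowest
    (estimated reuse rate x miss penalty) score. This is a simplified
    illustration of penalty- and locality-aware replacement, not the
    Average Eviction Time model used by pRedis.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}      # key -> (value, penalty, hits, insert_time)

    def _score(self, key, now):
        value, penalty, hits, t0 = self.data[key]
        reuse_rate = hits / max(now - t0, 1e-9)   # crude locality estimate
        return reuse_rate * penalty               # expected cost of evicting

    def put(self, key, value, penalty, now=None):
        now = time.monotonic() if now is None else now
        if key not in self.data and len(self.data) >= self.capacity:
            victim = min(self.data, key=lambda k: self._score(k, now))
            del self.data[victim]                 # cheapest-to-lose entry goes
        self.data[key] = (value, penalty, 0, now)

    def get(self, key):
        if key not in self.data:
            return None     # miss: the caller pays the backend penalty
        value, penalty, hits, t0 = self.data[key]
        self.data[key] = (value, penalty, hits + 1, t0)
        return value
```

A rarely touched, cheap-to-refetch key is evicted before a hot key whose misses are expensive, which is exactly the distinction a purely recency-based policy such as LRU cannot make.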


2021 ◽  
Vol 13 (5) ◽  
pp. 168781402110195
Author(s):  
Jianwen Guo ◽  
Xiaoyan Li ◽  
Zhenpeng Lao ◽  
Yandong Luo ◽  
Jiapeng Wu ◽  
...  

Fault diagnosis is of great significance for improving the production efficiency and accuracy of industrial robots. Compared with traditional gradient descent algorithms, the extreme learning machine (ELM) has the advantage of fast computing speed, but its randomly generated input weights and hidden node biases affect the accuracy and generalization performance of the ELM. The level-based learning swarm optimizer (LLSO), however, can quickly and effectively find the global optimum of large-scale problems, and can be used to solve for the optimal combination of the large-scale input weights and hidden biases in the ELM. This paper proposes an extreme learning machine with a level-based learning swarm optimizer (LLSO-ELM) for fault diagnosis of the RV reducer of an industrial robot. The model is tested on attitude data of the reducer gear under different fault modes. Compared with the plain ELM, the experimental results show that this method has good stability and generalization performance.
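The speed advantage of the ELM comes from its training procedure: input weights and hidden biases are drawn at random, and only the output weights are solved in closed form via a pseudo-inverse, with no gradient descent. The sketch below shows a plain ELM on synthetic two-class data standing in for fault-mode attitude data; in LLSO-ELM, the random draw of `W` and `b` would instead be optimized by the swarm.

```python
import numpy as np

rng = np.random.default_rng(42)

def elm_train(X, y, n_hidden):
    """Train an extreme learning machine: random input weights and biases,
    output weights solved in closed form via the pseudo-inverse."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # output weights, no descent
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy stand-in for reducer fault classification: two Gaussian blobs in 4-D
# feature space labelled 0 (healthy) and 1 (faulty). Data is synthetic.
X0 = rng.normal(loc=-1.0, size=(100, 4))
X1 = rng.normal(loc=+1.0, size=(100, 4))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100, dtype=float)

model = elm_train(X, y, n_hidden=30)
acc = float(np.mean((elm_predict(X, model) > 0.5) == (y > 0.5)))
print(acc)
```

Because the hidden layer is random, two training runs can differ in quality; replacing the random draw with a swarm-optimized one is precisely the instability that LLSO-ELM targets.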


2018 ◽  
Vol 4 (12) ◽  
pp. 142 ◽  
Author(s):  
Hongda Shen ◽  
Zhuocheng Jiang ◽  
W. Pan

Hyperspectral imaging (HSI) technology has been used for various remote sensing applications due to its excellent capability of monitoring regions-of-interest over a period of time. However, the large data volume of four-dimensional multitemporal hyperspectral imagery demands massive data compression techniques. While conventional 3D hyperspectral data compression methods exploit only spatial and spectral correlations, we propose a simple yet effective predictive lossless compression algorithm that can achieve significant gains on compression efficiency, by also taking into account temporal correlations inherent in the multitemporal data. We present an information theoretic analysis to estimate potential compression performance gain with varying configurations of context vectors. Extensive simulation results demonstrate the effectiveness of the proposed algorithm. We also provide in-depth discussions on how to construct the context vectors in the prediction model for both multitemporal HSI and conventional 3D HSI data.
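The gain from temporal context can be illustrated with the simplest possible temporal predictor: predict each frame from the previous one and store only the integer residuals, which an entropy coder can then encode in fewer bits. The cube below is synthetic, and previous-frame prediction is a minimal stand-in for the paper's context-vector model; the lossless round trip is the point being demonstrated.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic multitemporal "cube": time x rows x cols (one spectral band for
# brevity). Consecutive frames differ by small integer increments, mimicking
# the temporal correlation in multitemporal HSI.
cube = np.cumsum(rng.integers(-2, 3, size=(6, 16, 16)), axis=0).astype(np.int32)

def residuals(cube):
    """Predict each frame from the previous one (temporal context) and
    return the integer residuals; frame 0 is stored as-is."""
    res = cube.copy()
    res[1:] = cube[1:] - cube[:-1]        # temporal prediction residuals
    return res

def reconstruct(res):
    """Invert the prediction exactly: lossless by construction."""
    return np.cumsum(res, axis=0)

res = residuals(cube)
print(float(np.abs(res[1:]).mean()), float(np.abs(cube[1:]).mean()))
```

The residual magnitudes are much smaller than the raw sample magnitudes, so their entropy is lower; the paper's contribution is choosing richer context vectors (spatial, spectral and temporal neighbors) to shrink those residuals further.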


2021 ◽  
Author(s):  
Rens Hofman ◽  
Joern Kummerow ◽  
Simone Cesca ◽  
Joachim Wassermann ◽  
Thomas Plenefisch ◽  
...  

<p>The AlpArray seismological experiment is an international and interdisciplinary project to advance our understanding of geophysical processes in the greater Alpine region. The heart of the project is a large seismological array that covers the mountain range and its surrounding areas. To understand how the Alps and their neighbouring mountain belts evolved through time, we can only study their current structure and active processes. The Eastern Alps are of prime interest since they currently show the highest crustal deformation rates. A key question is how these surface processes are linked to deeper structures. The Swath-D network is an array of temporary seismological stations, complementary to the AlpArray network, located in the Eastern Alps. It offers a unique opportunity to investigate seismicity at high resolution on a local scale.</p><p>In this study, a combination of waveform-based detection methods was used to find small earthquakes in the large data volume of the Swath-D network. Methods were developed to locate the seismic events using semi-automatic picks and to estimate event magnitudes. We present an overview of the methods and workflow, as well as a preliminary overview of the seismicity in the Eastern Alps.</p>
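A classic building block of waveform-based event detection is the STA/LTA ratio: a short-term average of signal energy divided by a long-term average, which spikes when an emergent event rises above background noise. The sketch below runs it on a synthetic trace; it is a generic illustration of this family of detectors, not the specific method combination used for the Swath-D data.

```python
import numpy as np

rng = np.random.default_rng(3)

def sta_lta(trace, n_sta, n_lta):
    """Short-term-average / long-term-average ratio of signal energy,
    a standard trigger for detecting emergent seismic events."""
    energy = trace.astype(float) ** 2
    csum = np.concatenate([[0.0], np.cumsum(energy)])
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta      # short energy window
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta      # long energy window
    m = min(len(sta), len(lta))                       # align windows by end sample
    return sta[-m:] / np.maximum(lta[-m:], 1e-12)

# Synthetic trace: background noise with a small "event" burst at sample 800.
trace = rng.normal(scale=1.0, size=1200)
trace[800:850] += rng.normal(scale=6.0, size=50)

ratio = sta_lta(trace, n_sta=20, n_lta=200)
trigger = int(np.argmax(ratio > 4.0))    # first index exceeding the threshold
event_sample = trigger + 199             # map back to a sample in the trace
print(event_sample)
```

The ratio stays near 1 on stationary noise and jumps as soon as the short window enters the burst, so the trigger fires within a few samples of the event onset; real workflows tune the window lengths and threshold to the local noise conditions.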

