Autonomous Toy Drone via Coresets for Pose Estimation

Sensors ◽  
2020 ◽  
Vol 20 (11) ◽  
pp. 3042
Author(s):  
Soliman Nasser ◽  
Ibrahim Jubran ◽  
Dan Feldman

A coreset of a dataset is a small weighted set such that querying the coreset provably yields a (1 + ε)-factor approximation to the original (full) dataset, for a given family of queries. This paper suggests accurate coresets (ε = 0) that are subsets of the input for fundamental optimization problems. These coresets enabled us to implement a “Guardian Angel” system that computes pose estimation at a rate of more than 20 frames per second. It tracks a toy quadcopter which guides guests in a supermarket, hospital, mall, airport, and so on. We prove that any set of n matrices in R^{d×d} whose sum is a matrix S of rank r has a coreset whose sum has the same left and right singular vectors as S, and consists of O(dr) = O(d²) matrices, independent of n. This implies the first (exact, weighted subset) coreset of O(d²) points for problems such as linear regression, PCA/SVD, and Wahba’s problem, with corresponding streaming, dynamic, and distributed versions. Our main tool is a novel usage of the Carathéodory Theorem for coresets, an algorithm that computes its set in time that is linear in its cardinality. Extensive experimental results on both synthetic and real data, a companion video of our system, and open code are provided.
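The Carathéodory step underlying such exact coresets can be sketched as follows. This is the classical reduction — any n weighted points in R^d can be pruned to at most d + 1 points while preserving their weighted sum and total weight — not the authors' accelerated variant, and the function name is ours:

```python
import numpy as np

def caratheodory(P, w):
    """Prune weighted points P (n x d, weights w > 0) down to at most
    d + 1 points whose weighted sum and total weight match the input."""
    P = np.asarray(P, dtype=float)
    w = np.asarray(w, dtype=float)
    n, d = P.shape
    while n > d + 1:
        # Find v != 0 with sum(v) = 0 and sum(v_i * P_i) = 0, i.e. an
        # exact null vector of the (d + 1) x n system [P^T; 1^T].
        A = np.vstack([P.T, np.ones(n)])
        v = np.linalg.svd(A)[2][-1]
        if v[np.argmax(np.abs(v))] < 0:    # make the dominant entry positive
            v = -v
        pos = v > 1e-12
        alpha = np.min(w[pos] / v[pos])    # largest step keeping weights >= 0
        w = w - alpha * v                  # at least one weight hits zero
        keep = w > 1e-12
        P, w, n = P[keep], w[keep], int(keep.sum())
    return P, w
```

Because sum(v_i * P_i) = 0 and sum(v_i) = 0, subtracting alpha * v changes neither the weighted sum nor the total weight, and each pass removes at least one point.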

Sensors ◽  
2019 ◽  
Vol 19 (17) ◽  
pp. 3784
Author(s):  
Jameel Malik ◽  
Ahmed Elhayek ◽  
Didier Stricker

Hand shape and pose recovery is essential for many computer vision applications, such as animating a personalized hand mesh in a virtual environment. Although there are many hand pose estimation methods, only a few deep-learning-based algorithms target 3D hand shape and pose from a single RGB or depth image. Jointly estimating hand shape and pose is very challenging because none of the existing real benchmarks provides ground-truth hand shape. For this reason, we propose a novel weakly supervised approach for 3D hand shape and pose recovery (named WHSP-Net) from a single depth image, learning shapes from unlabeled real data and labeled synthetic data. To this end, we propose a framework that consists of three novel components. The first is a Convolutional Neural Network (CNN) based deep network that produces 3D joint positions from learned 3D bone vectors using a new layer. The second is a novel shape decoder that recovers a dense 3D hand mesh from sparse joints. The third is a novel depth synthesizer that reconstructs a 2D depth image from the 3D hand mesh. The whole pipeline is fine-tuned in an end-to-end manner. We demonstrate that our approach recovers reasonable hand shapes from real-world datasets as well as from a live depth-camera stream in real time. Our algorithm outperforms state-of-the-art methods that output more than the joint positions and shows competitive performance on the 3D pose estimation task.
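The first component turns bone vectors into joint positions; outside a network, this is plain forward kinematics along the kinematic tree. A minimal sketch (the `parents` array and joint ordering are illustrative assumptions — the paper's version is a learned, differentiable layer inside a CNN):

```python
import numpy as np

def joints_from_bones(bones, parents):
    """Recover 3D joint positions from per-joint bone vectors by
    accumulating each vector along the kinematic chain.
    bones[i] is joint i's offset from its parent; the root sits at
    the origin. Assumes parents[i] < i (topological order)."""
    n = len(parents)
    joints = np.zeros((n, 3))
    for i in range(1, n):
        joints[i] = joints[parents[i]] + bones[i]
    return joints
```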


2013 ◽  
Vol 284-287 ◽  
pp. 3111-3114
Author(s):  
Hsiang Chuan Liu ◽  
Wei Sung Chen ◽  
Ben Chang Shia ◽  
Chia Chen Lee ◽  
Shang Ling Ou ◽  
...  

In this paper, a novel fuzzy measure, the high-order lambda measure, is proposed. Based on the Choquet integral with respect to this new measure, a novel composition forecasting model is also proposed, combining the GM(1,1) forecasting model, the time series model, and the exponential smoothing model. To evaluate the efficiency of this improved composition forecasting model, an experiment on real data using the five-fold cross-validation mean square error was conducted. The performances of the Choquet integral composition forecasting model with the P-measure, lambda-measure, L-measure, and high-order lambda measure, respectively, a ridge regression composition forecasting model, a multiple linear regression composition forecasting model, and the traditional linear weighted composition forecasting model were compared. The experimental results showed that the Choquet integral composition forecasting model with respect to the high-order lambda measure has the best performance.
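For reference, the discrete Choquet integral with respect to a classical Sugeno lambda-measure can be sketched as follows (the paper's high-order lambda measure is a generalization not reproduced here; we also take lambda as a given parameter instead of solving the usual normalization equation 1 + λ = Π(1 + λ·gᵢ)):

```python
import numpy as np

def lam_measure(densities, lam):
    """Sugeno lambda-measure of a set, built up from its singleton
    densities via g(A ∪ {x}) = g(A) + g_x + lam * g(A) * g_x."""
    g = 0.0
    for gi in densities:
        g = g + gi + lam * g * gi
    return g

def choquet(values, densities, lam):
    """Discrete Choquet integral of `values` w.r.t. the lambda-measure
    defined by the singleton `densities` (one density per value)."""
    order = np.argsort(values)            # ascending: f(1) <= ... <= f(n)
    f = np.asarray(values, float)[order]
    d = np.asarray(densities, float)[order]
    total, prev = 0.0, 0.0
    for i in range(len(f)):
        gA = lam_measure(d[i:], lam)      # measure of {x_(i), ..., x_(n)}
        total += (f[i] - prev) * gA
        prev = f[i]
    return total
```

With lam = 0 the measure is additive and the Choquet integral reduces to the ordinary weighted sum, which is a handy sanity check.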


2015 ◽  
Vol 802 ◽  
pp. 676-681
Author(s):  
Siti Hafizan Hassan ◽  
Hamidi Abdul Aziz ◽  
Izwan Johari ◽  
Mohd Nordin Adlan

Waste generated at construction sites has recently increased and has become an uncontrollable cause of environmental problems and profit loss to contractors. The lack of real data or research on such waste is due to the lack of suitable policies regarding this issue, and contractors' actions are not controlled by any rules, which leads to a lack of action or awareness on their side. Concrete waste is also part of the waste generated at construction sites. We determine the concrete waste generated in construction stages and conduct multiple linear regression analysis of the amount of column waste generated. The methodology employed in this study involves site observations, interviews with site personnel, and sampling at housing construction sites. The estimation method is utilized for the sampling of concrete waste. Results show that the average percentage of column waste is 13.93% and that of slab waste is 0.34%; these percentages are derived from the total order of concrete. The difference is due to the sizes of the structures and the method of handling. The regression model obtained from the sample data on column waste resulted in an adjusted R² value of 0.895. Therefore, the model accounts for approximately 89.5% of the variation in concrete waste generation.
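The adjusted R² reported above is the standard penalized goodness-of-fit statistic; a minimal, generic sketch of its computation for an OLS fit (not the study's specific model):

```python
import numpy as np

def adjusted_r2(X, y):
    """Fit OLS y ~ X (with intercept) and return the adjusted R^2:
    1 - (1 - R^2) * (n - 1) / (n - p - 1), which penalizes R^2 for
    the number of predictors p."""
    n, p = X.shape
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
```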


2015 ◽  
Vol 2 (8) ◽  
pp. 150255
Author(s):  
Dongpo Xu ◽  
Cyrus Jahanchahi ◽  
Clive C. Took ◽  
Danilo P. Mandic

Quaternion derivatives exist only for a very restricted class of analytic (regular) functions; however, in many applications, functions of interest are real-valued and hence not analytic, a typical case being the standard real mean square error objective function. The recent HR calculus is a step forward and provides a way to calculate derivatives and gradients of both analytic and non-analytic functions of quaternion variables; however, the HR calculus can become cumbersome in complex optimization problems due to the lack of rigorous product and chain rules, a consequence of the non-commutativity of quaternion algebra. To address this issue, we introduce the generalized HR (GHR) derivatives which employ quaternion rotations in a general orthogonal system and provide the left- and right-hand versions of the quaternion derivative of general functions. The GHR calculus also solves the long-standing problems of product and chain rules, mean-value theorem and Taylor's theorem in the quaternion field. At the core of the proposed GHR calculus is quaternion rotation, which makes it possible to extend the principle to other functional calculi in non-commutative settings. Examples in statistical learning theory and adaptive signal processing support the analysis.
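The root of the difficulty is that quaternion multiplication does not commute, so the ordinary product and chain rules fail. A minimal demonstration with the Hamilton product (the tuple convention (w, x, y, z) is ours):

```python
def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
# i*j = k while j*i = -k: the algebra is non-commutative, which is
# exactly what breaks the naive product and chain rules that the
# GHR calculus restores.
```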


2021 ◽  
Vol 2 (1) ◽  
pp. 12-20
Author(s):  
Kayode Ayinde ◽  
Olusegun O. Alabi ◽  
Ugochinyere Ihuoma Nwosu

Multicollinearity has remained a major problem in regression analysis and should be addressed in a sustainable way. The problems associated with multicollinearity are worse when it occurs at a high level among regressors. This review reveals that studies on the subject have focused on developing estimators regardless of the effect of differences in the levels of multicollinearity among regressors. Studies have considered single-estimator and combined-estimator approaches without providing a sustainable solution to multicollinearity problems. The possible influence of partitioning the regressors according to their multicollinearity levels, and extracting from each group, to develop estimators for the parameters of a linear regression model when multicollinearity occurs is a new econometric idea and therefore requires attention. The results of new studies should be compared with existing methods, namely the principal components estimator, the partial least squares estimator, the ridge regression estimator, and the ordinary least squares estimator, using a wide range of criteria and ranking their performance at each level of the multicollinearity parameter and sample size. Based on a recent clue in the literature, it is possible to develop an innovative estimator that sustainably solves the multicollinearity problem through partitioning and extraction of the explanatory variables, and to identify situations where this estimator produces the most efficient estimates of the model parameters. The new estimator should be applied to real data and popularized for use.
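The "level of multicollinearity among regressors" that the review emphasizes is commonly quantified by variance inflation factors; a standard diagnostic sketch (not a method from the review itself):

```python
import numpy as np

def vif(X):
    """Variance inflation factor of each column of X:
    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j on the remaining columns (with intercept). Values far
    above 1 flag columns entangled in multicollinearity."""
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)
```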


2012 ◽  
Vol 50 (2) ◽  
Author(s):  
Thuan V. Truong

Except in adverse weather conditions, congestion at large airport hubs appears to be predictable. This paper attempts to translate this predictability into a distribution of taxi-out times, a key component of airport congestion. When scheduled flights are chosen to define the dataset, taxi-out times follow a uniform distribution. This is not only the simplest distribution that inferences can be based on, but also a distribution that can be estimated by simple linear regression leading to very accurate forecasts. But above all, it is an invertible distribution function that can help solve a large class of stochastic optimization problems.
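One simple way such a regression-based estimate could look: for a Uniform(a, b) sample, the expected i-th order statistic is a + (b − a)·i/(n + 1), so regressing the sorted taxi-out times on the plotting positions i/(n + 1) recovers the endpoints, and the fitted CDF inverts in closed form. A hedged sketch, not the paper's exact procedure:

```python
import numpy as np

def fit_uniform(sample):
    """Estimate the endpoints (a, b) of a uniform distribution by
    simple linear regression of the sorted sample on the plotting
    positions i/(n + 1)."""
    t = np.sort(np.asarray(sample, float))
    n = len(t)
    u = np.arange(1, n + 1) / (n + 1)
    slope, intercept = np.polyfit(u, t, 1)
    return intercept, intercept + slope     # a_hat, b_hat

def inv_cdf(u, a, b):
    """Invertible CDF: F^{-1}(u) = a + u*(b - a), the property that
    enables inverse-transform sampling inside stochastic optimization."""
    return a + u * (b - a)
```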


2021 ◽  
Author(s):  
Vishal Gupta ◽  
Nathan Kallus

Managing large-scale systems often involves simultaneously solving thousands of unrelated stochastic optimization problems, each with limited data. Intuition suggests that one can decouple these unrelated problems and solve them separately without loss of generality. We propose a novel data-pooling algorithm called Shrunken-SAA that disproves this intuition. In particular, we prove that combining data across problems can outperform decoupling, even when there is no a priori structure linking the problems and data are drawn independently. Our approach does not require strong distributional assumptions and applies to constrained, possibly nonconvex, nonsmooth optimization problems such as vehicle-routing, economic lot-sizing, or facility location. We compare and contrast our results to a similar phenomenon in statistics (Stein’s phenomenon), highlighting unique features that arise in the optimization setting that are not present in estimation. We further prove that, as the number of problems grows large, Shrunken-SAA learns if pooling can improve upon decoupling and the optimal amount to pool, even if the average amount of data per problem is fixed and bounded. Importantly, we highlight a simple intuition based on stability that highlights when and why data pooling offers a benefit, elucidating this perhaps surprising phenomenon. This intuition further suggests that data pooling offers the most benefits when there are many problems, each of which has a small amount of relevant data. Finally, we demonstrate the practical benefits of data pooling using real data from a chain of retail drug stores in the context of inventory management. This paper was accepted by Chung Piaw Teo, Special Issue on Data-Driven Prescriptive Analytics.
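The pooling idea can be sketched for a newsvendor-style problem: shrink each problem's empirical distribution toward the grand (pooled) empirical distribution with a pooling amount alpha, then solve the SAA on the shrunken distribution. This fixed-alpha sketch only illustrates the weighting; Shrunken-SAA additionally learns alpha from the data:

```python
import numpy as np

def shrunken_saa_newsvendor(own, pooled, alpha, q=0.9):
    """Order quantity from an alpha-shrunken empirical distribution:
    total weight n/(n + alpha) on the problem's own n samples and
    alpha/(n + alpha) spread over the pooled samples, then the
    newsvendor critical quantile q of the weighted mixture."""
    own = np.asarray(own, float)
    pooled = np.asarray(pooled, float)
    n, m = len(own), len(pooled)
    x = np.concatenate([own, pooled])
    w = np.concatenate([np.full(n, 1.0 / (n + alpha)),
                        np.full(m, alpha / ((n + alpha) * m))])
    order = np.argsort(x)
    x, cdf = x[order], np.cumsum(w[order])
    idx = min(np.searchsorted(cdf, q), len(x) - 1)
    return x[idx]
```

With alpha = 0 this decouples and returns the problem's own SAA quantile; as alpha grows, the decision is driven by the pooled data.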


2021 ◽  
Vol 48 (3) ◽  
Author(s):  
Shokrya Saleh Alshqaq

Least trimmed squares (LTS) estimation has been successfully used in robust linear regression models. This article extends LTS estimation to the Jammalamadaka and Sarma (JS) circular regression model. The robustness of the proposed estimator is studied, and the algorithm used for its computation is discussed. Simulation studies and real data show that the proposed robust circular estimator effectively fits JS circular models in the presence of vertical outliers and leverage points.
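For the ordinary linear model (not the JS circular variant the article treats), the LTS objective — minimize the sum of the h smallest squared residuals — is typically attacked with random elemental starts followed by concentration steps, as in this FAST-LTS-style sketch:

```python
import numpy as np

def lts_fit(X, y, h=None, n_starts=50, n_csteps=10, seed=0):
    """Least trimmed squares for y ~ X (with intercept): minimize the
    sum of the h smallest squared residuals. Random (p + 1)-point
    starts, each refined by concentration steps (refit on the h
    best-fitting points), keeping the best objective found."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    A = np.column_stack([np.ones(n), X])
    if h is None:
        h = (n + p + 2) // 2           # a common default coverage
    best, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p + 1, replace=False)
        beta, *_ = np.linalg.lstsq(A[idx], y[idx], rcond=None)
        for _ in range(n_csteps):      # concentration (C-) steps
            r2 = (y - A @ beta) ** 2
            keep = np.argsort(r2)[:h]
            beta, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
        obj = np.sort((y - A @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best, best_obj = beta, obj
    return best
```

Because only the h best residuals enter the objective, a minority of gross outliers cannot drag the fit the way they do in ordinary least squares.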

