Predicting Remaining Driving Time and Distance of a Planetary Rover Under Uncertainty

Author(s):  
Matthew Daigle ◽  
Shankar Sankararaman

The operations of a planetary rover depend critically upon the amount of power that can be delivered by its batteries. In order to plan future operations, it is important to make reliable predictions of the end-of-discharge (EOD) time, which can be used to estimate the remaining driving time (RDT) and remaining driving distance (RDD). These quantities are stochastic in nature, not only because several sources of uncertainty affect the rover's operation, but also because the future operating conditions cannot be known precisely. This paper presents a computational methodology for predicting these stochastic quantities, based on a model of the rover and its batteries. We utilize a model-based prognostics framework that characterizes and incorporates the various sources of uncertainty into these predictions, thereby assisting operational decision-making. We consider two different types of driving scenarios and develop methods for each to characterize the associated uncertainty. Monte Carlo sampling and the inverse first-order reliability method are used to compute the stochastic predictions of EOD time, RDT, and RDD.
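The Monte Carlo approach the abstract names can be illustrated with a minimal sketch: sample uncertain battery parameters, simulate each sample to end of discharge, and summarize the resulting EOD/RDT distribution. The discharge model, parameter values, and rover speed below are hypothetical placeholders, not the paper's actual rover model.

```python
import random
import statistics

def simulate_eod(capacity_ah, current_a, dt=60.0):
    """Simulate a crude constant-current battery discharge until
    end of discharge (EOD); returns EOD time in seconds."""
    charge = capacity_ah * 3600.0  # amp-hours -> coulombs
    t = 0.0
    while charge > 0.0:
        charge -= current_a * dt
        t += dt
    return t

def monte_carlo_eod(n_samples=1000, speed_mps=0.05):
    """Propagate parameter uncertainty through the discharge model to get
    a distribution of EOD times; RDT is its mean, RDD = RDT * speed."""
    random.seed(42)
    eod_times = []
    for _ in range(n_samples):
        capacity = random.gauss(10.0, 0.5)  # hypothetical uncertain capacity (Ah)
        current = random.gauss(2.0, 0.2)    # hypothetical uncertain load (A)
        eod_times.append(simulate_eod(capacity, current))
    rdt = statistics.mean(eod_times)        # remaining driving time (s)
    rdd = rdt * speed_mps                   # remaining driving distance (m)
    return rdt, rdd
```

In a full prognostics framework the inner model would be a battery state-of-charge model driven by a sampled future load profile; the sampling-and-summarizing structure stays the same.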

2006 ◽  
Vol 110 ◽  
pp. 221-230 ◽  
Author(s):  
Ouk Sub Lee ◽  
Dong Hyeok Kim ◽  
Seon Soon Choi

The reliability estimation of buried pipelines with corrosion defects is presented. The reliability of a corroded pipeline is estimated using the theory of probability of failure, and is analyzed against a target safety level. The probability of failure is calculated using the first-order reliability method (FORM). The changes in the probability of failure corresponding to three corrosion models and eight failure pressure models are systematically investigated in detail. It is strongly suggested that the plant designer select appropriate operating conditions and design parameters, and analyze the reliability of a buried pipeline with corrosion defects according to the probability of failure and the required target safety level. A normalized margin is defined and estimated accordingly, and is then used to predict the failure probability from fitted relations between failure probability and normalized margin.
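For the special case of a linear limit state g = R − S (failure pressure minus operating pressure) with independent normal variables, the FORM reliability index and failure probability the abstract refers to have a closed form; the sketch below shows that case only, with illustrative numbers that are not from the paper.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mean_r, std_r, mean_s, std_s):
    """FORM for a linear limit state g = R - S with independent normals.
    Returns (reliability index beta, probability of failure Pf).
    For linear-normal problems FORM is exact: beta = E[g] / std[g]."""
    g_mean = mean_r - mean_s
    g_std = math.hypot(std_r, std_s)
    beta = g_mean / g_std
    pf = norm_cdf(-beta)
    return beta, pf
```

For nonlinear limit states (as with the corrosion and failure-pressure models in the paper) FORM instead linearizes g at the most probable point found by an iterative search, but the index-to-probability mapping Pf = Φ(−β) is the same.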


2003 ◽  
Vol 1 (1) ◽  
pp. 6-15 ◽  
Author(s):  
Sergey G Inge-Vechtomov

The discovery of the DNA double helix symbolizes the establishment of the template principle in twentieth-century biology. Template processes (replication, transcription, translation) share several characteristics: they proceed in three consecutive steps (initiation, elongation, and termination) and are followed by correction or repair. All of them exhibit polyvariancy, meaning they are carried out by enzymatic systems composed of interchangeable components that operate with different precision. Enzymatic components that are identical or closely related in structure may be involved in different template processes. Along with linear templates (DNA, RNA), the so-called first-order templates, there are spatial or conformational templates in the cell. The latter are represented by certain proteins that can change their conformation, memorize it, and transfer it to newly synthesized homologous polypeptides (second-order templates). Second-order templates may interact either with each other or with first-order templates. Knowledge of the relations between different template processes in the cell offers a new perspective on the mutual influence of different types of variability and on their roles in evolution.


2015 ◽  
pp. 4-12
Author(s):  
Elena V. Nikolaeva

The article analyzes the correlation between the screen reality and the first-order reality in the digital culture. Specific concepts of the scientific paradigm of the late 20th century are considered as constituent principles of the on-screen reality of the digital epoch. The study proves that the post-non-classical cultural world view, emerging from the dynamic “chaos” of informational and semantic rows of TV programs and cinematographic narrations, is of a fractal nature. The article investigates different types of fractality of the TV content and film plots, their inner and outer “strange loops” and artistic interpretations of the “butterfly effect”.


1994 ◽  
Vol 161 ◽  
pp. 385-400
Author(s):  
B.G. Marsden

Past surveys are described in the logical sequence of (1) comets visually, (2) asteroids visually, (3) asteroids photographically and (4) comets photographically. Plots show the evolution of asteroid surveys in terms of visual discovery magnitude and ecliptic latitude, and similarities and differences between surveys for the different types of body are discussed. The paper ends with a brief discussion of more recent discovery methods and some thoughts on the future.


2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Thomas George ◽  
V. Ganesan

Processes that contain at least one pole at the origin are known as integrating systems. When disturbed from their equilibrium operating point by an environmental disturbance or a change in input conditions, the process output varies continuously with time, so such processes are non-self-regulating. In most situations this behavior is disadvantageous and even dangerous, so efficiently controlling such processes is a challenging task. Depending on the number of poles at the origin and on the location of the other poles of the transfer function, different types of integrating systems exist: stable first-order plus time delay systems with an integrator (FOPTDI), unstable first-order plus time delay systems with an integrator (UFOPTDI), pure integrating plus time delay (PIPTD) systems, and double integrating plus time delay (DIPTD) systems. Advances in micro- and nanometrology depend on well-controlled positioning stages in order to maintain the product quality of miniaturized components. Proportional-integral-derivative (PID) controllers are widely implemented in the chemical process industries because they are simple to tune, easy to understand, and robust; PID control is the most common control algorithm in industry and is universally accepted in industrial control. The popularity of PID controllers over a wide range of operating conditions can be attributed partly to their robust performance and partly to their functional simplicity, which allows engineers to operate them in a simple, straightforward manner. However, to achieve high-precision positioning performance and to build a robust controller, tuning the key parameters of a PID controller is essential, and many tuning methods have therefore been proposed. This paper describes the main factors that lead to lifetime reduction and gain loss of PID parameters, reviews the main gain-tuning methods based on optimization approaches, outlines the advantages and disadvantages of each, and analyzes some future directions for research.
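To make the PIPTD class concrete, here is a minimal closed-loop sketch (not from the paper): a textbook discrete PID controller driving a pure integrator whose input passes through a transport delay. The gains and delay length are illustrative, not a tuning recommendation.

```python
from collections import deque

class PID:
    """Textbook discrete PID controller (positional form)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def simulate_piptd(kp, ki, kd, delay_steps=5, dt=0.1, steps=600):
    """Closed-loop simulation of a pure integrating plus time delay (PIPTD)
    process, dy/dt = u(t - L) with L = delay_steps * dt, driven by a PID
    controller toward a unit setpoint. Returns the final output."""
    pid = PID(kp, ki, kd, dt)
    buf = deque([0.0] * delay_steps)  # FIFO transport delay
    y = 0.0
    for _ in range(steps):
        u_delayed = buf.popleft()
        y += u_delayed * dt           # integrating process: output accumulates input
        buf.append(pid.step(1.0, y))
    return y
```

Because the process itself integrates, even pure proportional action drives the output to the setpoint here; the difficulty the abstract alludes to is that the delay limits how aggressive the gains can be before the loop oscillates or goes unstable.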


2021 ◽  
pp. 875697282199534
Author(s):  
Natalya Sergeeva ◽  
Graham M. Winch

This article develops a framework for applying organizational narrative theory to understand project narratives that potentially perform and change the future. Project narratives are temporal but often get repeated throughout the project life cycle to stabilize meaning, and could be about project mission, vision, identity, value creation, and so forth. Project narratives have important implications for organizational identity and image crafting. This article differentiates among different types of project narratives in relation to a project life cycle, providing case studies of project narratives on three major UK rail projects. We then set out the future research agenda into project narrative work.


2001 ◽  
Vol 101 (1) ◽  
pp. 19-31 ◽  
Author(s):  
Gerard Goggin ◽  
Catherine Griff

Much of the present debate about content on the internet revolves around how to control the distribution of different sorts of harmful or undesirable material. Yet there are considerable issues about whether sufficient sorts of desired cultural content will be available, such as ‘national’, ‘Australian’ content. In traditional broadcasting, regulation has been devised to encourage or mandate different types of content, where it is believed that the market will not do so by itself. At present, such regulatory arrangements are under threat in television, as the Productivity Commission Broadcasting Inquiry final report has noted. But what of the future for certain types of content on the internet? Do we need specific regulation and policy to promote the availability of content on the internet? Or is such a project simply irrelevant in the context of gradual but inexorable media convergence? Is regulating for content just as quixotic and fraught with peril as regulating of content from a censorship perspective often appears to be? In this article, we consider the case of Australian content for broadband technologies, especially in relation to film and video, and make some preliminary observations on the promotion and regulation of internet content.


Author(s):  
Tarald O. Kvålseth

First- and second-order linear models of mean movement time for serial arm movements aimed at a target and subject to preview constraints and lateral constraints were formulated as extensions of the so-called Fitts's law of motor control. These models were validated on the basis of experimental data from five subjects and found to explain from 80% to 85% of the variation in movement time in the case of the first-order models and from 93% to 95% of such variation for the second-order models. Fitts's index of difficulty (ID) was generally found to contribute more to the movement time than did either the preview ID or the lateral ID defined. Of the different types of errors, target overshoots occurred far more frequently than undershoots.
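The first-order linear models described above can be sketched directly; the classic Fitts ID is log2(2D/W), and the extended models add further difficulty indices as additional linear terms. The coefficient values below are hypothetical placeholders, not the fitted values from the study.

```python
import math

def fitts_mt(a, b, distance, width):
    """Classic Fitts's law: MT = a + b * ID, with ID = log2(2D / W).
    a, b are empirically fitted coefficients (placeholders here)."""
    index_of_difficulty = math.log2(2.0 * distance / width)
    return a + b * index_of_difficulty

def mt_multi_id(b0, coeffs, ids):
    """First-order linear movement-time model over several indices of
    difficulty (e.g. Fitts ID, preview ID, lateral ID):
    MT = b0 + sum(b_k * ID_k)."""
    return b0 + sum(bk * idk for bk, idk in zip(coeffs, ids))
```

A second-order model of the kind the abstract reports would add products and squares of the IDs as further regressors, which is why it can explain a larger share of the movement-time variance.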


2014 ◽  
Vol 136 (3) ◽  
Author(s):  
C. Jiang ◽  
G. Y. Lu ◽  
X. Han ◽  
R. G. Bi

Compared with the probability model, the convex model approach requires only bound information on the uncertainty, making reliability analysis possible for many complex engineering problems with limited samples. By introducing well-established techniques from probability-based reliability analysis, several methods have been successfully developed for convex model reliability. This paper aims to reveal some different phenomena, and furthermore some severe paradoxes, that arise when the widely used first-order reliability method (FORM) is extended to convex model problems, and thereby to provide useful suggestions and guidelines for convex-model-based reliability analysis. Two FORM-type approximations, namely the mean-value method and the design-point method, are formulated to efficiently compute the nonprobabilistic reliability index. A comparison is then conducted between these two methods, and some important phenomena that differ from the traditional FORM are summarized. The nonprobabilistic reliability index is also extended to treat system reliability, and some unexpected paradoxes are found through two numerical examples.
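For intuition about the nonprobabilistic reliability index in the convex (interval) setting, here is a minimal sketch for a linear limit state over a box of interval variables: the index is the ratio of the limit state's center value to its half-range, so an index greater than 1 means the limit state stays positive over the entire uncertainty box. This is a standard interval-model construction, not the paper's specific formulation.

```python
def interval_reliability_index(grad, centers, radii, g0=0.0):
    """Nonprobabilistic reliability index for a linear limit state
    g(x) = g0 + grad . x, with interval variables x_i in
    [c_i - r_i, c_i + r_i]. Returns eta = g_center / g_radius;
    eta > 1 guarantees g > 0 over the whole box."""
    g_center = g0 + sum(a * c for a, c in zip(grad, centers))
    g_radius = sum(abs(a) * r for a, r in zip(grad, radii))
    return g_center / g_radius
```

For nonlinear limit states, the mean-value and design-point approximations mentioned in the abstract linearize g at different expansion points before applying this ratio, which is precisely where their behaviors can diverge.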

