A New Generation of Pipeline Risk Algorithms

Author(s):  
W. Kent Muhlbauer ◽  
Derek Johnson ◽  
Elaine Hendren ◽  
Steve Gosse

While the previous generation of scoring-type algorithms has served the pipeline industry well, the associated technical compromises can be troublesome in today's environment of increasing regulatory and public oversight. Risk analyses often become the centerpiece of legal, regulatory, or public proceedings, prompting the need for analysis techniques that can produce risk estimates anchored in absolute terms, such as "consequences per mile-year". Accordingly, a new generation of algorithms has been developed to meet today's needs without costly revamping of previously collected data or increasing the cost of risk analysis. A simple regrouping of variables into categories of "exposure", "mitigation", and "resistance", along with a few changes in the mathematics of combining variables, transitions older scoring models into the new approach. The advantages of the new algorithms are significant, since they:

• are more intuitive and predictive,
• better model reality,
• lead to better risk management decisions by distinguishing between unmitigated exposure to a threat, mitigation effectiveness, and system resistance,
• eliminate the need for unrealistic and troublesome reweighting or balancing of variables in response to changes such as new technologies,
• offer the flexibility to present results in either absolute (probabilistic) or relative terms, depending on the user's needs.

The challenge is to accomplish all of this without losing the advantages of earlier approaches. One intent of the new algorithms is to avoid the overly analytic techniques that often accompany more absolute quantifications of risk. This paper showcases this new generation of algorithms to better suit the changing needs of risk analysis within the pipeline industry.
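The regrouping described in the abstract can be illustrated as a simple multiplicative combination of the three categories. The formula, variable names, and numbers below are illustrative assumptions only, not the authors' exact model:

```python
# Illustrative sketch (not the paper's exact formulation): combining
# "exposure", "mitigation", and "resistance" multiplicatively to obtain
# a failure estimate in absolute units (events per mile-year).
# All names and numbers here are hypothetical.

def failure_rate(exposure: float, mitigation: float, resistance: float) -> float:
    """Estimated failures per mile-year for one threat.

    exposure   -- unmitigated events per mile-year (e.g. excavator strikes)
    mitigation -- fraction of exposure events prevented, 0..1
    resistance -- fraction of surviving events the pipe withstands, 0..1
    """
    return exposure * (1.0 - mitigation) * (1.0 - resistance)

# Example: 0.5 third-party strikes per mile-year, 90% mitigated,
# and the pipe resists 80% of the remaining hits:
rate = failure_rate(0.5, 0.90, 0.80)
print(rate)  # ≈ 0.01 failures per mile-year
```

Note how this structure avoids the reweighting problem the abstract mentions: a new mitigation technology simply changes the `mitigation` fraction for one threat, rather than forcing a rebalancing of weights across the whole model.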

Author(s):  
Srushti Gajjar ◽  
Mrugendrasinh Rahevar

Innovation in IT leads to new developments within organizations, and companies must respond quickly to changing trends in order to stay competitive. ITIL change management allows companies to introduce new technologies without interruption or downtime. It follows a standard practice to avoid unwanted interruptions and involves the evaluation, planning, and approval of changes. Change management is fundamentally about managing risk for the company and is linked to the company's perception of risk. Risk analysis is a primary component of any software change, so organizations are concerned with risk management, whose aim is better performance through identifying and assessing risk in a systematic manner. In ITIL change management, risk assessment is a manual process. Automating risk analysis would have enormous benefits, such as reducing downtime and maximizing productivity. This paper therefore surveys supervised machine learning algorithms for this task, such as support vector machines, Naive Bayes, and logistic regression.
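As a hedged sketch of the kind of automation the abstract describes, the snippet below hand-rolls a minimal Naive Bayes classifier (one of the surveyed algorithm families) that labels change requests "high" or "low" risk from binary features. The feature names and training examples are invented for illustration; a real system would use a library implementation and historical change records:

```python
import math
from collections import Counter, defaultdict

def train(samples):
    """samples: list of (feature_dict, label) pairs with boolean features.
    Returns (class counts, per-class feature counts, total sample count)."""
    priors = Counter(label for _, label in samples)
    counts = defaultdict(Counter)  # counts[label][feature] = times feature was True
    for feats, label in samples:
        for f, v in feats.items():
            if v:
                counts[label][f] += 1
    return priors, counts, len(samples)

def predict(model, feats):
    """Pick the label maximizing log P(label) + sum of log P(feature | label),
    with Laplace (add-one) smoothing on the per-feature probabilities."""
    priors, counts, n = model
    best, best_score = None, float("-inf")
    for label, prior in priors.items():
        score = math.log(prior / n)
        for f, v in feats.items():
            p_true = (counts[label][f] + 1) / (priors[label] + 2)
            score += math.log(p_true if v else 1 - p_true)
        if score > best_score:
            best, best_score = label, score
    return best

# Hypothetical training data: past change requests and their risk labels.
training = [
    ({"touches_prod_db": True,  "has_rollback": False, "peak_hours": True},  "high"),
    ({"touches_prod_db": True,  "has_rollback": False, "peak_hours": False}, "high"),
    ({"touches_prod_db": False, "has_rollback": True,  "peak_hours": False}, "low"),
    ({"touches_prod_db": False, "has_rollback": True,  "peak_hours": True},  "low"),
]
model = train(training)
print(predict(model, {"touches_prod_db": True, "has_rollback": False, "peak_hours": True}))
```

The same interface could be backed by a support vector machine or logistic regression, which is essentially the comparison the surveyed literature performs.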


2016 ◽  
Vol 22 (3) ◽  
pp. 346-356 ◽  
Author(s):  
Hamidreza ABBASIANJAHROMI ◽  
Hossein RAJAIE ◽  
Eghbal SHAKERI ◽  
Omid KAZEMI

Various challenges, such as new technologies, growing complexity, and a competitive environment, require the main contractor to assign some of a project's tasks to other parties, the so-called subcontractors. Although subcontracting is a usual phenomenon in the construction industry, insufficient attention to the subcontractor selection strategy may pose major threats to a project. Given the significance of such risks, optimizing subcontractor selection is essential to the success of the project. The importance of risk management in selecting subcontractors and the direct relation between risk and return in most projects are the two main motives for using the portfolio concept in this paper. The main objective of this paper is to propose a model that allocates the best portion of the project's tasks to subcontractors in order to reach the optimal portfolio of subcontractors and main contractor. Because this is a new approach in subcontractor management, an illustrative example follows the presentation of the model for better understanding.


2014 ◽  
Vol 2 (2) ◽  
pp. 1333-1365
Author(s):  
A. Delonca ◽  
Y. Gunzburger ◽  
T. Verdel

Abstract. Rockfalls are major and essentially unpredictable sources of danger, particularly along transportation routes (roads and railways). Thus, assessing their probability of occurrence is a major challenge for risk management. From a qualitative perspective, experience has shown that rockfalls occur mainly during periods of rain, snowmelt, or freeze–thaw. Nevertheless, from a quantitative perspective, these generally assumed correlations between rockfalls and their possible meteorological triggering events are often difficult to identify because (i) rockfalls are too rare for classical statistical analysis techniques and (ii) not all intensities of triggering factors have the same probability. In this study, we propose a new approach to investigating the correlation of rockfalls with rain, freezing periods, and strong temperature variations. This approach is tested on three French rockfall databases, the first of which exhibits a high frequency of rockfalls (approximately 950 events over 11 yr), whereas the other two are more typical (approximately 140 events over 11 yr). These databases cover (1) the national highway RN1 on La-Réunion Island, (2) a railway in the Bourgogne region, and (3) a railway in the Auvergne region. Whereas a basic correlation analysis is only able to highlight an already obvious correlation in the case of the "rich" database, the newly suggested method appears to detect correlations in the "poor" databases as well. This new approach is easy to use and serves to identify the conditional probability of rockfall according to the selected meteorological factor. It will help to optimize risk management in the considered areas with respect to their meteorological conditions.
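The conditional-probability idea at the heart of this abstract can be sketched with a simple day-counting estimate. The rainfall classes, thresholds, and data below are invented for illustration and are not taken from the paper's databases:

```python
# Sketch: estimate P(rockfall on a given day | daily rainfall class)
# by counting days in each class. All thresholds and data are hypothetical.

def conditional_rockfall_prob(daily_rain_mm, rockfall_days):
    """daily_rain_mm: rainfall per day (mm); rockfall_days: set of day indices
    on which at least one rockfall occurred. Returns P(rockfall | rain class)."""
    day_counts = {"dry": 0, "moderate": 0, "heavy": 0}
    hit_counts = {"dry": 0, "moderate": 0, "heavy": 0}
    for day, rain in enumerate(daily_rain_mm):
        cls = "dry" if rain == 0 else ("moderate" if rain < 20 else "heavy")
        day_counts[cls] += 1
        if day in rockfall_days:
            hit_counts[cls] += 1
    return {c: (hit_counts[c] / day_counts[c] if day_counts[c] else None)
            for c in day_counts}

rain = [0, 0, 5, 30, 0, 25, 12, 0, 40, 0]  # ten days of rainfall, in mm
falls = {3, 5, 8}                          # rockfalls on the heaviest-rain days
print(conditional_rockfall_prob(rain, falls))
```

Conditioning on the class of the triggering factor in this way is what lets rare events become visible: even with few rockfalls overall, the probability within the "heavy" class can stand out clearly against the baseline.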


2014 ◽  
Vol 14 (8) ◽  
pp. 1953-1964 ◽  
Author(s):  
A. Delonca ◽  
Y. Gunzburger ◽  
T. Verdel

Abstract. Rockfalls are major and essentially unpredictable sources of danger, particularly along transportation routes (roads and railways). Thus, the assessment of their probability of occurrence is a major challenge for risk management. From a qualitative perspective, it is known that rockfalls occur mainly during periods of rain, snowmelt, or freeze–thaw. Nevertheless, from a quantitative perspective, these generally assumed correlations between rockfalls and their possible meteorological triggering events are often difficult to identify because (i) rockfalls are too rare for the use of classical statistical analysis techniques and (ii) not all intensities of triggering factors have the same probability. In this study, we propose a new approach for investigating the correlation of rockfalls with rain, freezing periods, and strong temperature variations. This approach is tested on three French rockfall databases, the first of which exhibits a high frequency of rockfalls (approximately 950 events over 11 years), whereas the other two databases are more typical (approximately 140 events over 11 years). These databases come from (1) national highway RN1 on Réunion, (2) a railway in Burgundy, and (3) a railway in Auvergne. Whereas a basic correlation analysis is only able to highlight an already obvious correlation in the case of the "rich" database, the newly suggested method appears to detect correlations even in the "poor" databases. Indeed, the use of this method confirms the positive correlation between rainfall and rockfalls in the Réunion database. This method highlights a correlation between cumulative rainfall and rockfalls in Burgundy, and it detects a correlation between the daily minimum temperature and rockfalls in the Auvergne database. This new approach is easy to use and also serves to determine the conditional probability of rockfall according to a given meteorological factor.
The approach will help to optimize risk management in the studied areas based on their meteorological conditions.


1997 ◽  
Vol 488 ◽  
Author(s):  
P. N. Prasad ◽  
N. Deepak Kumar ◽  
Manjari Lal ◽  
Mukesh P. Joshi

Abstract. Nanoscale synthesis and processing provide a novel approach for making a new generation of nanocomposite materials with the exceptional optical and electrical properties needed for the development of new technologies. This presentation focuses on the preparation of nanocomposites made of poly(para-phenylene vinylene) (PPV) with other polymers, inorganic glasses, and semiconductors. We present a new approach of nanoscale polymerization for making processable, monodispersed oligomeric species of PPV, which uses the base-catalyzed polymerization of the PPV monomer within the cavity of a reverse-micelle nanoreactor. The application of this approach to fabricating novel materials for a variety of photonics applications is also discussed. In addition, we discuss the fabrication of bulk nanocomposites of PPV and silica by in situ polymerization of the monomer within a porous glass, and their lasing properties.


2019 ◽  
Vol 16 (6) ◽  
pp. 60-77
Author(s):  
E. V. Vasilieva ◽  
T. V. Gaibova

This paper describes a method of project risk analysis based on design thinking and explores the possibility of its application to industrial investment projects. Traditional and suggested approaches to project risk management are compared. Several risk analysis artifacts have been added to the standard list of artifacts, and an iterative procedure for the formation of risk analysis artifacts has been developed, with the purpose of integrating the risk management process into strategic and prompt decision-making during project management. A list of tools for each stage of design thinking for risk management within the framework of real investment projects is proposed. The suggested technology helps to determine project objectives and content and adapt them with regard to possible risks; to implement measures aimed at reducing these risks; to increase the productivity of existing risk assessment and risk management tools; to organize effective cooperation between project team members; and to promote the accumulation of knowledge about the project during its development and implementation.

The authors declare no conflict of interest.


2006 ◽  
Vol 1 (2) ◽  
Author(s):  
B.H. MacGillivray ◽  
P.D. Hamilton ◽  
S.E. Hrudey ◽  
L. Reekie ◽  
S.J.T. Pollard

Risk analysis in the water utility sector is fast becoming explicit. Here, we describe application of a capability model to benchmark the risk analysis maturity of a sub-sample of eight water utilities from the USA, the UK and Australia. Our analysis codifies risk analysis practice and offers practical guidance as to how utilities may more effectively employ their portfolio of risk analysis techniques for optimal, credible, and defensible decision making.


Author(s):  
David D. Nolte

Galileo’s parabolic trajectory launched a new approach to physics that was taken up by a new generation of scientists like Isaac Newton, Robert Hooke and Edmund Halley. The English Newtonian tradition was adopted by ambitious French iconoclasts who championed Newton over their own Descartes. Chief among these was Pierre Maupertuis, whose principle of least action was developed by Leonhard Euler and Joseph Lagrange into a rigorous new science of dynamics. Along the way, Maupertuis became embroiled in a famous dispute that entangled the King of Prussia as well as the volatile Voltaire who was mourning the death of his mistress Emilie du Chatelet, the lone female French physicist of the eighteenth century.


Electronics ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 739
Author(s):  
Nicholas Ayres ◽  
Lipika Deka ◽  
Daniel Paluszczyszyn

The vehicle-embedded system, also known as the electronic control unit (ECU), has transformed the humble motorcar, making it more efficient, environmentally friendly, and safer, but has led to a system which is highly dependent on software. As new technologies and features are included with each new vehicle model, this reliance on software will no doubt continue to increase. It is an undeniable fact that all software contains bugs, errors, and potential vulnerabilities, which when discovered must be addressed in a timely manner, primarily through patching and updates, to preserve vehicle and occupant safety and integrity. However, current automotive software updating practices are ad hoc at best and often follow the same inefficient fix mechanisms associated with a physical component failure: return or recall. Increasing vehicle connectivity heralds the potential for over-the-air (OtA) software updates, but rigid ECU hardware design does not often facilitate or enable OtA updating. Addressing these issues requires a new approach to how automotive software is deployed to the ECU. This paper presents how lightweight virtualisation technologies known as containers can promote efficient automotive ECU software updates. ECU functional software can be deployed to a container built from an associated image, and container images promote efficiency in download size and time through layer sharing, similar to ECU difference or delta flashing. Through containers and connectivity, future OtA software updates can be completed without inconvenience to the consumer or expense to the manufacturer.

