Analysis of the National Energy Board Pipeline Integrity Performance Measures Data

Author(s):
Joe Paviglianiti
Sarah Shaw

The National Energy Board’s (NEB or the Board) top priorities are the safety of people and protection of the environment. NEB-regulated pipelines have a very good safety record; however, the NEB noticed an increasing trend in some types of incidents. Therefore, after considerable stakeholder consultation, in March 2012 the NEB began requiring NEB-regulated companies to report annually on new pipeline performance measures. These performance measures were developed and introduced to promote continual improvement in the management of pipelines by allowing companies to compare their results with industry aggregate numbers. In addition, these metric results allow the NEB both to evaluate and to demonstrate that pipeline companies are effective in managing pipeline safety and protection of the environment. The NEB requires all regulated companies to report on incidents, such as releases of substances and serious injuries. Pipeline performance measures data provide the Board with additional information in the form of lagging and leading indicators. Lagging indicators provide a historical view of company performance, while leading indicators provide forward-looking data on potential future events. The NEB is of the view that an amalgamation of leading, lagging and qualitative measures can provide an overview of company effectiveness in meeting foundational management system program objectives. This paper examines four years of reported integrity-related performance and integrity inspection data to evaluate trends in the activities companies undertake to maintain safe pipelines. The paper briefly discusses the challenges encountered in developing the measures, obtaining consistent data and evaluating the data to identify trends. It concludes by summarizing select results of the integrity performance measures and integrity inspection data, and discusses potential future actions related to the pipeline integrity performance measures.

2012
Vol 2 (1)
pp. 10-15
Author(s):
Morten Jakobsen
Rainer Lueg

The Balanced Scorecard (BSC) claims to maximize organizational performance through the management of different perspectives (e.g., financial, customers, internal processes, learning & growth). Most of the chosen measures are usually non-financial, as they are supposedly leading indicators of financial success. The BSC's developers, Kaplan and Norton, see these perspectives as related, but not as linked to each other by accounting logic. Moreover, Kaplan and Norton recommend cascading the BSC across the organization by breaking it up into sub-targets for each organizational unit. Inevitably, this can lead to situations where actors in an organization focus on a subset of non-financial indicators. In their attempt to maximize these indicators, unit egoism may lead to sub-optimal overall performance of the organization, because the link from non-financial indicators at lower levels of the organization to the overall financial goals has been severed. This problem, however, has been largely ignored in the BSC literature. Therefore, this paper addresses the rationality and limits inherent in the usage of multiple performance measures. For this, we conduct an analytical study based on a literature review.


2004
Vol 10 (5)
pp. 1079-1110
Author(s):
Y. Shiu

Dynamic financial analysis has become one of the important tools that actuaries use to model the underwriting and investment operations of insurance companies. The first step in carrying out the analysis is to investigate the most important factors affecting company performance. This paper identifies the determinants of the performance of United Kingdom general insurance companies using a panel data set consisting of economic data and Financial Services Authority/Department of Trade and Industry returns over the period 1986 to 1999. Three performance measures are used to capture different aspects of insurance operations. These measures are related to a number of economic and firm-specific variables, chosen on the basis of relevant theory and literature. An ordinary least squares regression model and two panel data models are estimated for each of the three performance measures. This paper also addresses several important econometric problems that are usually ignored in applied work in the context of panel data analysis. Based on the empirical results, this study finds that liquidity, unexpected inflation, interest rate level and underwriting profits are statistically significant determinants of the performance of U.K. general insurers.
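As a sketch of the estimation approach this abstract describes, the following compares a pooled OLS fit with a fixed-effects (within) panel estimator on a toy insurer panel. The variable names and data are invented for illustration, not the paper's actual FSA/DTI dataset:

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares with an intercept, via a least-squares solve."""
    beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(y)), X]), y, rcond=None)
    return beta  # [intercept, slopes...]

def within_estimator(X, y, firm_ids):
    """Fixed-effects (within) estimator: demean each firm's observations to
    sweep out time-invariant firm effects, then run OLS without an intercept."""
    Xd, yd = X.astype(float).copy(), y.astype(float).copy()
    for f in np.unique(firm_ids):
        m = firm_ids == f
        Xd[m] -= Xd[m].mean(axis=0)
        yd[m] -= yd[m].mean()
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta

# Toy panel: two insurers over three years, performance driven by liquidity
# plus an unobserved firm-specific effect.
rng = np.random.default_rng(0)
firms = np.repeat([0, 1], 3)
liquidity = rng.normal(size=6).reshape(-1, 1)
firm_effect = np.where(firms == 0, 2.0, -1.0)
perf = 0.5 * liquidity[:, 0] + firm_effect + rng.normal(scale=0.01, size=6)

print(ols(liquidity, perf))                      # pooled OLS, biased by firm effects
print(within_estimator(liquidity, perf, firms))  # within estimate, close to 0.5
```

The within transformation removes each firm's fixed effect exactly, which is why the second estimate recovers the true slope while the pooled fit need not.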


2005
Vol 80 (1)
pp. 243-268
Author(s):
Susan D. Krische

Schrand and Walther's (2000) archival evidence suggests that managers strategically disclose prior-period benchmarks in current earnings announcements, which, in turn, influences investors' judgments. Using a controlled experimental setting, I present evidence confirming that a transparent description of a transitory prior-period gain or loss affects how investors apply prior-period earnings when evaluating current-period earnings. I also provide evidence that this effect is likely to be unintentional on the part of investors, resulting from limitations in their memory for the prior-period event. Overall, the experimental results suggest that a quantitative description of the transitory prior-period gain or loss in a current earnings announcement helps investors to evaluate company performance. The results also highlight the need for consistency in reporting non-GAAP financial performance measures.


2021
Vol 23 (1)
pp. 336
Author(s):
Michele Provenzano
Raffaele Serra
Carlo Garofalo
Ashour Michael
Giuseppina Crugliano
...

Chronic kidney disease (CKD) patients are characterized by a high residual risk for cardiovascular (CV) events and CKD progression. This has prompted the implementation of new prognostic and predictive biomarkers with the aim of mitigating this risk. The ‘omics’ techniques, namely genomics, proteomics, metabolomics, and transcriptomics, are excellent candidates to provide a better understanding of pathophysiologic mechanisms of disease in CKD, to improve risk stratification of patients with respect to future cardiovascular events, and to identify CKD patients who are likely to respond to a treatment. Following such a strategy, a reliable risk of future events for a particular patient may be calculated, and consequently the patient would also benefit from the best available treatment based on their risk profile. Moreover, a further step forward can be taken by aggregating multiple omics information, combining different techniques and/or different biological samples. This has already been shown to yield additional information by revealing the exact individual pathway of disease with greater accuracy.


2012
Vol 5 (2)
pp. 44-74
Author(s):
Maria Putintseva

Prediction (or information) markets are markets where participants trade contracts whose payoff depends on unknown future events. Studying prediction markets makes it possible to avoid many problems that arise in artificially designed behavioral experiments investigating collective decision making or individuals' belief formation. This work aims, first, to verify whether the predictions implied by the prices of binary options traded in information markets are reliable, and whether the prices contain additional information about the future compared with the information available from the dynamics of the underlying asset alone. Second, the inter- and intraday microstructure of the market for binary options on the Dow Jones Industrial Average index is examined and described quantitatively. Third, since some ability to forecast future changes in the underlying asset is detected, a simple trading strategy based on observing the trading process in the prediction market is suggested, and its profitability and applicability are evaluated.
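The premise behind such a reliability check, reading a binary option's price as the market's implied event probability and scoring it against realized outcomes, can be sketched as follows. The prices and outcomes below are invented for illustration:

```python
def implied_prob(price, payoff=1.0):
    """Read a binary option's price as the market's implied probability:
    the price expressed as a fraction of the contract's fixed payoff."""
    return price / payoff

def brier_score(probs, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes;
    lower is better (0 = perfect, 0.25 = an uninformative coin flip)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Hypothetical contracts paying $1 if an index closes above a strike;
# prices in dollars, outcomes are the realized events.
prices = [0.80, 0.30, 0.55, 0.10]
outcomes = [1, 0, 1, 0]
probs = [implied_prob(p) for p in prices]
print(brier_score(probs, outcomes))  # approx. 0.086, well below 0.25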


1986
Vol 11 (1)
pp. 19-26
Author(s):
Richard B. Robinson
Moragea Y. Salem
John E. Logan
John A. Pearce

This study examined the relationship between company performance and 50 specific planning activities in a small, independent retail firm setting. Performance was measured using one set of objective measures and one set of subjective measures. Six specific planning activities emerged as having significant relationships with both sets of performance measures.


Kybernetes
2021
Vol ahead-of-print (ahead-of-print)
Author(s):
Dušan Gošnik
Igor Stubelj

Purpose This paper aims to examine the relationship between business process management (BPM) and company performance. The research focuses on the instrumental aspect of core business processes and their controlling activities in small and medium-sized enterprises (SMEs) to identify the relationship to company performance.

Design/methodology/approach The results presented in this paper are based on a survey of Slovene SMEs. A questionnaire was distributed to 3007 SMEs via e-mail and a response rate of 5.42% was achieved. The financial data of the companies over a six-year period, derived from their publicly available financial reports, were combined with an industry-specific financial risk measure and other financial data to calculate the company risk-adjusted performance measures relative residual income (ROE-r) and risk-adjusted ROE (ROE-a).

Findings The results show that the instrumental aspects of core business process controlling activities are related to the risk-adjusted company performance measures ROE-r and ROE-a. Companies with lower ROE-r and ROE-a were perceived to be more focused on the instrumental aspect of BPM. Presumably due to the small sample, the results of a non-parametric Mann–Whitney U test did not statistically confirm the developed hypothesis: “the instrumental aspect of controlling as a core process management activity has a statistically significant impact on company risk-adjusted performance measures such as ROE-r and ROE-a.” Despite this, the results show a possible negative correlation between risk-adjusted performance measures and BPM, which opens possibilities for further research.

Research limitations/implications The main limitation of the proposed study model is that the paper studied only the controlling activities of core business processes and related them to company risk-adjusted performance measures. The study was limited by the SME sample and the use of a survey as the research instrument. An additional limitation is the degree of reliability implied by the assumptions of the models used to estimate the required return on equity and risk. The results can help investors, managers and practitioners start BPM improvement initiatives, set BPM priority measures, and prioritize management decisions and further actions.

Originality/value This paper presents unique findings from an investigation of the instrumental aspects of BPM practices and their relationship to company risk-adjusted performance measures in SMEs. The paper developed a measurement instrument for measuring the instrumental aspects of BPM use. An additional original contribution is the use of company risk-adjusted performance measures such as ROE-r and ROE-a, which take into account the profitability that companies in different industries are required to achieve given their risk, allowing comparison of results across industries. The approach is innovative as regards researching the factors that affect the profitability of companies operating in different industries.
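The abstract does not reproduce the exact formulas for ROE-r and ROE-a, so the following is only a rough sketch of the idea: it assumes ROE-r is ROE minus an industry-risk-based required return on equity, and ROE-a is ROE rescaled by relative industry risk. All definitions and numbers below are illustrative assumptions, not the paper's actual method:

```python
def roe(net_income, equity):
    """Plain return on equity."""
    return net_income / equity

def roe_r(company_roe, required_roe):
    """Relative residual income (assumed definition): excess return over
    the required return on equity implied by the industry's risk."""
    return company_roe - required_roe

def roe_a(company_roe, industry_risk, market_risk):
    """Risk-adjusted ROE (assumed definition): ROE rescaled by the
    market-to-industry risk ratio, penalizing riskier industries."""
    return company_roe * (market_risk / industry_risk)

r = roe(net_income=120_000, equity=1_000_000)        # 0.12
print(roe_r(r, required_roe=0.09))                   # approx. 0.03 excess return
print(roe_a(r, industry_risk=1.2, market_risk=1.0))  # approx. 0.10
```

The point of either adjustment is the same as in the abstract: a 12% ROE in a high-risk industry is worth less than the same 12% in a low-risk one, so raw ROE is not comparable across industries.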


2006
Vol 17 (1)
pp. 61-68
Author(s):
Carol J Johnson
Lidiya Sokhnich
Charles Ng

This paper explores the role that several supply chain dimensions play in achieving overall firm performance. Measures suggested in prior studies were factor analyzed for convergent and discriminant validity and then used in a regression model. The study uses data from Council of Supply Chain Management Professionals (CSCMP) member firms, with top-level supply chain managers as informants. The results suggest that, of the three dimensions tested, two, customer service and business process usage, are significant contributors to firm profitability. Relationship confidence was not found to significantly impact overall firm performance.


2020
Author(s):
Samantha Engwell
Thomas Aubry
Sebastien Biass
Costanza Bonadonna
Marcus Bursik
...

Eruptive column models are crucial for managing volcanic crises, forecasting future events, and reconstructing past eruptions. Given their central role in volcanology and the large uncertainties weakening their predictions, the evaluation and improvement of these models is critical. Such evaluation is challenging as it requires independent estimates of the main model inputs (e.g. mass eruption rate) and outputs (e.g. column height). Despite recent efforts to extend datasets of independently estimated eruption source parameters (ESP) (e.g. Mastin 2014, Aubry et al. 2017), there is no standardized, maintained, and community-based ESP database devoted to the evaluation of eruptive column models.

Here we present a new ESP database designed to respond to the needs of the plume modelling community, and which will also be valuable to observatories, field volcanologists, and volcanic ash advisory centers. We compiled data for over 130 eruptive events with independent estimates of: i) the mass eruption rate; ii) the height reached by the column; and iii) atmospheric conditions during the eruption. In contrast with previous ESP datasets, we distinguish estimates of column height that relate to different phases (ash and SO2) and parts of the column (plume top or umbrella). We additionally provide the total grain size distribution, uncertainties in eruption parameters, and multiple sources for atmospheric profiles for events where these parameters are available. The database also includes a wealth of additional information which will enable modelers to distinguish between different eruptions when evaluating or calibrating models. This includes the type of eruption, the morphology of the plume (weak/transitional/strong), and the occurrence of, and mass entrained within, pyroclastic density currents.

We will apply the new database to revisit empirical scaling relationships between the mass eruption rate and “plume height”. In particular, we will show how such relationships depend on the type of height (e.g. SO2 height vs. ash top height) and eruption (e.g. magmatic vs. phreatomagmatic) considered. We will also discuss the difficulties and limitations of compiling ESP estimates from the literature, as well as of characterizing fundamentally unsteady volcanic events by a single value for each ESP.
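Empirical scaling relationships of the kind mentioned above are commonly fitted as a power law H = a * MER^b by linear regression in log-log space. The sketch below illustrates the procedure on synthetic events generated from a known scaling, not the database's actual values:

```python
import numpy as np

def fit_power_law(mer, height):
    """Fit log10(H) = log10(a) + b*log10(MER) by least squares and
    return the power-law parameters (a, b)."""
    b, log_a = np.polyfit(np.log10(mer), np.log10(height), 1)
    return 10 ** log_a, b

# Synthetic eruptive events: mass eruption rate (kg/s) spanning six orders
# of magnitude, heights (km) generated from a known scaling plus log-noise.
rng = np.random.default_rng(1)
mer = 10 ** rng.uniform(3, 9, size=50)
height = 0.3 * mer ** 0.24 * 10 ** rng.normal(scale=0.02, size=50)

a, b = fit_power_law(mer, height)
print(a, b)  # recovers roughly a = 0.3, b = 0.24
```

Fitting in log space rather than directly on (MER, H) keeps the six-decade range of eruption rates from letting the largest events dominate the fit.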


2011
Vol 9 (3)
pp. 69
Author(s):
Lawrence M. Metzger

Quality has become more important than ever in the production of goods and services. Money spent on quality must ultimately contribute to increased productivity within the firm. Quantifying the benefits of quality costs with respect to increased productivity is an elusive proposition. This paper describes the use of a linear programming technique called data envelopment analysis (DEA) to measure the effects of appraisal and prevention costs on productivity. DEA provides additional information to the usual profit-driven performance measures and can be used to determine strategies to help improve the efficiency of money spent on quality.
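In the general case DEA solves one linear program per decision-making unit; in the simplified single-input, single-output case the CCR efficiency score reduces to each unit's output/input ratio relative to the best ratio observed. The short sketch below shows that special case; the plants and figures are hypothetical:

```python
def dea_single(inputs, outputs):
    """CCR efficiency for one input and one output: each unit's
    output/input ratio divided by the best ratio in the sample, so the
    frontier unit scores exactly 1.0 and the rest score below it."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Quality (prevention + appraisal) spend per plant vs. defect-free output.
quality_cost = [100, 150, 120]   # $000s spent on quality
good_output = [500, 600, 660]    # defect-free units produced
print(dea_single(quality_cost, good_output))  # third plant is efficient (1.0)
```

With multiple inputs or outputs this shortcut no longer applies, and each unit's score comes from a linear program that chooses the weights most favorable to that unit, which is the setting the paper's DEA formulation addresses.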

