4D Inspection: A Comprehensive Platform to Digitize Pipeline Construction Inspection and Generate Data Driven Continuous Improvement

Author(s):  
Sukhi Gill
Brett Vogt

Abstract In 2018, TC Energy began an effort to digitize inspection and construction management, internally naming the concept the Dynamic, Digital, Data and Diagnostics platform, otherwise known as “4D Inspection”. The 4D Inspection platform is built upon Project Consulting Services’ Epilogue® energy infrastructure construction management solution and is intended to move inspection reporting to a digital platform, creating standardized construction reporting embedded with real-time compliance validations for efficient and effective management of field construction issues, progress tracking, and overall construction quality monitoring. The effort is nearing the end of the first year of a multi-year implementation plan; the platform is in use on projects in both Canada and the United States, spanning four time zones, with more than 595 inspectors on over $7.5 billion in capital projects. Even with implementation still underway, key functionality such as automated inspection report document control, simplified photo capture with geo-tagging, and automated daily progress reporting has already yielded construction data that are more thorough, more reliable, and more efficiently gathered than under previous data collection processes. With improved data accuracy and detail, TC Energy gains insight, and even foresight, into how best to advance construction quality and safety while also improving overall project costs and schedule.
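The abstract highlights simplified photo capture with geo-tagging as one source of more reliable field data. As an illustration only (the record structure and field names below are hypothetical, not the Epilogue® schema), a geo-tagged inspection photo record might be as minimal as:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class InspectionPhoto:
    """Hypothetical record for a geo-tagged field photo (illustration only)."""
    inspector: str
    filename: str
    latitude: float
    longitude: float
    # capture the timestamp automatically, in UTC, when the record is created
    taken_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# example record for a weld photo taken in the field
photo = InspectionPhoto("J. Doe", "weld_107.jpg", 51.045, -114.057)
print(photo.latitude, photo.longitude)
```

Attaching coordinates and a timestamp at capture time is what makes later document control and progress reporting automatic, rather than reconstructed from paper notes.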

1986
Vol 108 (3)
pp. 432-439
Author(s):  
B. J. Bodnaruk

The Great Plains Gasification Project is the first commercial-sized plant to produce substitute natural gas from coal in the United States. The plant is designed to convert 14,000 tons/day of North Dakota lignite into 137.5 million standard cubic feet of gas per day. Plant construction was completed to the original design, on schedule, and on budget, and the plant was successfully turned over from construction to operations as planned. With the completion of the capital projects being implemented at the plant, plans are to achieve a 70 percent stream factor in the first year of production (1985). The DOE Chicago Operations Office has been assigned responsibility for monitoring the project’s performance against baselines of cost, schedule, and technical criteria. During the startup phase of the project, significant technological advancements have been made and considerable knowledge has been gained, both by the operators and by DOE, this being a first-of-a-kind plant built in the U.S.


Author(s):  
Yochai Benkler
Robert Faris
Hal Roberts

This chapter presents the book’s macrolevel findings about the architecture of political communication and the news media ecosystem in the United States from 2015 to 2018. Two million stories published during the 2016 presidential election campaign are analyzed, along with another 1.9 million stories about Donald Trump’s presidency during his first year. The chapter examines patterns of interlinking between online media sources to understand the relations of authority and credibility among publishers, as well as the media sharing practices of Twitter and Facebook users to elucidate social media attention patterns. The data and mapping reveal not only a profoundly polarized media landscape but stark asymmetry: the right is more insular, skewed towards the extreme, and set apart from the more integrated media ecosystem of the center, center-left, and left.


2021
pp. 155982762110181
Author(s):  
Sam Sugimoto
Drew Recker
Elizabeth E. Halvorson
Joseph A. Skelton

Background. Many diseases in the United States are linked to lifestyle, yet physicians receive little training in nutrition, and medical students’ prior knowledge of nutrition and cooking is unknown. Objective. To determine incoming medical students’ prior nutrition knowledge, culinary skills, and nutrition habits. Methods. A dual-methods study of first-year medical students: a cross-sectional survey assessing prior knowledge, self-efficacy, and previous education in cooking and nutrition, plus interviews of second-year medical students exploring cooking and nutrition in greater depth. Results. A total of 142 first-year medical students participated; 16% had taken a nutrition course, with the majority (66%) learning outside classroom settings. Students had a mean score of 87% on the Nutritional Knowledge Questionnaire, versus 64.9% for a comparison group. Mean cooking and food skills scores were lower than comparison scores. Overall, students did not meet guidelines for fiber, fruit, vegetables, and whole grains. Interviews with second-year students revealed that most learned to cook from their families; all believed it important for physicians to have this knowledge. Conclusions. Medical students were knowledgeable about nutrition, but typically self-taught. They were not as confident or skilled in cooking, having mostly learned from their families, and they expressed interest in learning more about nutrition and cooking.


2018
Vol 19 (2)
pp. 195-209
Author(s):  
Zachary W. Taylor

This study examines first-year undergraduate admissions materials from 325 bachelor’s degree-granting U.S. institutions, closely analyzing the English-language readability and the Spanish-language readability and translation of these materials. Viewed through Yosso’s concept of linguistic capital, the results reveal that 4.9% of first-year undergraduate admissions materials had been translated into Spanish, that 4% of institutional admissions websites embed translation widgets, and that the average readability of English-language content is above the 13th-grade reading level. Implications for research and practice are discussed.
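The study reports English-language readability above the 13th-grade level. As a sketch of how such a grade score can be computed (the abstract does not state which index the study used; this uses the Flesch-Kincaid grade-level formula with a crude vowel-group syllable heuristic):

```python
import re

def count_syllables(word):
    # crude heuristic: count vowel groups, dropping a trailing silent 'e'
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def fk_grade(text):
    # Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59
```

A grade above 13 means the text demands more than a high-school reading level; admissions materials scoring there are, by this measure, harder to read than most incoming students’ expected level.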


Author(s):  
Sarah L. Jackson
Sahar Derakhshan
Leah Blackwood
Logan Lee
Qian Huang
...  

This paper examines the spatial and temporal trends in county-level COVID-19 cases and fatalities in the United States during the first year of the pandemic (January 2020–January 2021). Statistical and geospatial analyses highlight greater impacts in the Great Plains, Southwestern, and Southern regions based on cases and fatalities per 100,000 population. Significant case and fatality spatial clusters were most prevalent between November 2020 and January 2021. Distinct urban–rural differences in COVID-19 experiences uncovered higher rural cases and fatalities per 100,000 population and fewer government mitigation actions enacted in rural counties. High levels of social vulnerability and the absence of mitigation policies were significantly associated with higher fatalities, while existing community resilience had more influential spatial explanatory power. Using the difference in percentage unemployment change between 2019 and 2020 as a proxy for pre-emergent recovery revealed that urban counties were hit harder in the early months of the pandemic, corresponding with imposed government mitigation policies. This longitudinal, place-based study confirms some early urban–rural patterns initially observed in the pandemic, as well as the disparate COVID-19 experiences among socially vulnerable populations. The results are critical for identifying geographic disparities in COVID-19 exposures and outcomes and for providing the evidentiary basis for targeting pandemic recovery.
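The urban–rural comparisons above rest on a simple normalization: counts per 100,000 population, which lets a small rural county be compared fairly against a large urban one. A minimal sketch with hypothetical county figures (names and numbers are invented for illustration):

```python
from statistics import mean

# hypothetical county records (name, urban flag, cases, population);
# the numbers are invented for illustration only
counties = [
    ("A", True, 5_200, 850_000),
    ("B", True, 3_100, 600_000),
    ("C", False, 900, 45_000),
    ("D", False, 400, 18_000),
]

def per_100k(count, population):
    # normalize a raw count to a rate per 100,000 residents
    return 100_000 * count / population

urban = [per_100k(c, p) for _, is_urban, c, p in counties if is_urban]
rural = [per_100k(c, p) for _, is_urban, c, p in counties if not is_urban]
print(mean(rural) > mean(urban))  # True: higher rural per-capita burden here
```

Note how the rural counties have far fewer raw cases yet much higher rates once normalized, which is the pattern the paper reports nationally.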


2014
Vol 41 (6)
pp. 499
Author(s):  
David J. Will
Karl J. Campbell
Nick D. Holmes

Context Worldwide, invasive vertebrate eradication campaigns are increasing in scale and complexity, requiring improved decision-making tools to achieve and validate success. For managers of these campaigns, gaining access to timely summaries of field data can increase cost-efficiency and the likelihood of success, particularly for successive control-event style eradications. Conventional data collection techniques can be time intensive and burdensome to process. Recent advances in digital tools can reduce the time required to collect and process field information. Through timely analysis, efficiently collected data can inform decision making for managers both tactically, such as where to prioritise search effort, and strategically, such as when to transition from the eradication phase to confirmation monitoring. Aims We highlighted the advantages of using digital data collection tools, particularly the potential for reduced project costs through a decrease in effort and the ability to increase eradication efficiency by enabling explicit data-informed decision making. Methods We designed and utilised digital data collection tools, relational databases and a suite of analyses during two different eradication campaigns to inform management decisions: a feral cat eradication utilising trapping, and a rodent eradication using bait stations. Key results By using digital data collection during a 2-year feral cat eradication, we experienced an 89% reduction in data collection effort and an estimated USD 42,845 reduction in total costs compared with conventional paper methods. During a 2-month rodent bait station eradication, we experienced an 84% reduction in data collection effort and an estimated USD 4,525 increase in total costs. Conclusions Despite high initial capital costs, digital data collection systems become increasingly economical as the duration and scale of a campaign increase. Initial investments can be recouped by reusing equipment and software on subsequent projects, making digital data collection more cost-effective for programs contemplating multiple eradications. Implications With proper pre-planning, digital data collection systems can be integrated with quantitative models that generate timely forecasts of the effort required to remove all target animals and estimate the probability that eradication has been achieved to a desired level of confidence, thus improving decision-making power and further reducing total project costs.
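The break-even logic in the conclusions can be made concrete: reusable equipment and software are an up-front capital cost that is recouped once enough campaigns have each saved some operating cost. A minimal sketch with hypothetical figures (the abstract reports cost changes for two specific campaigns, not a general capital cost or per-campaign saving):

```python
import math

def campaigns_to_break_even(capital_cost, per_campaign_savings):
    # number of campaigns needed before reusable hardware/software
    # recoups its up-front cost through per-campaign operating savings
    return math.ceil(capital_cost / per_campaign_savings)

# hypothetical figures: USD 12,000 up front, USD 4,525 saved per later campaign
print(campaigns_to_break_even(12_000, 4_525))  # 3
```

This is why a short single campaign can show a net cost increase while a multi-campaign program still comes out ahead: the capital cost is paid once, the savings recur.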


2021
Author(s):  
Katie Wampler
Kevin D. Bladon
Monireh Faramarzi

Forested watersheds are critical sources of the majority of the world’s drinking water. Almost one-third of the world’s largest cities and two-thirds of cities in the United States (US) rely on forested watersheds for their water supply. These forested regions are vulnerable to the increasing incidence of large and severe wildfires due to increases in regional temperatures and greater accumulation of fuels. When wildfires occur, increases in suspended sediment and organic carbon can negatively affect aquatic ecosystem health and create many costly challenges for the drinking water treatment process. These effects are often largest in the first year following a wildfire. While past research has shown the likelihood of source water impacts from wildfire, the magnitude of effects remains uncertain in most regions. In our study, we will quantify the projected short-term effects of three large (>70,000 ha) wildfires on key water quality parameters (sediment and organic carbon) in two important forested source watersheds in the Cascade Range of Oregon, US. We calibrated and validated a modified Soil and Water Assessment Tool (SWAT) model to simulate streamflow, sediment loads and transport, as well as in-stream organic carbon processes for a historical period prior to wildfire. The calibrated model parameters were then modified based on literature values and burn severity maps to represent post-fire conditions of the three large wildfires. The parameter adjustments for simulating wildfire will be validated with post-fire water quality field samples from the wildfires. We will present estimates of future water quality impacts in the burned watersheds under different precipitation conditions at a daily scale for the first year following the wildfires, which will provide testable hypotheses. Additionally, we will determine the catchment characteristics most critical in determining the post-fire water quality response. This work will help predict the magnitude of effects from these historic wildfires, which can inform forest and drinking water management decision making.


1984
Vol 74 (5)
pp. 1623-1643
Author(s):  
Falguni Roy

Abstract A depth estimation procedure is described which essentially attempts to identify depth phases by analyzing multi-station waveform data (hereafter called level II data) in various ways, including deconvolution, prediction error filtering, and spectral analysis of the signals. In the absence of such observable phases, other methods based on S-P, ScS-P, and SKS-P travel times are tried to obtain an estimate of the source depth. The procedure was applied to waveform data collected from 31 globally distributed stations for the period between 1 and 15 October 1980. The digital data were analyzed at the temporary data center facilities of the National Defense Research Institute, Stockholm, Sweden. During this period, a total of 162 events in the magnitude range 3.5 to 6.2 were defined by analyzing first-arrival-time data (hereafter called level I data) alone. For 120 of these events, it was possible to estimate depths using the present procedure. The applicability of the procedure was found to be 100 per cent for events with mb > 4.8 and 88 per cent for events with mb > 4. A comparison of level I depths and level II depths (the depths obtained from level I and level II data, respectively) with the United States Geological Survey estimates indicated that at least one local station (Δ < 10°) is needed among the level I data to obtain reasonable depth estimates from such data alone. Further, it was shown that S-wave travel times can be successfully utilized for estimating source depth.
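For the depth-phase approach described above, a common back-of-the-envelope relation treats the pP−P delay as the two-way travel time from the source up to the free surface. A rough sketch only, assuming a uniform P velocity above the source and a near-vertical upgoing leg (real procedures, including the one in this paper, rely on travel-time tables and multiple phases):

```python
def depth_from_pP(delay_s, vp_km_s=6.5):
    """Rough focal depth (km) from a pP-P delay time.

    Assumes a uniform P-wave velocity above the source and a
    near-vertical upgoing ray, so the delay approximates the
    two-way source-to-surface travel time: h = dt * vp / 2.
    """
    return delay_s * vp_km_s / 2

print(depth_from_pP(10.0))  # 32.5 km for a 10 s delay at 6.5 km/s
```

The assumed 6.5 km/s is a typical crustal P velocity; deeper events sample faster mantle material, which is one reason table-based methods are preferred in practice.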

