Time-lapse electromagnetic and gravity methods in carbon storage monitoring

2021 ◽  
Vol 40 (6) ◽  
pp. 442-446
Author(s):  
Erika Gasperikova ◽  
Yaoguo Li

For geologic carbon storage (GCS), monitoring of the storage reservoir and detection of secondary plumes, should they accumulate outside the reservoir, are important to confirm that the injected CO2 stays where intended. Seismic methods are applied most often but are expensive; cost considerations, especially for long-term monitoring, give less expensive techniques a role in the design of monitoring networks. This article presents the merits of gravity and electromagnetic (EM) methods as monitoring tools for GCS. Many of these technologies are well established, and several new ones are on the horizon. EM and gravity techniques are complementary to seismic methods, and together they provide better subsurface monitoring. Time-lapse multiphysics joint inversion, including seismic, EM, and gravity data, could be a game changer for carbon storage monitoring. The trade-off between the sensitivity or resolution to a given plume size and the associated costs will be an important factor in selecting efficient and reliable monitoring arrays at GCS sites. Complex digital models representing the geology encountered at storage sites can be used for this purpose and offer further cost savings.

2021 ◽  
Vol 40 (6) ◽  
pp. 434-441
Author(s):  
Don White ◽  
Thomas M. Daley ◽  
Björn Paulsson ◽  
William Harbert

Borehole geophysical methods are a key component of subsurface monitoring of geologic CO2 storage sites because boreholes form a locus where geophysical measurements can be compared directly with the controlling geology. Borehole seismic methods, including intrawell, crosswell, and surface-to-borehole acquisition, are useful for site characterization, surface seismic calibration, 2D/3D time-lapse imaging, and microseismic monitoring. Here, we review the most common applications of borehole seismic methods in the context of storage monitoring and consider the role that detailed geophysical simulations can play in answering questions that arise when designing monitoring plans. Case study examples are included from the multitude of CO2 monitoring projects that have demonstrated the utility of borehole seismic methods for this purpose over the last 20 years.


2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Antonio Bernabé-Ortiz ◽  
Jessica H. Zafra-Tanaka ◽  
Miguel Moscoso-Porras ◽  
Rangarajan Sampath ◽  
Beatrice Vetter ◽  
...  

Abstract. A key component of any health system is the capacity to accurately diagnose individuals. Diagnostic tools are included in one of the six building blocks of a health system as defined by the World Health Organization (WHO). The WHO's Noncommunicable Disease Global Action Plan addresses the lack of diagnostics for noncommunicable diseases through multi-stakeholder collaborations to develop new technologies that are affordable, safe, effective, and quality controlled, and through improved laboratory and diagnostic capacity and human resources. Many challenges beyond price and availability exist for the tools currently included in the Package of Essential Noncommunicable Disease Interventions (PEN) for cardiovascular disease, diabetes, and chronic respiratory diseases. These include temperature stability, adaptability to various settings (e.g., at high altitude), the training needed to perform and interpret the tests, the need for maintenance and calibration, and, for blood glucose meters, incompatibility between meters and test strips. To date, the issues surrounding access to diagnostic and monitoring tools for noncommunicable diseases have not been addressed in much detail. The aim of this Commentary is to present the current landscape and challenges regarding WHO guidance on diagnostic tools, using the WHO REASSURED criteria, which define a set of key characteristics for diagnostic tests and tools. These criteria have been applied to communicable diseases, but so far not to noncommunicable diseases. Diagnostic tools have played an important role in addressing many communicable diseases, such as HIV, TB, and neglected tropical diseases; clearly, more attention to diagnostics for noncommunicable diseases as a key component of the health system is needed.


2021 ◽  
Vol 76 (1) ◽  
pp. 85-101
Author(s):  
Luca Dei Cas ◽  
Maria Luisa Pastore ◽  
Andrea Pavan ◽  
Nicola Petrella

Abstract. In areas located near large rock cliffs, risk reduction by early warning monitoring systems shows potential but also critical issues and limits. This paper examines two rock slope failures that occurred within a short time of each other near inhabited areas in the Italian Alps. The viscous behavior of the rock mass was reconstructed by processing ground-based synthetic aperture radar interferometry (InSAR) data and elaborating acceleration and velocity curves. The landslide types and the complexity of the underlying rock detachment mechanisms suggest the adoption of precautionary alarm thresholds for collapse forecasting. Analysis of the financial outlay, for both mitigation works and monitoring activities, highlights the adequacy of, and the opportunity for, combining passive systems, such as embankments or rockfall drapery meshes, with a reliable early warning monitoring network.
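The abstract does not name the forecasting method behind its velocity-curve analysis; a common choice for collapse forecasting from displacement-velocity data is Fukuzono's inverse-velocity extrapolation, sketched below with purely illustrative synthetic numbers (the function name and data are assumptions, not from the paper).

```python
# Illustrative sketch of inverse-velocity collapse forecasting: as a slope
# accelerates toward failure, 1/v decays roughly linearly toward zero, and the
# extrapolated zero crossing of a line fit to 1/v vs. t estimates failure time.

def inverse_velocity_forecast(times, velocities):
    """Least-squares fit of 1/v vs. t; return the zero crossing (predicted failure time)."""
    inv_v = [1.0 / v for v in velocities]
    n = len(times)
    mean_t = sum(times) / n
    mean_iv = sum(inv_v) / n
    slope = sum((t - mean_t) * (iv - mean_iv) for t, iv in zip(times, inv_v)) \
        / sum((t - mean_t) ** 2 for t in times)
    intercept = mean_iv - slope * mean_t
    return -intercept / slope  # time at which the fitted 1/v line reaches zero

# Synthetic hyperbolic acceleration toward failure at t = 100 days:
t_fail = 100.0
times = [10.0, 30.0, 50.0, 70.0, 90.0]
velocities = [1.0 / (0.02 * (t_fail - t)) for t in times]  # mm/day, illustrative
print(round(inverse_velocity_forecast(times, velocities), 1))  # 100.0
```

In practice the linearity of 1/v depends on the failure mechanism, which is one reason the paper argues for precautionary (rather than purely extrapolated) alarm thresholds.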


2019 ◽  
Author(s):  
Cesar Barajas-Olalde ◽  
Donald Adams ◽  
Lu Jin ◽  
Jun He ◽  
Nicholas Kalenze ◽  
...  

2021 ◽  
Author(s):  
Johanna Klahold ◽  
Christian Hauck ◽  
Florian Wagner

Quantitative estimation of the pore fractions filled with liquid water, ice, and air is a prerequisite in many permafrost studies and forms the basis for a process-based understanding of permafrost and of the hazard potential of its degradation in the context of global warming. The volumetric ice content is, however, difficult to retrieve, since standard borehole temperature monitoring cannot provide any ice content estimate. Geophysical methods offer opportunities to image the distributions of permafrost constituents non-invasively. A petrophysical joint inversion was recently developed to determine volumetric water, ice, air, and rock contents from seismic refraction and electrical resistivity data. This approach benefits from the complementary sensitivities of seismic and electrical data to the phase change between ice and liquid water. A remaining weak point was the unresolved petrophysical ambiguity between ice and rock matrix. In this study, the petrophysical joint inversion approach is extended along the time axis, and respective temporal constraints are introduced. If the porosity (and other time-invariant properties such as pore water resistivity or Archie exponents) can be assumed invariant over the considered period, water, ice, and air contents can be estimated together with a temporally constant (but spatially variable) porosity distribution. It is hypothesized that including multiple time steps in the inverse problem increases the ratio of data to parameters and leads to a more accurate distinction between ice and rock content. Based on a synthetic example and a field data set from an Alpine permafrost site (Schilthorn, Swiss Alps), it is demonstrated that the developed time-lapse petrophysical joint inversion provides physically plausible solutions, in particular improved estimates of the volumetric fractions of ice and rock. The field application is evaluated against independent validation data, including thaw depths derived from borehole temperature measurements, and shows generally good agreement. In contrast to the conventional petrophysical joint inversion, the time-lapse extension succeeds in providing reasonable estimates of permafrost degradation at the Schilthorn monitoring site without a priori constraints on the porosity model.
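As a minimal illustration of the four-phase model described above: the volumetric fractions of water, ice, air, and rock must sum to one in each model cell, and the time-lapse formulation holds porosity (water + ice + air) fixed across time steps while the pore filling changes. The sketch below encodes only this bookkeeping; the function names and values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the four-phase volumetric constraint in the
# petrophysical joint inversion: water + ice + air + rock = 1 per cell,
# with porosity (the non-rock fraction) constant over time.

def check_phase_closure(f_water, f_ice, f_air, f_rock, tol=1e-6):
    """True if the four volumetric fractions sum to unity within tolerance."""
    return abs(f_water + f_ice + f_air + f_rock - 1.0) < tol

def porosity(f_water, f_ice, f_air):
    """Porosity is the pore fraction: everything that is not rock matrix."""
    return f_water + f_ice + f_air

# Two time steps in one cell: ice partly melts to water between winter and
# summer, but porosity and rock content stay fixed, as the inversion assumes.
summer = dict(f_water=0.25, f_ice=0.05, f_air=0.10, f_rock=0.60)
winter = dict(f_water=0.05, f_ice=0.25, f_air=0.10, f_rock=0.60)

for season in (summer, winter):
    assert check_phase_closure(**season)

# Same porosity (0.40) at both time steps, despite the water/ice exchange.
assert porosity(0.25, 0.05, 0.10) == porosity(0.05, 0.25, 0.10)
```

Holding porosity fixed across time steps is what lets multiple surveys jointly constrain the rock fraction, since the rock matrix cannot trade volume with ice between surveys.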


Author(s):  
William E Downey ◽  
Lara M Cassidy ◽  
Kerstin Liebner ◽  
Robyn Magyar ◽  
Angela D Humphrey ◽  
...  

Introduction: In the early 1960s, the creation of cardiac care units (CCUs) led to a 50% reduction in the in-hospital mortality of acute myocardial infarction (AMI). Prompt application of closed chest cardiac resuscitation and external defibrillation (then new technologies) reduced the consequences of the event. Over the ensuing four decades, therapeutic advances in the treatment of AMI (e.g., prompt reperfusion strategies) have favorably altered its natural history, potentially obviating the need for CCU care. Since such care is expensive, identifying a low-risk cohort of patients for whom it is unnecessary could substantially reduce the cost of cardiac care.

Hypothesis: Existing risk models can accurately identify low-risk STEMI patients who do not require CCU care after primary PCI.

Methods: We performed a retrospective chart review of all STEMI cases from 2010 at Carolinas Medical Center and assessed them using the TIMI STEMI risk score and a risk assessment algorithm for uncomplicated STEMI developed at Brigham and Women's Hospital (BWH). The BWH STEMI Care Redesign defines low-risk STEMI patients as those promptly revascularized via successful single-vessel PCI with (1) no evidence of ongoing ischemia, (2) EF >40%, (3) absence of CHF and of hemodynamic or electrical instability, and (4) who are awake without need of respiratory support. Cost data (fixed and variable) from Quality Advisor™, a Premier product, were abstracted for each STEMI case, examining the specific resources used in CCU and non-CCU units.

Results: Among 310 consecutive STEMI patients, in-hospital mortality was 3.9%. The BWH risk score identified 46.4% of these patients as low risk; among them, in-hospital mortality was 0%. Only one of these 144 low-risk patients required subsequent CCU care, and none required CPR or defibrillation after revascularization. A TIMI STEMI risk score <2 classified 26.1% of the patients as low risk; among them, in-hospital mortality was also 0%. However, 3.7% of these "low-risk" patients had ventricular arrhythmias or respiratory decompensation during or shortly after PCI; none of them was classified as low risk by the BWH model. CCU care added $723 in fixed costs and $340 in variable costs per hospital day.

Conclusion: The BWH model, but not the TIMI STEMI risk score, accurately identified a sizable cohort of STEMI patients at very low risk of in-hospital death and complications. These patients may be appropriate for admission to non-CCU level care immediately after primary PCI, with projected cost savings of >$1000 per patient.
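The BWH low-risk criteria quoted above amount to a simple conjunction of checks. A hypothetical sketch follows; the field names are illustrative (actual triage is a clinical judgment, not this code), and the cost figure merely restates the per-day numbers given in the abstract.

```python
# Hypothetical encoding of the BWH-style low-risk STEMI screen described in
# the abstract: all four criteria must hold after primary PCI.

def is_low_risk(pt):
    """True only if every BWH low-risk criterion is satisfied."""
    return (
        pt["successful_single_vessel_pci"]
        and not pt["ongoing_ischemia"]                      # (1) no ongoing ischemia
        and pt["ef_percent"] > 40                           # (2) EF > 40%
        and not (pt["chf"]
                 or pt["hemodynamic_instability"]
                 or pt["electrical_instability"])           # (3) no CHF or instability
        and pt["awake"] and not pt["respiratory_support"]   # (4) awake, unassisted breathing
    )

patient = dict(successful_single_vessel_pci=True, ongoing_ischemia=False,
               ef_percent=55, chf=False, hemodynamic_instability=False,
               electrical_instability=False, awake=True, respiratory_support=False)
print(is_low_risk(patient))  # True

# Per the abstract, each avoided CCU day saves $723 fixed + $340 variable:
print(723 + 340)  # 1063
```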


2001 ◽  
Author(s):  
J.T. Fokkema ◽  
C.P.A. Wapenaar ◽  
M. Dillen ◽  
K. Schalkwijk

Author(s):  
Jing Gao

This chapter presents evidence that there is an absence of informed, broad media discussion of e-commerce initiatives in Australia. As several authors have pointed out (e.g., Gittins, 1995), the newspaper medium is one of the main vehicles through which advisers and policy makers seek to influence society; it thus takes on the role of a public forum on national issues. However, newspapers in Australia were found to have failed in their role of preparing manufacturing industries for the impact of new technologies. In this interpretive study, major Australian newspapers were examined for public discussion of e-commerce in manufacturing industries. The political-legal, economic, social, and technological (PEST) framework was used as a lens to subdivide the issues, problems, and opportunities identified in the academic e-commerce literature. This lens was then applied to 103 newspaper articles identified using the keywords Australian manufacturing and e-commerce in what was believed to be all the major Australian newspapers. Some articles merely reported vendors' promises of potential cost savings while overlooking the need for investment in technology, training, and maintenance, while other discussions focused on "users as victims" issues such as security and privacy. In-depth issues such as reliability, communication protocols, bandwidth availability, and integration problems were overlooked. In particular, the problem of business strategies was ignored.


2020 ◽  
Author(s):  
Jan Friesen ◽  
John T. Van Stan II

The first contact between precipitation and the land surface is often a plant canopy. The resulting partitioning of precipitation by vegetation returns water to the atmosphere (evaporation of intercepted precipitation) and redistributes water to the subcanopy surface as a “drip” flux (throughfall) and as water that drains down plant stems (stemflow). Prior to the field's first benchmark publication by Horton in 1919, European observatories and experimental stations had been observing precipitation partitioning since the mid-19th century. In this paper, we describe these early monitoring networks and studies of precipitation partitioning and show their impressive level of detail. In addition to describing the early studies, we have digitized and analyzed their results to compare them with recent studies. Although many early studies lack the modern statistical analyses and monitoring tools that have become standard today, they had many strengths (not necessarily shared by every study, of course), including a rigorous level of detail regarding stand characteristics (often lacking in modern ecohydrological studies), high-resolution spatiotemporal throughfall experiments, and chronosequential data collection and analysis. Moreover, these early studies reveal the roots of interest in precipitation partitioning processes and represent a largely forgotten piece of history shared by the hydrology, meteorology, forestry, and agricultural scientific communities. These studies are therefore relevant today, and we hope modern scientists interested in plant-precipitation interactions will find new inspiration in our synthesis and evaluation of this literature.
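The partitioning described above is a simple canopy mass balance: gross precipitation splits into throughfall, stemflow, and interception loss (the evaporated fraction, recovered as the closure term). A minimal sketch with illustrative values, not figures from the early studies:

```python
# Minimal canopy water balance: P = throughfall + stemflow + interception loss.
# Interception loss is computed as the residual of the measured fluxes.

def partition(precip_mm, throughfall_mm, stemflow_mm):
    """Split gross precipitation into its three partitioning components (mm)."""
    interception = precip_mm - throughfall_mm - stemflow_mm
    return {"throughfall": throughfall_mm,
            "stemflow": stemflow_mm,
            "interception": interception}

# e.g., a 20 mm event over a forest stand (illustrative numbers)
parts = partition(20.0, 14.0, 1.0)
print(parts["interception"])   # 5.0 mm evaporated back to the atmosphere
print(sum(parts.values()))     # 20.0 -- the balance closes
```

This residual approach is exactly why the early networks measured throughfall and stemflow directly: interception loss itself is hard to observe and was inferred from the balance.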


2011 ◽  
Author(s):  
M. Karaoulis ◽  
A. Revil ◽  
D. D. Werkema
