Recent Technological Advances Provide Highly Efficient and Reduced Risk Solutions for Conveying Wireline Formation Evaluation Toolstrings in Deepwater Operations

2021 ◽  
Author(s):  
Stephen McCormick ◽  
Rajesh Thatha ◽  
Martin Leonard ◽  
Samuel Escott ◽  
Adam Sedgwick ◽  
...  

Abstract Obtaining high-resolution, quality formation evaluation data is still only possible with wireline logging. However, with the continued push into deeper and more complex drilling environments, many challenges have been placed in the way of wireline logging, including high tension, high deviation, and increased differential pressure. These factors contribute to an increased risk of tool-sticking incidents and lost-in-hole scenarios. Several methods of mitigating these issues at surface (powered capstans, pipe conveyance, etc.) have been implemented in the past, but none have succeeded in reducing or eliminating the risk downhole without introducing further drawbacks. This paper describes how a new wireline conveyance system has eliminated these issues. The conveyance system consists of wheeled carriages that carry the toolstring off-centre. The mass of the toolstring acts as a counterweight to ensure correct tool orientation in the wellbore. This orientation feature also enables a "guide" device to help navigate ledges and washouts. Such a system eliminates toolstring hold-ups, allows access to highly deviated wells without pipe conveyance or tractors, and significantly mitigates differential sticking hazards, while also offering additional benefits in operational efficiency and data quality. A case study from a particularly difficult well in New Zealand is presented. Data acquisition in this well was fraught with challenges: in addition to a 2000 m tangent section at 67° deviation, the well had severe borehole breakouts. Previous experience in similar scenarios with conventional data acquisition methods yielded poor results. The wheeled carriage system was deployed in multiple innovative configurations, resulting in the acquisition of excellent quality data from five wireline descents in the hole. This wireline conveyance system has been routinely deployed on multiple deepwater operations in the Gulf of Mexico. One such operation is presented where large gains in logging efficiency have been realised, particularly through the elimination of differential sticking risk and time-consuming pipe-conveyed logging. The new technology takes a holistic approach to wireline tool conveyance: prevent sticking issues using wheeled carriages and mitigate fishing risk using ultra-high-strength wireline cables. Wheeled carriages greatly reduce the tool-borehole contact area, reducing the incidence of tool sticking. In addition, wheeled carriages reduce drag while ensuring optimum data quality through controlled sensor position and orientation within the wellbore. Ultra-high-strength cables provide the ability to log at very high tensions while also providing high overpull capability. The result is safe, efficient, cost-effective, and complete wireline data acquisition.
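
Why reducing contact area mitigates differential sticking can be illustrated with a simplified model in which the overpull needed to free a stuck tool scales with differential pressure, tool-borehole contact area, and a friction factor. The sketch below is a minimal illustration of that reasoning; the pressures, areas, and friction factor are assumed values, not figures from the paper.

```python
# Simplified differential-sticking model (illustrative assumptions, not from the paper):
# required overpull ~ differential pressure x tool-borehole contact area x friction factor.

def sticking_overpull_kN(dp_mpa: float, contact_area_m2: float, friction: float = 0.2) -> float:
    """Estimate the overpull (kN) needed to free a differentially stuck tool."""
    return dp_mpa * 1e6 * contact_area_m2 * friction / 1e3

# Conventional toolstring lying against the mudcake: assume 0.05 m2 of embedded contact.
conventional = sticking_overpull_kN(dp_mpa=10.0, contact_area_m2=0.05)

# Wheeled carriages stand the tool off the borehole wall, leaving only small wheel
# contact patches: assume 0.002 m2 in total.
wheeled = sticking_overpull_kN(dp_mpa=10.0, contact_area_m2=0.002)

print(f"conventional: {conventional:.0f} kN, wheeled: {wheeled:.0f} kN")
# conventional: 100 kN, wheeled: 4 kN -> far below typical cable overpull limits
```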

2019 ◽  
Vol 3 (1) ◽  
Author(s):  
Prakash Raj Neupane

Cancer care is a heavily invested and researched area in the present context of medical science development. As the burden of disease increases and treatment options remain limited, especially in economically deprived regions, there is demand for more viable, modern, and cost-effective methods of diagnosis and treatment. This drives the need for more locally conducted research, more technical collaboration with advanced institutes, and, of course, the publishing of high-quality data. We are gradually adopting new technologies such as liquid-based biopsy in diagnostics, targeted therapy/immunotherapy in treatment, and artificial intelligence in some therapies. These have proven effective and useful, but at a huge financial cost. New therapeutic modalities such as BMT are emerging and have gained importance in recent years. Cancer etiogenesis has heterogeneous components, and this disease has very unpredictable biological behavior. Understanding of molecular biology, genetics, and gene sequencing has given us a tremendous amount of information for prognostics and prediction of treatment methods. Proper explanation and interpretation of this much information is crucial. Clinicians and scientists should keep learning, and there should be provision for a molecular tumor board as well. Slowly but steadily, the new generation of caregivers in the field of oncology should keep themselves at the edge of technology and should be able to understand recent developments, so as to give individualized care to patients and make optimal use of available technology to tackle the disease. Writing and publishing in more standard formats is another important skill to be learned, so as to communicate with peers worldwide. We therefore encourage young oncologists to learn in this regard as well.


2016 ◽  
Vol 56 (2) ◽  
pp. 601
Author(s):  
Nabeel Yassi

The desire to conduct onshore seismic surveys without cables has been an elusive dream since the dawn of seismic exploration. Since the late 1970s, seismic surveys have been conducted with cabled multi-channel acquisition systems. As the number of channels steadily grew, a fundamental restriction appeared, with hundreds of kilometres of line cable dragged along the ground. Seismic surveys within rugged terrain—across rivers, steep cliffs, urban areas, and culturally and environmentally sensitive zones—were both challenging and expensive exercises. Modern technology has made different cable-free solutions practical. High-resolution analogue-to-digital converters are now affordable, as are GPS radios for timing and location. Microprocessors and memory are readily available for autonomous recording systems, along with batteries of the size and weight of a field node that now promise to power an acquisition unit for as long as required for normal seismic crew operations. Many successful 2D and 3D seismic data acquisition programs using cable-free autonomous nodal systems have been carried out in the past few years; however, a number of concerns with these systems remain. The first is whether the units are working according to manufacturer specifications during the data acquisition window. The second is the limited or non-existent real-time data quality control, which inspires sceptics to apply the term blind acquisition to nodal operations. The third is the traditional question of geophone-array versus point-receiver acquisition. Although a string of geophones can be connected to autonomous nodes, the preference is to deploy a single or internal geophone with the nodes to maintain the proposed flexibility of cable-free recording systems. This case study elaborates on the benefits of cable-free seismic surveys, with specific examples of 2D and 3D exploration programs conducted in Australia in the past few years. Optimisation of field crew size, field crew resources, cost implications, and the footprint on the environment, wildlife, and domestic livestock will be discussed. In addition, the study focuses on data quality and data assurance and the processes implemented during data acquisition to maintain standards equivalent to those of cable recording. Emphasis will also be placed on data analysis and test results of the geophone array versus the cable-free point-receiver recording.
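
The array-versus-point-receiver question can be framed with the textbook wavenumber response of a linear array of equally weighted geophones: the array attenuates short-wavelength ground roll that a single receiver records at full amplitude. The sketch below evaluates that standard response; the element count, spacing, and wavenumbers are illustrative assumptions, not survey parameters from the case study.

```python
import numpy as np

def linear_array_response(n_elements: int, spacing_m: float, wavenumber_cpm: np.ndarray) -> np.ndarray:
    """Normalised response |sin(pi*N*k*d) / (N*sin(pi*k*d))| of a linear geophone array.

    n_elements: number of equally weighted geophones in the string
    spacing_m: in-line spacing between geophones (m)
    wavenumber_cpm: apparent wavenumber along the line (cycles per metre)
    """
    x = np.pi * wavenumber_cpm * spacing_m
    numerator = np.sin(n_elements * x)
    denominator = n_elements * np.sin(x)
    # Where the denominator is zero (k*d an integer, including k = 0) the limit is 1.
    return np.abs(np.divide(numerator, denominator, out=np.ones_like(x), where=denominator != 0))

# Illustrative assumptions: a 12-element string at 5 m spacing versus a single point receiver.
k = np.linspace(0.001, 0.1, 500)          # cycles/m (wavelengths 1000 m down to 10 m)
array_resp = linear_array_response(12, 5.0, k)
point_resp = np.ones_like(k)              # a point receiver has a flat wavenumber response

# Ground roll with a 20 m wavelength (k = 0.05 cycles/m) is strongly attenuated by the array
# but recorded at full amplitude by the point receiver, which is why such noise must instead
# be handled in processing for cable-free point-receiver acquisition.
print(f"array response at k=0.05: {linear_array_response(12, 5.0, np.array([0.05]))[0]:.3f}")
```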


Author(s):  
Alan Glover ◽  
Joe Zhou ◽  
David Horsley ◽  
Nobuhisa Suzuki ◽  
Shigeru Endo ◽  
...  

Traditional pipeline technology will be severely challenged as design operating pressures continue to rise and gas field developments occur in more remote locations, including the arctic. Cost-effective solutions to these issues can be found through innovative designs using new technology and its implementation. Some of these designs have considered the use of high-pressure natural gas pipelines, which has driven the development of high-strength steel. In order to meet these increases in pressure, TransCanada and JFE/NKK have been working extensively on the application of X100 (Grade 690) linepipe, and this work culminated in the construction and installation of an X100 project in the fall of 2002. This paper will discuss the development of the related research projects that allowed the successful completion of the field project. The topics will include the material properties and fracture control plans for X100. In addition, the approach to strain-based design for X100 will include the analysis of both the tensile strain limits (weld mismatch considerations) and the compressive strain limits (i.e., buckling capacity). The development of the field welding process will also be covered. The paper will discuss the implications of using X100 from the perspective of the successful field project and the application of a strain-based design.
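
The economic incentive for higher-grade linepipe at higher pressure can be illustrated with Barlow's formula for required wall thickness, t = P·D / (2·SMYS·F). The sketch below compares an X70 and an X100 (Grade 690) design for the same duty; the pressure, diameter, and design factor are illustrative assumptions, not values from the paper.

```python
def required_wall_thickness_mm(pressure_mpa: float, od_mm: float, smys_mpa: float, design_factor: float) -> float:
    """Barlow's formula: t = P * D / (2 * SMYS * F)."""
    return pressure_mpa * od_mm / (2.0 * smys_mpa * design_factor)

# Illustrative assumptions: 914 mm (36 in) OD, 17.5 MPa design pressure, 0.8 design factor.
t_x70 = required_wall_thickness_mm(17.5, 914.0, smys_mpa=483.0, design_factor=0.8)   # X70 (Grade 483)
t_x100 = required_wall_thickness_mm(17.5, 914.0, smys_mpa=690.0, design_factor=0.8)  # X100 (Grade 690)

print(f"X70:  {t_x70:.1f} mm wall")   # ~20.7 mm
print(f"X100: {t_x100:.1f} mm wall")  # ~14.5 mm -> roughly 30% less steel per metre of pipe
```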


2016 ◽  
Vol 72 (9) ◽  
pp. 1036-1048 ◽  
Author(s):  
Arnau Casanas ◽  
Rangana Warshamanage ◽  
Aaron D. Finke ◽  
Ezequiel Panepucci ◽  
Vincent Olieric ◽  
...  

The development of single-photon-counting detectors, such as the PILATUS, has been a major recent breakthrough in macromolecular crystallography, enabling noise-free detection and novel data-acquisition modes. The new EIGER detector features a pixel size of 75 × 75 µm, frame rates of up to 3000 Hz and a dead time as low as 3.8 µs. An EIGER 1M and an EIGER 16M were tested on Swiss Light Source beamlines X10SA and X06SA for their application in macromolecular crystallography. The combination of fast frame rates and a very short dead time allows high-quality data acquisition in a shorter time. The ultrafine φ-slicing data-collection method is introduced and validated, and its application in finding the optimal rotation angle, a suitable rotation speed and a sufficient X-ray dose is presented. An improvement in data quality has been observed for slicing down to one tenth of the mosaicity, which is much finer than expected based on previous findings. The influence of key data-collection parameters on data quality is discussed.
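
The interplay between slicing width, frame rate and rotation speed is simple arithmetic: the rotation increment per image times the frame rate gives the goniometer speed, and the dead time times the frame rate gives the fraction of rotation lost between frames. The sketch below works through illustrative numbers; only the 3.8 µs dead time and the 3000 Hz maximum frame rate come from the detector specification above, while the mosaicity and chosen frame rate are assumptions.

```python
# Ultrafine phi-slicing arithmetic (illustrative numbers; only the dead time and the
# 3000 Hz maximum frame rate are taken from the detector specification quoted above).

mosaicity_deg = 0.10          # assumed crystal mosaicity
slicing_fraction = 0.1        # slice at one tenth of the mosaicity
frame_rate_hz = 750           # assumed frame rate (detector maximum is 3000 Hz)
dead_time_s = 3.8e-6          # EIGER dead time between frames

delta_phi_deg = mosaicity_deg * slicing_fraction          # rotation increment per image
rotation_speed_deg_s = delta_phi_deg * frame_rate_hz      # required goniometer speed
dead_fraction = dead_time_s * frame_rate_hz               # fraction of rotation not recorded

print(f"rotation increment per image: {delta_phi_deg:.3f} deg")          # 0.010 deg
print(f"rotation speed:               {rotation_speed_deg_s:.1f} deg/s")  # 7.5 deg/s
print(f"rotation lost to dead time:   {dead_fraction:.4f}")               # ~0.003, i.e. ~0.3%
```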


2001 ◽  
Vol 4 (06) ◽  
pp. 489-501 ◽  
Author(s):  
D. Kandel ◽  
R. Quagliaroli ◽  
G. Segalini ◽  
B. Barraud

Summary The acquisition of gas-in-mud data while drilling for geological surveillance and safety is an almost universal practice. This source of data is only rarely used for formation evaluation because of the widely accepted presumption that it is unreliable and unrepresentative. Recent developments in the mud-logging industry to improve gas data acquisition and analysis have led to the availability of better quality data. Within a joint Elf/Eni-Agip Div. research program, a new interpretation method has been developed following the comprehensive analysis and interpretation of gas data from a wide range of wells covering different types of geological, petroleum, and drilling environments. The results, validated by correlation and comparison with other data such as logs, well tests, and pressure/volume/temperature (PVT) data, enable us to characterize lithological changes; porosity variations and permeability barriers; seal depth, thickness, and efficiency; gas diffusion or leakage; gas/oil and hydrocarbon/water contacts; vertical changes in fluid over a thick monolayer pay zone; vertical fluid differentiation in multilayer intervals; and biodegradation. The comparison of surface gas, PVT, and geochemistry data clearly confirms the consistency between the drilling gas data (gas shows) and the corresponding reservoir fluid composition. The near real-time availability, at no extra acquisition cost, of such data has led to: the optimization of future well operations (such as logging and testing); a better integration of while-drilling data into the well evaluation process; and a significant improvement in both early formation evaluation and reservoir studies, especially for the following applications, in which traditional log analysis often remains inconclusive: very-low-porosity reservoirs, thin beds, dynamic barriers and seal efficiency, low-resistivity pay, and light hydrocarbons. Examples show gas while drilling (GWD) wellsite quicklook interpretations with simple lithological and fluid interpretations, as well as more complex reservoir and fluid characterization applications in varied geographical and geological contexts; both demonstrate how GWD data are integrated with more standard data sets. Introduction The measurement of gas shows is standard practice during the drilling of exploration and development wells. Continuous gas monitoring sometimes enables us to indicate, in general terms, the presence of hydrocarbon-bearing intervals, but it rarely allows us to define the fluid types (oil, condensate and/or gas, and water). Gas data are at present largely underused because they are considered unreliable and not fully representative of the formation fluids. There are many reasons for this. On one hand, poorly established correlations exist between reservoir fluids and shows at surface; on the other hand, numerous drilling parameters strongly influence the recorded gas data, such as formation pressure, mud weight and type, gas-trap position in the shaker ditch, and mud-out temperatures. Another reason may be the very low cost of such data, which is often equated with low value.
Until a few years ago, the analysis performed on gas shows was generally restricted to the use of Pixler and/or Geoservices diagrams (or equivalent), wetness, balance, character, and gas normalization.1–4 Recent improvements in gas-acquisition technology and the new GWD methodology allow us to perform reservoir interpretation in near real time for fluid identification and contacts [oil/water contact (OWC), gas/oil contact (GOC), etc.], lithological changes, and barrier efficiency, thus allowing operations optimization (e.g., coring, wireline recording and sampling, and testing operations). It is also possible to integrate the GWD interpretation into reservoir, geochemical, PVT, and other comprehensive studies. Method Data Acquisition. The measurement of gas shows in the circulating drilling mud was introduced in the early days of mud logging (ML) with two objectives: first, as a safety device to indicate well behavior to drillers, and second, as an indicator of hydrocarbon-bearing zones. Today, gas-show measurement is systematically acquired in the petroleum industry for the same reasons, but it is seldom used to its full potential, mainly because of an ongoing prejudice that the data are not representative of the formation fluids and/or that the recording of these data is strongly influenced by varying drilling parameters. The ML gas system is composed of three parts: a "gas trap" to extract gas from the mud stream, situated somewhere between the bell nipple and the shaker box (often in the latter); lines, pumps, and filters enabling the transport of a dry-gas sample to the ML unit; and a detection system in the ML unit. Recent efforts in the mud-logging industry to improve gas-data acquisition and analysis have led to the availability of better quality data, which has provided reliable lithological and fluid information since the 1990s. In the 1980s, most of the ML companies introduced flame ionization detectors (FID) to replace previous total gas (TG) and chromatograph measurements. The TG measurement gives the total amount of hydrocarbon components extracted from the mud and burned in the detector. The TG could now be correlated with the C1-C5 readings from the new breed of chromatographs.5 Finally, over the past few years, several ML companies have introduced fast-gas chromatographs with improved resolution (C1-C5 in less than 1 minute), improved C1/C2 separation, and, above all, improved reliability and repeatability. High-speed chromatographs using a thermal-conductivity detector have also appeared on the market, but they were not tested within this project. Work carried out by Texaco in the early 1990s led to a significant improvement in basic trap design with the introduction of the quantitative gas measurement (QGM) trap, which was a major step in reducing the effect of environmental changes.6 An alternative proposition from Geoservices was to replace the trap, generally situated in the shaker box, with a pumping system supplying the trap with a constant volume of mud sucked from a probe situated in the flowline close to the bell nipple.7
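
The wetness, balance, and character ratios mentioned above are simple functions of the chromatograph readings. The sketch below uses the commonly quoted definitions — wetness = (C2+C3+C4+C5)/Σ(C1–C5) × 100, balance = (C1+C2)/(C3+C4+C5), character = (C4+C5)/C3 — and invented example readings; it is an illustration of the ratio arithmetic, not the paper's interpretation method.

```python
def gas_ratios(c1: float, c2: float, c3: float, c4: float, c5: float) -> dict:
    """Compute the commonly used wetness, balance and character ratios
    from C1-C5 chromatograph readings (any consistent unit, e.g. ppm)."""
    total = c1 + c2 + c3 + c4 + c5
    wet = c2 + c3 + c4 + c5
    return {
        "wetness_pct": 100.0 * wet / total,          # rises with heavier, oil-associated gas
        "balance": (c1 + c2) / (c3 + c4 + c5),       # compared against wetness for fluid typing
        "character": (c4 + c5) / c3,                 # helps separate gas- and oil-type shows
    }

# Invented example readings (ppm) for a single gas show:
print(gas_ratios(c1=8000, c2=600, c3=250, c4=90, c5=40))
# {'wetness_pct': 10.9..., 'balance': 22.6..., 'character': 0.52}
```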


Sensors ◽  
2019 ◽  
Vol 19 (9) ◽  
pp. 1978 ◽  
Author(s):  
Argyro Mavrogiorgou ◽  
Athanasios Kiourtis ◽  
Konstantinos Perakis ◽  
Stamatios Pitsios ◽  
Dimosthenis Kyriazis

It is an undeniable fact that Internet of Things (IoT) technologies have become a milestone advancement in the digital healthcare domain, since the number of IoT medical devices has grown exponentially, and it is now anticipated that by 2020 there will be over 161 million of them connected worldwide. Therefore, in an era of continuous growth, IoT healthcare faces various challenges, such as the collection, the quality estimation, and the interpretation and harmonization of the data that derive from the existing huge numbers of heterogeneous IoT medical devices. Even though various approaches have been developed so far for solving each one of these challenges, none of them proposes a holistic approach for successfully achieving data interoperability between high-quality data that derive from heterogeneous devices. For that reason, in this manuscript a mechanism is proposed for effectively addressing the intersection of these challenges. Through this mechanism, the datasets of the different devices are first collected and then cleaned. Subsequently, the cleaning results are used to capture the level of the overall data quality of each dataset, in combination with measurements of the availability and the reliability of the device that produced it. Consequently, only the high-quality data are kept and translated into a common format, ready for further utilization. The proposed mechanism is evaluated through a specific scenario, producing reliable results and achieving data interoperability with 100% accuracy and data quality with more than 90% accuracy.
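
The described mechanism amounts to a collect → clean → score → filter → harmonize pipeline. The sketch below shows one way such a pipeline could be wired together; the field mapping, scoring weights, and quality threshold are invented for illustration and are not the authors' implementation.

```python
from typing import Dict, List

# Invented mapping to a common format (illustrative, not the paper's schema).
COMMON_FIELDS = {"hr": "heart_rate_bpm", "temp": "body_temperature_c", "spo2": "oxygen_saturation_pct"}
QUALITY_THRESHOLD = 0.85  # assumed cut-off for keeping a dataset

def clean(dataset: List[Dict]) -> List[Dict]:
    """Drop records with missing values (a stand-in for the cleaning step)."""
    return [r for r in dataset if all(v is not None for v in r.values())]

def quality_score(raw: List[Dict], cleaned: List[Dict], availability: float, reliability: float) -> float:
    """Combine completeness after cleaning with device availability and reliability (equal weights, assumed)."""
    completeness = len(cleaned) / len(raw) if raw else 0.0
    return (completeness + availability + reliability) / 3.0

def harmonize(record: Dict) -> Dict:
    """Translate a cleaned record into the common format."""
    return {COMMON_FIELDS[k]: v for k, v in record.items() if k in COMMON_FIELDS}

def process_device(raw: List[Dict], availability: float, reliability: float) -> List[Dict]:
    cleaned = clean(raw)
    if quality_score(raw, cleaned, availability, reliability) < QUALITY_THRESHOLD:
        return []  # low-quality datasets are discarded
    return [harmonize(r) for r in cleaned]

# Example: a wearable reporting two complete records and one with a missing value.
readings = [{"hr": 72, "temp": 36.6}, {"hr": 75, "temp": None}, {"hr": 70, "temp": 36.8}]
print(process_device(readings, availability=0.99, reliability=0.97))
# -> [{'heart_rate_bpm': 72, 'body_temperature_c': 36.6}, {'heart_rate_bpm': 70, 'body_temperature_c': 36.8}]
```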


2014 ◽  
Vol 4 (1) ◽  
pp. 23-29
Author(s):  
Constance Hilory Tomberlin

There are a multitude of reasons that a teletinnitus program can be beneficial, not only to the patients, but also within the hospital and the audiology department. Using technology for tinnitus management improves appointment access for all patients, especially those who live at a distance; it has been shown to be more cost effective when the patient's travel would otherwise be monetarily compensated; and it allows multiple patients to be seen in the same time slots, giving greater clinic access to the patients who wish to be seen in-house. There is also the patients' excitement at being part of a new technology-based program. The Gulf Coast Veterans Health Care System (GCVHCS) saw the potential benefits of incorporating a teletinnitus program and began implementation in 2013. There were a few hurdles to work through during the initial organizational process and the early execution of the program. Since the establishment of the teletinnitus program, the GCVHCS has seen an enhancement in patient care, a reduction in travel compensation, improvements in clinic utilization and clinic availability, genuine excitement about the use of a new healthcare medium among staff and patients, and overall patient satisfaction.


Author(s):  
Tanwi Singh ◽  
Anshuman Sinha

The major risk associated with a low platelet count in pregnancy is the increased risk of bleeding during or after childbirth. There is an increased blood supply to the uterus during pregnancy, and the surgical procedure requires cutting of major blood vessels. Women with thrombocytopenia are at increased risk of losing excessive blood. The risk is greater in caesarean delivery than in vaginal delivery. Hence, based on the above findings, the present study was planned to assess the platelet count in pregnant women at IGIMS, Patna, Bihar. The study was conducted in the Department of Pathology, Indira Gandhi Institute of Medical Science, Patna, Bihar, India, from January 2019 to June 2019. In the present study, 200 samples from pregnant females received for platelet estimation were enrolled. Clinically, platelet indices can be a useful screening test for early identification of preeclampsia and eclampsia. Platelet indices can also assess the prognosis of this disease in pregnant women and can be used as an effective prognostic marker because they correlate with the severity of the disease. The platelet count is a simple, low-cost, and rapid routine screening test. Hence, the data generated from the present study conclude that the platelet count can be used as a simple and cost-effective tool to monitor the progression of preeclampsia, thereby preventing complications from developing during the gestational period. Keywords: Platelet Count, Pregnant Women, IGIMS, Patna, Bihar.


Alloy Digest ◽  
2018 ◽  
Vol 67 (9) ◽  

Abstract Ferrium M54 was designed to create a cost-effective, ultra-high-strength, high-fracture-toughness material with a high resistance to stress-corrosion cracking for use in structural applications. This datasheet provides information on composition, hardness, and tensile properties, as well as fatigue. Filing Code: SA-822. Producer or source: QuesTek Innovations, LLC.


2017 ◽  
Vol 4 (1) ◽  
pp. 25-31 ◽  
Author(s):  
Diana Effendi

The Information Product Approach (IP Approach) is an information management approach. It can be used to manage product information and to analyze data quality. An IP-Map can be used by organizations to facilitate the management of knowledge in collecting, storing, maintaining, and using data in an organized way. The process of managing data on academic activities at X University has not yet used the IP approach. X University has not paid attention to the management of the quality of its information; so far it has concerned itself only with the system applications used to support the automation of data management in its academic activities. The IP-Map presented in this paper can be used as a basis for analyzing the quality of data and information. Using the IP-Map, X University is expected to identify which parts of the process need improvement in the management of data and information quality. Index terms: IP Approach, IP-Map, information quality, data quality.

