Uncertainty Analysis for Inclining Tests

Author(s):  
Joel S. Sales ◽  
Paulo de Tarso T. Esperança ◽  
Sergio H. Sphaier ◽  
Christiane Machado

In this paper, we address the qualitative consequences for uncertainty of executing an inclining test of a semi-submersible platform with its mooring system and risers at the production site, and compare the results with those obtained from typical inclining test procedures in sheltered waters, as defined by ASTM F1321. To accomplish that, we applied uncertainty analysis according to ISO procedures, evaluating the propagation of uncertainties from the measurements through to the final calculations. We discuss the measurement devices used for the variables of concern and perform numerical simulations to address the mooring system's restoring effects and their implications for the uncertainty analysis. Using data from two semi-submersibles with different displacements, we found that adding the mooring system and risers has only a small influence on the final uncertainty. Finally, we discuss how new technologies for data acquisition and signal filtering can become an important tool for the safety of offshore floating production units, by facilitating the verification of their updated center of gravity after major interventions.
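The ISO-style propagation of uncertainties the abstract refers to can be illustrated with the standard inclining-test relation GM = w·d / (Δ·tan θ) and first-order (GUM) combination of input uncertainties. This is a minimal sketch with hypothetical values, not the paper's actual procedure:

```python
import math

def gm_and_uncertainty(w, d, disp, theta, u_w, u_d, u_disp, u_theta):
    """Metacentric height GM = w*d / (disp * tan(theta)) with first-order
    (ISO GUM) uncertainty propagation, assuming uncorrelated inputs.
    w: moved weight [t], d: transverse shift [m], disp: displacement [t],
    theta: heel angle [rad]; u_* are the standard uncertainties."""
    t = math.tan(theta)
    gm = w * d / (disp * t)
    # Partial derivatives of GM with respect to each input
    dgm_dw = d / (disp * t)
    dgm_dd = w / (disp * t)
    dgm_ddisp = -w * d / (disp**2 * t)
    dgm_dtheta = -w * d / (disp * math.sin(theta)**2)  # d/dθ (1/tanθ) = -1/sin²θ
    # Combined standard uncertainty (root-sum-square of sensitivity * u)
    u_gm = math.sqrt((dgm_dw * u_w)**2 + (dgm_dd * u_d)**2 +
                     (dgm_ddisp * u_disp)**2 + (dgm_dtheta * u_theta)**2)
    return gm, u_gm

# Hypothetical example: 50 t weight shifted 20 m on a 30,000 t platform,
# producing a 1.5° heel; the uncertainty values are illustrative only.
gm, u = gm_and_uncertainty(w=50.0, d=20.0, disp=30000.0,
                           theta=math.radians(1.5),
                           u_w=0.5, u_d=0.05, u_disp=150.0,
                           u_theta=math.radians(0.02))
print(f"GM = {gm:.3f} m, u(GM) = {u:.3f} m")
```

The heel-angle term usually dominates the budget, which is why the abstract's focus on measurement devices and signal filtering matters for the final uncertainty.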

Sensors ◽  
2020 ◽  
Vol 20 (23) ◽  
pp. 6926
Author(s):  
Luis Castillo-Henríquez ◽  
Mariana Brenes-Acuña ◽  
Arianna Castro-Rojas ◽  
Rolando Cordero-Salmerón ◽  
Mary Lopretti-Correa ◽  
...  

Biosensors are measurement devices that can sense several biomolecules, and are widely used for the detection of clinically relevant pathogens such as bacteria and viruses, showing outstanding results. Because of the latent risk of facing another pandemic like the one we are living through due to COVID-19, researchers are constantly working to develop new technologies for the diagnosis and treatment of infections caused by different bacteria and viruses. In this regard, nanotechnology has improved biosensors' design and performance through the development of materials, such as nanoparticles, graphene quantum dots, and electrospun nanofibers, that enhance their affinity, selectivity, and efficacy in detecting these pathogens. Therefore, this work aims to present a comprehensive review of how biosensors work in terms of bacterial and viral detection, and of the nanotechnological features that are contributing to achieving a faster yet still efficient COVID-19 diagnosis at the point-of-care.


2014 ◽  
Vol 536-537 ◽  
pp. 1101-1104
Author(s):  
Jin Xia Diao ◽  
Hai Dong Hu

This paper studies a residual current monitoring system built on a combination of PLC and PC hardware, and identifies a specific method for AC and DC small-signal data acquisition. On the software side, it presents a schematic diagram of the data-processing program and summarizes the real-time data acquisition methods. In the PLC control test procedures, the tests are analyzed and compared against the given process flow diagram of the main test and the numerical results; the proposed system reduces control complexity and improves the automation of the detection process.


Author(s):  
Mohammad Khosrowjerdi ◽  
James Aflaki

PC-based data acquisition systems are used in a wide variety of applications. In laboratories, in field services, and in manufacturing facilities, these systems act as general-purpose measurement and control tools well suited for measuring voltage signals. Teaching and learning experiences may be enhanced by integrating new technologies into the engineering curriculum, particularly in experimental courses. By installing plug-in data acquisition boards, signal-conditioning hardware, and appropriate software, general-purpose computers become enormously flexible virtual instruments with data acquisition and analysis capability. This paper describes a Computer-Aided Testing System which uses a commercially available A/D board to offer users a wide array of measurement and control capabilities. It can be used for making repeated high-speed measurements, controlling motors, or teaching data acquisition.
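The core of turning a PC with a plug-in A/D board into a virtual instrument is the conversion from raw ADC counts to engineering units. As a minimal sketch (the 12-bit resolution and ±10 V range are hypothetical board parameters, not taken from the paper):

```python
def counts_to_volts(count, bits=12, v_min=-10.0, v_max=10.0):
    """Convert a raw ADC count to volts for an ideal n-bit converter
    spanning [v_min, v_max]. Board parameters here are illustrative."""
    full_scale = 2**bits - 1          # 4095 for a 12-bit converter
    return v_min + (count / full_scale) * (v_max - v_min)

# A 12-bit board over ±10 V: zero count maps to -10 V, full scale to +10 V,
# and the mid-scale count lands near 0 V.
samples = [0, 2047, 4095]
volts = [counts_to_volts(c) for c in samples]
```

In a real system this conversion is followed by calibration against the signal-conditioning gain and offset, but the count-to-volts step above is what every such board ultimately performs.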


Author(s):  
Felipe Schlemm Borgli ◽  
Mario Veiga Longa Junior ◽  
Ildemar Pinto Nunes

This technical paper aims to describe the remote operations of the Bolivia-Brazil gas pipeline and to present how they are conducted in the Supervision and Control Center, including the new technologies normally used to operate remotely. The Bolivia-Brazil gas pipeline is the biggest pipeline in Latin America; it is 3,159 km long, of which 2,593 km lie on Brazilian soil. TBG owns the Brazilian trunk line and is responsible for its operation, which starts at the Bolivian border. The pipeline began operating in 1999 and was designed to run with high technology and the minimum field personnel possible. It has a maximum transport capacity of 30 million cubic meters per day. Reaching this level requires operating twelve compression stations simultaneously along the pipeline and delivering gas to customers at more than thirty-six city gates spread across five Brazilian states. From its Head Office in Rio de Janeiro, TBG remotely controls all of the pipeline's operations along its entire length. It surveys the full extension of the pipeline 24 hours a day with two engineers, via satellite, using a SCADA (Supervisory Control and Data Acquisition) system and a range of online and offline simulation software, including leak detection, pig monitoring, gas inventory management, look-ahead, predictive, and trainer modules, thereby increasing operational reliability and improving the decision-making process. Control engineers can perform almost every operation remotely, such as starting up turbines, increasing or decreasing compressor rotation, or even opening and closing valves. VSAT (Very Small Aperture Terminal) technology and IP tunnels through the corporate WAN are both used to exchange data. The whole system now has more than 50,000 tags in its real-time database.
Another important point is that TBG needs no measurement staff in the field to carry out data acquisition. The entire measurement process is performed at the Supervision and Control Room, and the moment data are collected the customer has the final certified volume in hand.


2021 ◽  
Vol 251 ◽  
pp. 04019
Author(s):  
Andrei Kazarov ◽  
Adrian Chitan ◽  
Andrei Kazymov ◽  
Alina Corso-Radu ◽  
Igor Aleksandrov ◽  
...  

The ATLAS experiment at the Large Hadron Collider (LHC) operated very successfully from 2008 to 2018, in two periods known as Run 1 and Run 2. ATLAS achieved an overall data-taking efficiency of 94%, largely constrained by the irreducible dead-time introduced to accommodate the limitations of the detector read-out electronics. Of the 6% dead-time, only about 15% could be attributed to the central trigger and DAQ system, and of that, a negligible fraction was due to the Control and Configuration subsystem. Despite these achievements, and in order to further improve the already excellent efficiency of the whole DAQ system in the coming Run 3, a new campaign of software updates was launched for the second long LHC shutdown (LS2). This paper presents, through a few selected examples, how the work was approached and which new technologies were introduced into the ATLAS Control and Configuration software. Although these are specific to this system, many of the solutions can be adapted to other distributed DAQ systems.


Author(s):  
Xiaoyang Kang ◽  
Hongchang Tian

In parallel with the development of new technologies for predicting and testing the performance and reliability of MEMS, there is a growing need for computer simulation. Besides the quality of the available data, such as temperature, nonlinearities, and the time dependence of material properties, the accuracy of the simulation results depends substantially on the models used. The first part of this work is dedicated to the mathematical and physical models for simulating some typical MEMS assemblies. As part of this work, typical assemblies and suitable models for their graphical and numerical representation are demonstrated, and the results of simulations of most of the relevant properties are reported. The assembly manipulations comprise aligning and inserting under typical service conditions. Furthermore, new understandings of some manipulations are presented, which are important for simplifying the simulation of assembly, and a new model for MEMS numerical simulation is built at the micrometer scale. The assembly manipulations are divided into passive and active actions, which cannot be separated in MEMS assembly systems, and the corresponding models for graphical and numerical simulation are shown. Finally, procedures to predict the performance of MEMS assemblies and interconnections are considered. Using the typical assembly manipulations and the new understandings, it is demonstrated how these models can be utilized in designing MEMS.


2018 ◽  
Vol 38 (6) ◽  
pp. 857-863
Author(s):  
Ligen Yu ◽  
Guanghui Teng ◽  
Gerald L. Riskowski ◽  
Xuzhang Xu ◽  
Wenzhong Guo

Author(s):  
Timothy M. Maul ◽  
James F. Antaki ◽  
Jingchun Wu ◽  
Jeongho Kim ◽  
Marina V. Kameneva ◽  
...  

Mechanical circulatory support for the smallest newborn pediatric patients has historically been limited to extracorporeal membrane oxygenation, which can only provide several days to weeks of full cardiac support, far short of the median waiting time for pediatric heart transplantation of nearly three months [1]. Recently, new technologies have been developed, including the PediaFlow pediatric ventricular assist device, to address this need. The PediaFlow device is a magnetically levitated (mag-lev), mixed-flow turbodynamic blood pump which has been developed in large part in silico using CFD-based inverse design optimization and closed-form rotor dynamics models [2, 3]. Each prototype undergoes a series of in vitro and in vivo tests to verify the accuracy of the simulations in predicting performance and biocompatibility. The overall goal is continued refinement and progress toward an implantable pump that produces 0.3 to 1.5 L/min for up to 6 months in pediatric heart failure patients from 5 to 15 kg. We describe here the design principles and test procedures for the first three prototypes as well as the predicted performance for a fourth prototype currently being prepared for testing (Figure 1).


1972 ◽  
Vol 9 (03) ◽  
pp. 317-332
Author(s):  
E. R. Miller ◽  
W. T. Lindenmuth ◽  
W. E. Lehr ◽  
R. N. Abrahams

The experimental procedures used in the development of oil retention boom design criteria are presented in detail. Emphasis is placed on the procedures used to determine the boom's oil containment ability and the structural loads on it as a function of environmental conditions. The critical scaling parameters for oil containment tests are presented, and the test procedures that have been developed are described. It is concluded that it is critical to scale both the Froude and Weber numbers. Tests to determine structural loads can be conducted using standard ship towing tank procedures; however, it is necessary to scale the elastic properties of the boom and its mooring system. There are uncertainties with respect to some parameters that cannot be properly scaled in oil containment tests, so carefully conducted full-scale trials are required. Available procedures and current plans to obtain full-scale data are presented.
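The difficulty of scaling both the Froude number (Fr = v/√(gL)) and the Weber number (We = ρv²L/σ) at once can be made concrete: under Froude similarity, velocities scale as √λ, so matching We with the same fluids would require the model surface tension to shrink by λ². A minimal sketch with an illustrative 1:25 scale ratio (the numbers are assumptions, not the paper's data):

```python
def froude_weber_scaling(lam, sigma_full, g=9.81):
    """Scale factors under Froude similarity (Fr = v / sqrt(g*L)) for a
    geometric scale ratio lam = L_full / L_model, and the model-scale
    interfacial tension needed to also match the Weber number
    We = rho * v**2 * L / sigma, assuming the same fluids at both scales."""
    velocity_ratio = lam ** 0.5        # v_full / v_model under Froude scaling
    time_ratio = lam ** 0.5            # wave periods scale the same way
    sigma_model = sigma_full / lam**2  # required to keep We equal at model scale
    return velocity_ratio, time_ratio, sigma_model

# Illustrative 1:25 model of an oil-water interface (sigma ~ 0.025 N/m assumed)
v_r, t_r, sig_m = froude_weber_scaling(lam=25.0, sigma_full=0.025)
```

At 1:25 the required model interfacial tension drops to a few hundredths of a millinewton per meter, far below anything achievable with real fluids, which illustrates why some parameters cannot be properly scaled and why full-scale trials are needed.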

