Revamping, Energy Efficiency, and Exergy Analysis of an Existing Upstream Gas Treatment Facility

2011 ◽  
Vol 133 (1) ◽  
Author(s):  
Michele Margarone ◽  
Stefano Magi ◽  
Giuseppe Gorla ◽  
Stefano Biffi ◽  
Paolo Siboni ◽  
...  

Surface oil and gas treatment facilities in service for decades are likely to be oversized due to the natural depletion of their reservoirs. Although these plants may have been designed modularly, meaning they comprise multiple identical units serving the same task, such units often operate in conditions far from the design point. This work analyzes the revamping options of an existing upstream gas facility, chosen because it is representative of a wide set of plants. It presents a flexible process simulation model, implemented in the HYSYS environment and dynamically linked to an Excel spreadsheet, which includes the performance maps of all turbomachines and the main characteristics of the investigated modifications. The model can be used to run simulations for various gas input conditions and to predict performance over one year of operation and under different possible future scenarios. The first objective is to assess the considered options economically; since the reservoir is mature, they should be applied only if they yield short payback times. Moreover, all options are evaluated using a figure of merit, defined here, that compares the overall energy consumption to that calculated with state-of-the-art technologies. In addition, exergy and environmental analyses are performed.
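The abstract does not give the exact definition of the figure of merit; a minimal sketch, assuming it is the ratio of the plant's actual overall energy consumption to that of a hypothetical state-of-the-art reference performing the same duty (the function name and units are illustrative):

```python
def energy_figure_of_merit(actual_kwh, reference_kwh):
    """Ratio of actual energy consumption to that of a state-of-the-art
    reference plant performing the same duty. Values close to 1 indicate
    near-best-available efficiency; larger values indicate room to revamp.
    """
    if reference_kwh <= 0:
        raise ValueError("reference consumption must be positive")
    return actual_kwh / reference_kwh

# Example: a unit consuming 1.3 GWh/yr against a 1.0 GWh/yr benchmark
print(energy_figure_of_merit(1.3e6, 1.0e6))  # 1.3
```

A ratio like this lets options with different capacities be compared on the same scale, which is presumably why the paper normalizes against state-of-the-art technology rather than reporting absolute consumption.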


2009 ◽  
pp. 18-31
Author(s):  
G. Rapoport ◽  
A. Guerts

In this article, the global crisis of 2008-2009 is considered as a superposition of several regional crises that occurred simultaneously but for different reasons. However, they have something in common: developed countries tend to maintain a high level of social security without increasing real production output. On the one hand, this policy has resulted in trade deficits and the partial destruction of market mechanisms. On the other hand, it has clashed with the desire of several oil- and gas-exporting countries to receive an exclusive price for their energy resources.


2021 ◽  
Vol 16 (1) ◽  
Author(s):  
Jens Zentgraf ◽  
Sven Rahmann

Abstract Motivation With an increasing number of patient-derived xenograft (PDX) models being created and subsequently sequenced to study tumor heterogeneity and to guide therapy decisions, there is a similarly increasing need for methods to separate reads originating from the graft (human) tumor and reads originating from the host species' (mouse) surrounding tissue. Two kinds of methods are in use: on the one hand, alignment-based tools require that reads first be mapped and aligned (by an external mapper/aligner) to the host and graft genomes separately; the tool itself then processes the resulting alignments and quality metrics (typically BAM files) to assign each read or read pair. On the other hand, alignment-free tools work directly on the raw read data (typically FASTQ files). Recent studies compare the different approaches and tools, with varying results. Results We show that alignment-free methods for xenograft sorting are superior in CPU time usage and equivalent in accuracy. We improve upon the state of the art in sorting by presenting a fast, lightweight approach based on three-way bucketed quotiented Cuckoo hashing. Our hash table requires memory comparable to an FM index typically used for read alignment, and less than other alignment-free approaches. It allows extremely fast lookups and uses less CPU time than other alignment-free methods, and than alignment-based methods at similar accuracy. Several engineering steps (e.g., shortcuts for unsuccessful lookups, software prefetching) improve performance even further. Availability Our software xengsort is available under the MIT license at http://gitlab.com/genomeinformatics/xengsort. It is written in numba-compiled Python and comes with sample Snakemake workflows for hash table construction and dataset processing.
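A minimal sketch of the data structure the abstract names, three-way bucketed Cuckoo hashing, not the xengsort implementation itself: a real Cuckoo table evicts and relocates items on overflow, and xengsort additionally applies quotienting to store only part of each key. The hash derivation, table sizes, and species labels below are illustrative assumptions.

```python
import hashlib

NBUCKETS = 1024
BUCKET_SIZE = 4          # slots per bucket ("bucketed")

table = [[] for _ in range(NBUCKETS)]

def _bucket(key, i):
    # derive three pseudo-independent hash functions from one keyed digest
    d = hashlib.blake2b(f"{i}:{key}".encode()).digest()
    return int.from_bytes(d[:8], "big") % NBUCKETS

def insert(key, value):
    # simplified: try the three candidate buckets in order; a real Cuckoo
    # hash table would evict and relocate a resident item on overflow
    for i in range(3):
        bucket = table[_bucket(key, i)]
        if len(bucket) < BUCKET_SIZE:
            bucket.append((key, value))
            return True
    return False

def lookup(key):
    # at most three bucket probes per query, each scanning <= 4 slots
    for i in range(3):
        for k, v in table[_bucket(key, i)]:
            if k == key:
                return v
    return None

insert("ACGTACGT", "graft")   # e.g. a k-mer seen only in the human genome
insert("TTGGCCAA", "host")    # e.g. a k-mer seen only in the mouse genome
print(lookup("ACGTACGT"))     # graft
```

The appeal of the three-way bucketed layout is that every lookup touches a small, bounded number of memory locations, which is what makes the software-prefetching trick mentioned in the abstract effective.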


2021 ◽  
Vol 52 (1) ◽  
Author(s):  
Fabienne Archer ◽  
Alexandra Bobet-Erny ◽  
Maryline Gomes

Abstract The number and severity of diseases affecting lung development and adult respiratory function have stimulated great interest in developing new in vitro models to study the lung in different species. Recent breakthroughs in 3-dimensional (3D) organoid cultures have led to new physiological in vitro models that mimic the lung better than conventional 2D cultures. Lung organoids simulate multiple aspects of the real organ, making them promising and useful models for studying organ development, function and disease (infection, cancer, genetic disease). Due to their dynamics in culture, they can serve as a sustainable source of functional cells (biobanking) and be manipulated genetically. Given the differences between species regarding developmental kinetics, the maturation of the lung at birth, the distribution of the different cell populations along the respiratory tract and species barriers for infectious diseases, there is a need for species-specific lung models capable of mimicking mammalian lungs, as these are of great interest for animal health and production, following the One Health approach. This paper reviews the latest developments in the growing field of lung organoids.


Database ◽  
2021 ◽  
Vol 2021 ◽  
Author(s):  
Yifan Shao ◽  
Haoru Li ◽  
Jinghang Gu ◽  
Longhua Qian ◽  
Guodong Zhou

Abstract Extraction of causal relations between biomedical entities in the form of Biological Expression Language (BEL) poses a new challenge to the community of biomedical text mining due to the complexity of BEL statements. We propose a simplified form of BEL statements [Simplified Biological Expression Language (SBEL)] to facilitate BEL extraction and employ BERT (Bidirectional Encoder Representation from Transformers) to improve the performance of causal relation extraction (RE). On the one hand, BEL statement extraction is transformed into the extraction of an intermediate form, the SBEL statement, which is then further decomposed into two subtasks: entity RE and entity function detection. On the other hand, we use a powerful pretrained BERT model both to extract entity relations and to detect entity functions, aiming to improve the performance of the two subtasks. Entity relations and functions are then combined into SBEL statements and finally merged into BEL statements. Experimental results on the BioCreative-V Track 4 corpus demonstrate that our method achieves state-of-the-art performance in BEL statement extraction, with F1 scores of 54.8% in the Stage 2 evaluation and 30.1% in the Stage 1 evaluation. Database URL: https://github.com/grapeff/SBEL_datasets
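A hypothetical sketch of the final combination step the abstract describes, assembling an extracted entity relation and the two detected entity functions into a BEL-style statement. The grammar, namespace identifiers, and function abbreviations (e.g. `p` for protein abundance) are illustrative assumptions, not the paper's exact SBEL definition.

```python
def combine_to_bel(subj, subj_fn, relation, obj, obj_fn):
    """Merge the outputs of the two subtasks (entity relation extraction
    and entity function detection) into one BEL-style statement.
    Grammar shown here is illustrative only."""
    return f"{subj_fn}({subj}) {relation} {obj_fn}({obj})"

# hypothetical subtask outputs for one sentence
stmt = combine_to_bel("HGNC:TP53", "p", "increases", "HGNC:MDM2", "p")
print(stmt)  # p(HGNC:TP53) increases p(HGNC:MDM2)
```

Decomposing extraction this way means each BERT model solves a flat classification problem, and the structured BEL output is reconstructed deterministically afterwards.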


2019 ◽  
Vol 36 (2) ◽  
pp. 60-69
Author(s):  
Paul H Cleverley ◽  
Simon Burnett

Enterprise search is changing. The explosion of information within organizations, technological advances and the availability of free open-source machine learning libraries offer many possibilities. Eighteen informants from practice, academia, search technology vendors and large organizations (oil and gas, government, pharmaceuticals, aerospace and retail) were interviewed to assess challenges and future directions. The findings confirmed the existence of the 'Google Habitus' and of technology propaganda, and a need to transcend disciplines in a systems-thinking approach to enterprise search, encompassing information management, user search literacy, governance and learning feedback loops as well as technology. A novel four-level model of enterprise search use cases is presented, covering search as a utility, search as an answer machine, search task apps and a discovery engine. This could be used to reframe perceptions of enterprise search, expanding possibilities and improving business outcomes.


1998 ◽  
Vol 08 (01) ◽  
pp. 21-66 ◽  
Author(s):  
W. M. P. VAN DER AALST

Workflow management promises a new solution to an age-old problem: controlling, monitoring, optimizing and supporting business processes. What is new about workflow management is the explicit representation of the business process logic, which allows for computerized support. This paper discusses the use of Petri nets in the context of workflow management. Petri nets are an established tool for modeling and analyzing processes. On the one hand, Petri nets can be used as a design language for the specification of complex workflows. On the other hand, Petri net theory provides powerful analysis techniques which can be used to verify the correctness of workflow procedures. This paper introduces workflow management as an application domain for Petri nets, presents state-of-the-art results with respect to the verification of workflows, and highlights some Petri-net-based workflow tools.
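The core of the Petri-net formalism the abstract refers to is the token-based firing rule, which can be sketched in a few lines (the marking/transition encoding below is an illustrative choice, not tied to any particular workflow tool):

```python
# Minimal Petri-net firing rule: a transition is enabled iff every input
# place holds at least one token; firing consumes one token per input
# place and produces one token per output place.
def enabled(marking, transition):
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    inputs, outputs = transition
    assert enabled(marking, transition)
    m = dict(marking)                 # markings are immutable snapshots
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# One step of a toy workflow: a received case is handled
t_handle = (["received"], ["handled"])
print(fire({"received": 1}, t_handle))  # {'received': 0, 'handled': 1}
```

Because both the workflow state (the marking) and the process logic (the transitions) are explicit data, verification techniques such as reachability analysis can be run mechanically over them, which is exactly the advantage the paper exploits.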


2021 ◽  
Vol 7 (4) ◽  
pp. 1-24
Author(s):  
Douglas Do Couto Teixeira ◽  
Aline Carneiro Viana ◽  
Jussara M. Almeida ◽  
Mário S. Alvim

Predicting mobility-related behavior is an important yet challenging task. On the one hand, factors such as one's routine or preferences for a few favorite locations may help in predicting their mobility. On the other hand, several contextual factors, such as variations in individual preferences, weather, traffic, or even a person's social contacts, can affect mobility patterns and make their modeling significantly more challenging. A fundamental approach to studying mobility-related behavior is to assess how predictable such behavior is, deriving theoretical limits on the accuracy that a prediction model can achieve given a specific dataset. This approach focuses on the inherent nature and fundamental patterns of human behavior captured in that dataset, filtering out factors that depend on the specificities of the prediction method adopted. However, the current state-of-the-art method to estimate predictability in human mobility suffers from two major limitations: low interpretability and difficulty in incorporating external factors that are known to help mobility prediction (i.e., contextual information). In this article, we revisit this state-of-the-art method, aiming to tackle these limitations. Specifically, we conduct a thorough analysis of how this widely used method works by looking into two different metrics that are easier to understand and, at the same time, capture reasonably well the effects of the original technique. We evaluate these metrics in the context of two different mobility prediction tasks, notably next-cell and next-distinct-cell prediction, which have different degrees of difficulty. Additionally, we propose alternative strategies to incorporate different types of contextual information into the existing technique. Our evaluation of these strategies offers quantitative measures of the impact of adding context to the predictability estimate, revealing the challenges associated with doing so in practical scenarios.
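The state-of-the-art predictability method the abstract revisits is, to my understanding, the entropy-based bound in which the maximum achievable prediction accuracy Π solves the Fano-type equation H(Π) + (1 − Π) log2(N − 1) = S, where S is the entropy of the mobility sequence and N the number of distinct locations. A minimal sketch solving it by bisection (function names are my own):

```python
import math

def _binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def max_predictability(entropy, n_locations):
    """Solve H(pi) + (1 - pi) * log2(N - 1) = S for pi by bisection.
    Requires 0 <= entropy <= log2(n_locations); the left-hand side is
    monotonically decreasing in pi on [1/N, 1]."""
    def f(pi):
        return (_binary_entropy(pi)
                + (1 - pi) * math.log2(n_locations - 1)
                - entropy)
    lo, hi = 1.0 / n_locations, 1.0 - 1e-12
    for _ in range(100):
        mid = (lo + hi) / 2
        if f(mid) > 0:      # achievable entropy still above S: raise pi
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

The bound is tight but opaque: it says nothing about *which* behavioral regularities produce a given Π, which is the interpretability gap the article's simpler metrics aim to close.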


2021 ◽  
Vol 30 (5) ◽  
pp. 58-65
Author(s):  
A. Yu. Shebeko ◽  
Yu. N. Shebeko ◽  
A. V. Zuban

Introduction. The GOST R 12.3.047-2012 standard offers a methodology for determining the required fire resistance limits of engineering structures, based on a comparison of the fire resistance limit with the equivalent fire duration. In practice, however, cases occur where, in the absence of regulatory fire resistance requirements, a facility owner who has relaxed the fire resistance requirements prescribed by GOST R 12.3.047-2012 is ready to accept the potential loss of the facility in a fire for economic reasons. In this case, one can apply the probability of safe evacuation and rescue to compare the distributions of fire resistance limits, on the one hand, and of evacuation and rescue times, on the other.

A methodology for the identification of required fire resistance limits. The probabilistic method for identifying required fire resistance limits, published in [1], was tested in this study. This method differs from the one specified in GOST R 12.3.047-2012: it is based on a comparison of the distributions of two random values, the estimated time of evacuation or rescue in case of fire at a production facility and the fire resistance limits of engineering structures.

Calculations of required fire resistance limits. This article presents an application of the proposed method to the rescue of people, using the results of full-scale experiments involving a real pipe rack at a gas processing plant [2].

Conclusions. The required fire resistance limits for the pipe rack structures of a gas processing plant were identified. The calculations took account of the time needed to evacuate and rescue the personnel, as well as the pre-set reliability of the structures, given that the personnel evacuation and rescue time in case of fire was identified experimentally.
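The comparison of the two random values reduces to estimating P(fire resistance limit > evacuation/rescue time). A minimal Monte Carlo sketch; the distribution shapes and parameters below are illustrative assumptions, not values from the article or its experiments:

```python
import random

def p_safe_evacuation(n=100_000, seed=1):
    """Monte Carlo estimate of P(fire resistance limit > evacuation time).
    Illustrative assumptions: resistance ~ Normal(45, 5) minutes,
    evacuation time ~ Lognormal with median ~20 minutes."""
    rng = random.Random(seed)
    safe = 0
    for _ in range(n):
        resistance = rng.normalvariate(45.0, 5.0)    # minutes
        evacuation = rng.lognormvariate(3.0, 0.25)   # e**3 ~ 20 min median
        if resistance > evacuation:
            safe += 1
    return safe / n

print(round(p_safe_evacuation(), 3))
```

The required fire resistance limit is then the smallest resistance distribution (e.g. the smallest mean, for a fixed spread) for which this probability meets the pre-set reliability target.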

