Multi-Component Separation and Purification of Natural Gas

Author(s):  
David Tamburello ◽  
Bruce Hardy ◽  
Martin Sulic

Over the past decade, several technical developments (such as hydraulic fracturing) have led to an exponential increase in the discovery of new domestic natural gas reserves. Raw natural gas composition can vary substantially from source to source. Typically, methane accounts for 75% to 95% of the total gas, with the remainder comprising ethane, propane, butane, other higher hydrocarbons, and impurities, the most common being H2O, CO2, N2, and H2S. All natural gas requires some treatment, if only to remove H2O; however, the composition of natural gas delivered to commercial pipeline grids is tightly controlled. Sub-quality natural gas reserves, defined as fields containing more than 2% CO2, 4% N2, or 4 ppm H2S, make up nearly half of the world’s natural gas volume. The development of sub-quality, remote, and unconventional fields (e.g. landfill gas) presents new challenges to gas separation and purification methods. Adsorbent technologies, such as activated carbons, zeolites, or metal-organic frameworks (MOFs), may hold the key to more efficient and economically viable separation methods. This work proposes to prove the applicability of the multi-component potential theory of adsorption (MPTA) to a real-world natural gas adsorbent system, properly characterizing the adsorbent’s selectivity for an individual gas component using only the single-component isotherms. Thus, the real-world gas separation/purification suitability of a specific adsorbent for a given gas stream can be evaluated simply and effectively, deferring large experimental efforts and costly system modifications until after an initial computational screening of prospective materials has been completed. While the current research effort uses natural gas, the world’s largest industrial gas separation application, to validate the MPTA, the tools gained through this effort can be applied to other gas separation efforts.
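The screening idea described above — predicting mixture selectivity from single-component isotherm fits alone — can be illustrated with a minimal sketch. Note this sketch uses simple Langmuir fits and the extended-Langmuir mixing rule as a stand-in, not the MPTA itself, and all parameter values below are hypothetical:

```python
# Illustrative sketch only: estimating CO2/CH4 selectivity from
# single-component Langmuir isotherm fits via the extended-Langmuir
# mixing rule. The study itself uses the multi-component potential
# theory of adsorption (MPTA); all parameters here are hypothetical.

def langmuir_loading(p, q_sat, b):
    """Single-component Langmuir loading: q(p) = q_sat * b*p / (1 + b*p)."""
    return q_sat * b * p / (1.0 + b * p)

def extended_langmuir(partial_pressures, params):
    """Mixture loadings q_i = q_sat_i * b_i * p_i / (1 + sum_j b_j * p_j)."""
    denom = 1.0 + sum(params[g][1] * p for g, p in partial_pressures.items())
    return {g: params[g][0] * params[g][1] * p / denom
            for g, p in partial_pressures.items()}

# Hypothetical single-component fit parameters: (q_sat in mol/kg, b in 1/bar)
params = {"CO2": (5.0, 0.60), "CH4": (4.0, 0.15)}

# Hypothetical raw gas: 10 bar total, 10% CO2 / 90% CH4
y = {"CO2": 0.10, "CH4": 0.90}
p_total = 10.0
loadings = extended_langmuir({g: y[g] * p_total for g in y}, params)

# Adsorption selectivity S = (x_CO2 / x_CH4) / (y_CO2 / y_CH4)
S = (loadings["CO2"] / loadings["CH4"]) / (y["CO2"] / y["CH4"])
print(f"CO2/CH4 selectivity: {S:.2f}")  # → CO2/CH4 selectivity: 5.00
```

Under the extended-Langmuir rule the selectivity reduces to (q_sat,CO2·b_CO2)/(q_sat,CH4·b_CH4) and is composition-independent; the appeal of MPTA is precisely that it predicts mixture behavior more faithfully than such simple mixing rules while still requiring only single-component data.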

2020 ◽  
Vol 49 (11) ◽  
pp. 3553-3561 ◽  
Author(s):  
Zhenzhen Jiang ◽  
Ying Zou ◽  
Tingting Xu ◽  
Lihui Fan ◽  
Ping Zhou ◽  
...  

A cage-based MOF displays excellent hydrolytic stability as well as promising potential for diverse gas separation applications.


2021 ◽  
Vol 86 ◽  
pp. 103740
Author(s):  
Maria S. Sergeeva ◽  
Nikita A. Mokhnachev ◽  
Dmitry N. Shablykin ◽  
Andrey V. Vorotyntsev ◽  
Dmitriy M. Zarubin ◽  
...  

2021 ◽  
Author(s):  
Charles Okafor ◽  
Patrick Verdin ◽  
Phill Hart

Downhole Natural Gas Separation Efficiency (NGSE) is flow-regime dependent, and current analytical models lack accuracy under certain conditions. Downhole NGSE was investigated through 3D Computational Fluid Dynamics (CFD) transient simulations for pumping wells in the churn flow regime. The Volume of Fluid (VOF) multiphase model was considered along with the k–ε turbulence model for most simulations. A mesh independence study was performed, and the final model results were validated against experimental data, showing an average error of less than 6%. Numerical simulation results showed that the steady-state assumption used by current mathematical models for churn flow can be inaccurate. Several key parameters affecting the NGSE were identified, and suggestions for key improvements to the widely used mathematical formulations for viscous flow were provided. Sensitivity studies were conducted on fluid/geometric parameters and operating conditions to gain a better understanding of the influence of each parameter on NGSE. These are important results, as they equip the ESP engineer with additional knowledge to maximise the NGSE from the design stage through pumping operations.


Processes ◽  
2021 ◽  
Vol 9 (9) ◽  
pp. 1568
Author(s):  
Federico Galli ◽  
Jun-Jie Lai ◽  
Jacopo De Tommaso ◽  
Gianluca Pauletto ◽  
Gregory S. Patience

Methane is the second-highest contributor to the greenhouse effect. Its global warming potential is 37 times that of CO2. Flaring the associated natural gas from remote oil reservoirs is currently the only economical alternative. Gas-to-liquid (GtL) technologies first convert natural gas into syngas, then convert the syngas into liquids such as methanol, Fischer–Tropsch fuels or dimethyl ether. However, studies on the influence of feedstock composition are sparse, which also poses technical design challenges. Here, we present a techno-economic analysis of a micro-refinery unit (MRU) that partially oxidizes methane-rich feedstocks and polymerizes the syngas formed via the Fischer–Tropsch reaction. We consider three methane-containing feed gases: natural gas, biogas, and landfill gas. The FT fuel selling price is critical to the economics of the unit. A Monte Carlo simulation assesses the influence of the feed composition on the final product quantity as well as on the capital and operating expenses. The Aspen Plus simulation and Python calculate the net present value and payback time of the MRU for different price scenarios. The CO2 content in biogas and landfill gas limits the CO/H2 ratio to 1.3 and 0.9, respectively, which increases the olefin content of the final product. Compressors are the main source of capital cost, while labor represents 20–25% of the variable cost. An analysis of the impact of plant scale demonstrated that larger capacities represent a more favorable business model for this unit. A minimum production of 7,300,000 kg y⁻¹ is required for the MRU to have a positive net present value after 10 years when natural gas is the feedstock.
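The Monte Carlo net-present-value screening this abstract describes can be sketched in a few lines of Python. This is a minimal illustration only: the discount rate, capital cost, price ranges, and cash-flow model below are hypothetical placeholders, not values from the study; only the 7,300,000 kg/y production figure comes from the abstract.

```python
# Minimal sketch of a Monte Carlo NPV screen for a small gas-to-liquid
# unit. All economic parameters are hypothetical; only the production
# rate (7.3e6 kg/y) is taken from the abstract.
import random

def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_npv(n_trials=10_000, years=10, rate=0.08, seed=42):
    random.seed(seed)
    results = []
    for _ in range(n_trials):
        capex = -15.0e6                          # $, hypothetical capital cost
        fuel_price = random.uniform(0.5, 0.9)    # $/kg FT fuel, hypothetical
        production = 7.3e6                       # kg/y, from the abstract
        opex_frac = random.uniform(0.35, 0.45)   # opex as fraction of revenue
        yearly = fuel_price * production * (1.0 - opex_frac)
        results.append(npv([capex] + [yearly] * years, rate))
    return results

results = simulate_npv()
positive_fraction = sum(r > 0 for r in results) / len(results)
print(f"P(NPV > 0 after 10 y): {positive_fraction:.2f}")
```

Sampling the uncertain inputs (here the fuel price and operating-cost fraction) and reporting the distribution of NPV outcomes, rather than a single point estimate, is the essence of the approach the study applies to feedstock composition and price scenarios.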


Author(s):  
Tru H. Cao

For modeling real-world problems and constructing intelligent systems, the integration of different methodologies and techniques has been the quest and focus of significant interdisciplinary research effort. The advantage of such a hybrid system is that the strengths of its partners are combined and are complementary to each other’s weaknesses. In particular, object orientation provides a hierarchical data abstraction scheme and a mechanism for information hiding and inheritance. However, the classical object-oriented data model cannot deal with the uncertainty and imprecision pervasive in real-world problems. Meanwhile, probability theory and fuzzy logic provide measures and rules for representing and reasoning with uncertainty and imprecision. That has led to intensive research and development of fuzzy and probabilistic object-oriented databases, as collectively reported in De Caluwe (1997), Ma (2005), and Marín & Vila (2007).

