Finite Element Method (FEM) Based Simulation of Continuous Laser Ablation: Surface Temperature and Depth Profile Evolution

Author(s):  
Sandeep Ravi-Kumar
Liangkui Jiang
Hantang Qin

Abstract Laser ablation is widely used for material removal on many types of substrates. Fabricating accurate feature profiles with minimal damage to the surrounding material requires precise control of laser and material parameters. One way to achieve this is to establish a simulation model that supports process control and optimization; however, laser ablation is a complex process that is difficult to model. In this paper, numerical simulation models are established to identify the temperature at the ablation surface and the evolution of the ablation depth profile over time. The ablation is modeled using the Heat Transfer in Solids module in COMSOL Multiphysics with a manual material definition for high-density polyethylene (HDPE). The laser beam is modeled as a continuous heat source using a ramp function, and information for establishing a pulsed laser system is also provided. Results for the surface temperature and depth-profile evolution at various time steps are presented for the ablation of an HDPE sample with a 50 W laser using both models. The next step of our work is to validate the simulation results against experimental data, which would allow these models to predict the ablation crater profile with higher accuracy. This model paves the way for a better understanding of ablation threshold conditions and for identifying ablation initiation in any material whose properties are known.
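As a rough illustration of the modeling approach described above (not the COMSOL model itself), the continuous heat source with a ramp function can be sketched as an explicit 1-D finite-difference solution of the heat equation. The HDPE property values, spot size, and ramp duration below are assumed for illustration only:

```python
import numpy as np

# Approximate HDPE material properties (illustrative values)
k, rho, cp = 0.45, 950.0, 1900.0      # W/(m*K), kg/m^3, J/(kg*K)
alpha = k / (rho * cp)                # thermal diffusivity, m^2/s

# Hypothetical beam parameters: 50 W spread over a 1 mm^2 spot
P, spot_area = 50.0, 1e-6             # W, m^2
q_peak = P / spot_area                # peak surface heat flux, W/m^2
t_ramp = 1e-3                         # ramp duration, s (assumed)

def ramp(t):
    """Ramp function: flux rises linearly to q_peak over t_ramp, then holds."""
    return q_peak * min(t / t_ramp, 1.0)

# 1-D explicit finite-difference grid through the sample depth
L, n = 2e-3, 200                      # 2 mm depth, 200 nodes
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha              # stable time step (FTCS criterion)
T = np.full(n, 293.15)                # initial temperature, K

t = 0.0
for _ in range(500):
    Tn = T.copy()
    # interior nodes: standard FTCS update of the heat equation
    T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2*Tn[1:-1] + Tn[:-2])
    # surface node: ramped laser flux in, conduction into the bulk out
    T[0] = Tn[0] + dt / (rho * cp * dx) * (ramp(t) + k * (Tn[1] - Tn[0]) / dx)
    T[-1] = 293.15                    # far boundary held at ambient
    t += dt

print(f"surface temperature after {t*1e3:.2f} ms: {T[0]:.1f} K")
```

With these assumed values the surface quickly exceeds HDPE's decomposition temperature, which is what drives ablation; a full model like the one in the paper would add phase change and material removal rather than letting the temperature grow unbounded.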

2021
Vol 11 (1)
Author(s):  
Anthony D. Fouad
Alice Liu
Angelica Du
Priya D. Bhirgoo
Christopher Fang-Yen

Abstract Laser microsurgery has long been an important means of assessing the functions of specific cells and tissues. Most laser ablation systems use short, highly focused laser pulses to create plasma-mediated lesions with dimensions on the order of the wavelength of light. While the small size of the lesion enables ablation with high spatial resolution, it also makes it difficult to ablate larger structures. We developed an infrared laser ablation system capable of thermally lesioning tissues with spot sizes tunable by the duration and amplitude of laser pulses. We used our laser system in the roundworm C. elegans to kill single neurons and to sever the dorsal and ventral nerve cords, structures that are difficult to lesion using a plasma-based ablation system. We used these ablations to investigate the source of convulsions in a gain-of-function mutant for the acetylcholine receptor ACR-2. Severing the ventral nerve cord caused convulsions to occur independently anterior and posterior to the lesion, suggesting that convulsions can arise independently from distinct subsets of the motor circuit.


2005
Vol 20 (2)
pp. 117-125
Author(s):  
MICHAEL LUCK
EMANUELA MERELLI

The scope of the Technical Forum Group (TFG) on Agents in Bioinformatics (BIOAGENTS) was to inspire collaboration between the agent and bioinformatics communities, with the aim of creating an opportunity to propose a different (agent-based) approach to the development of computational frameworks, both for data analysis in bioinformatics and for system modelling in computational biology. During the day, the participants examined the future of research on agents in bioinformatics, primarily through 12 invited talks selected to cover the most relevant topics. From the discussions, it became clear that there are many perspectives on the field, ranging from bio-conceptual languages for agent-based simulation, to the definition of bio-ontology-based declarative languages for use by information agents, and to the use of Grid agents, each of which requires further exploration. The interactions between participants encouraged the development of applications that describe a way of creating agent-based simulation models of biological systems, starting from a hypothesis and inferring new knowledge (or relations) by mining and analysing the huge amount of public biological data. In this report we summarize and reflect on the presentations and discussions.


2019
Vol 10 (1)
Author(s):  
Roger D Magarey
Thomas M Chappell
Christina M Trexler
Godshen R Pallipparambil
Ernie F Hain

Abstract Integrated pest management (IPM) is a valuable tool for reducing pesticide use and for pesticide resistance management. Despite the success of IPM over the last 50 yr, significant challenges remain to improving IPM delivery and adoption. We believe that insights can be obtained from the field of Social Ecological Systems (SES). We first describe the complexity of crop pest management and how various social actors influence grower decision making, including adoption of IPM. Second, we discuss how crop pest management fits the definition of an SES, including such factors as scale, dynamic complexities, critical resources, and important social–ecological interactions. Third, we describe heuristics and simulation models as tools to understand complex SES and develop new strategies. Finally, we conclude with a brief discussion of how social processes and SES techniques could improve crop pest management in the future, including the delivery of IPM, while reducing negative social and environmental impacts.


2012
Vol 51 (8)
pp. 1519-1530
Author(s):  
Iryna Tereshchenko
Alexander N. Zolotokrylin
Tatiana B. Titkova
Luis Brito-Castillo
Cesar Octavio Monzon

Abstract The authors explore a new approach to monitoring desertification, based on results relating albedo and surface temperature for the Sonoran Desert in northwestern Mexico. Criteria for the predominance of radiation were determined using threshold values of the Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS) normalized difference vegetation index (NDVI). The radiative mechanism of surface temperature regulation, and the corresponding AVHRR and MODIS NDVI thresholds, have an objective justification in the surface energy budget: over desertified ground, radiation dominates the regulation of surface temperature relative to evapotranspiration. Changes in the extent of arid regions with AVHRR NDVI of <0.08 and MODIS NDVI of <0.10 can therefore be considered characteristic of the evolution of desertification in the Sonoran Desert region, because the length of the period in a given year during which the radiation factor predominates is important for the desertification process.
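The threshold classification described above can be sketched as follows. The NDVI arrays are synthetic stand-ins for real AVHRR/MODIS composites; only the 0.08 and 0.10 threshold values come from the abstract:

```python
import numpy as np

# Hypothetical NDVI grids; real inputs would be AVHRR and MODIS composites
rng = np.random.default_rng(0)
ndvi_avhrr = rng.uniform(-0.05, 0.5, size=(4, 4))
ndvi_modis = rng.uniform(-0.05, 0.5, size=(4, 4))

# Threshold values from the abstract: below these, radiation predominates
AVHRR_THRESHOLD, MODIS_THRESHOLD = 0.08, 0.10

arid_avhrr = ndvi_avhrr < AVHRR_THRESHOLD
arid_modis = ndvi_modis < MODIS_THRESHOLD

# Fractional extent of the radiation-dominated (arid) area in each product;
# tracking this fraction year over year traces the evolution of desertification
print("AVHRR arid fraction:", arid_avhrr.mean())
print("MODIS arid fraction:", arid_modis.mean())
```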


1994
Vol 34 (1)
pp. 513
Author(s):  
P.V. Hinton
M.G. Cousins
P.E.Symes

The central fields area of the Gippsland Basin, Australia, includes the Halibut, Cobia, Fortescue, and Mackerel oil fields. These large fields are mature, with about 80% of reserves produced. During 1991 and 1992 a multidisciplinary study, integrating the latest technology, was completed to help optimise depletion of the remaining significant reserves.

A grid of 4500 km of high-resolution 3D seismic data covering 191 square kilometres allowed the identification of subtle structural traps as well as better definition of sandstone truncation edges, which represent the ultimate drainage points. In addition, the latest techniques in seismic attribute analysis provided insight into depositional environments, seal potential, and facies distribution. Sequence-stratigraphic concepts were used in combination with seismic data to build complex multi-million-cell 3D geological models. Reservoir simulation models were then constructed to history match past production and to predict future field performance. Facility studies were also undertaken to optimise depletion strategies.

The Central Fields Depletion Study has resulted in recommendations to further develop the fields with about 80 workovers, 50 infill wells, reductions in separator pressures, and gas-lift and water-handling facility upgrades. These activities are expected to increase ultimate reserves and production. Some of the recommendations have already been implemented, with initial additional drilling on Mackerel increasing platform production from 22,000 BOPD to over 50,000 BOPD. An ongoing program of additional drilling from the four platforms is expected to continue for several years.


Author(s):  
Jaehyun Kim
David Wallace

Numerous collaborative design tools have been developed to accelerate product development, and recently environments for building distributed simulations have been proposed. For example, a simulation framework called DOME (Distributed Object-oriented Modeling and Evaluation) has been developed at MIT CADLAB. DOME is unique in its decentralized structure, which allows heterogeneous simulations to be stitched together while allowing proprietary information and simulation models to remain secure with each participant. While such an approach offers many advantages, it also hides causality and sensitivity information, making it difficult for designers to understand problem structure and verify solutions. The purpose of this research is to analyze the relationships between design parameters (causality) and the strength of those relationships (sensitivity) in decentralized web-based design simulation. Algorithms and implementations for the causality and sensitivity analysis are introduced. Causality is determined using Granger's definition, which distinguishes causation from association using the conditional variance of the suspected output variable. Sensitivity is estimated by linear regression analysis and by a perturbation method that transfers the problem into the frequency domain by generating periodic perturbations. Varying Internet latency and disturbances are problematic for these methods; thus, new algorithms are developed and tested to overcome these problems.
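A minimal sketch of the Granger-style causality test described above, comparing the conditional (residual) variance of a restricted and a full least-squares model on synthetic series. The coupling coefficients and the 10% variance-reduction cutoff below are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic design parameters: x drives y with a one-step lag (plus noise)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t-1] + 0.8 * x[t-1] + 0.1 * rng.normal()

def residual_variance(target, regressors):
    """Least-squares fit; return the variance of the residuals."""
    X = np.column_stack(regressors + [np.ones(len(target))])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

# Restricted model: y explained by its own past only
var_restricted = residual_variance(y[1:], [y[:-1]])
# Full model: y explained by its own past and x's past
var_full = residual_variance(y[1:], [y[:-1], x[:-1]])

# Granger's criterion: x "causes" y if conditioning on x's past shrinks
# the conditional variance of y (10% cutoff chosen arbitrarily here)
print(f"restricted var = {var_restricted:.3f}, full var = {var_full:.3f}")
print("x Granger-causes y:", var_full < 0.9 * var_restricted)
```

In the distributed setting the abstract describes, the same comparison would be run over parameter traces collected across the network, which is where latency and disturbances complicate the estimate.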


Author(s):  
Paulo Camargo Silva
Virgílio José Martins Ferreira Filho

In the recent history-matching literature, the non-uniqueness of reservoir simulation models has been considered a difficult problem, and complex workflows have been proposed to solve it. However, uncertainty can only be reduced by defining probability density functions, which is highly costly. In this article we introduce a methodology to reduce uncertainty in history matching using Monte Carlo techniques performed on proxies of the reservoir simulator. The methodology compares probability density functions across different reservoir simulation models to determine which simulation model provides the more appropriate match.
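A minimal sketch of the Monte Carlo-on-proxy idea, assuming a hypothetical one-parameter analytic proxy in place of the reservoir simulator; all functions and numbers below are illustrative, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical proxy for the reservoir simulator: production rate as a cheap
# analytic function of one uncertain parameter k (e.g. a permeability multiplier)
times = np.linspace(1.0, 10.0, 10)
def proxy(k):
    return 100.0 * k * np.exp(-0.1 * k * times)   # stand-in response surface

# Synthetic "observed" production history, generated with k_true = 1.5
obs = proxy(1.5) + rng.normal(0.0, 2.0, size=times.size)

# Monte Carlo over the prior: sample k, evaluate the proxy, weight by misfit
k_prior = rng.uniform(0.5, 3.0, size=5000)
misfit = np.array([np.sum((proxy(k) - obs) ** 2) for k in k_prior])
weights = np.exp(-0.5 * misfit / 2.0**2)          # Gaussian likelihood
weights /= weights.sum()

# Posterior summary: the spread in k should shrink relative to the prior,
# which is the uncertainty reduction the history match is after
k_post_mean = np.sum(weights * k_prior)
k_post_std = np.sqrt(np.sum(weights * (k_prior - k_post_mean) ** 2))
print(f"prior std = {k_prior.std():.3f}, posterior std = {k_post_std:.3f}")
```

Because the proxy is cheap, thousands of samples are affordable, and the same weighting can be repeated for competing simulation models to compare their resulting probability density functions.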


2019
Vol 10 (1)
Author(s):  
Aurélien Mounier
Marta Mirazón Lahr

Abstract The origin of Homo sapiens remains a matter of debate. The extent and geographic patterning of morphological diversity among Late Middle Pleistocene (LMP) African hominins is largely unknown, thus precluding the definition of boundaries of variability in early H. sapiens and the interpretation of individual fossils. Here we use a phylogenetic modelling method to predict possible morphologies of a last common ancestor of all modern humans, which we compare to LMP African fossils (KNM-ES 11693, Florisbad, Irhoud 1, Omo II, and LH18). Our results support a complex process for the evolution of H. sapiens, with the recognition of different, geographically localised, populations and lineages in Africa – not all of which contributed to our species’ origin. Based on the available fossils, H. sapiens appears to have originated from the coalescence of South and, possibly, East-African source populations, while North-African fossils may represent a population which introgressed into Neandertals during the LMP.

