Fast and Versatile Chromatography Process Design and Operation Optimization with the Aid of Artificial Intelligence

Processes, 2021, Vol 9 (12), pp. 2121
Author(s): Mourad Mouellef, Florian Lukas Vetter, Steffen Zobel-Roos, Jochen Strube

Preparative and process chromatography is a versatile unit operation for the capture, purification, and polishing of a broad variety of molecules, especially very similar and complex compounds such as sugars, isomers, enantiomers, diastereomers, plant extracts, and metal ions such as rare earth elements. Another steadily growing field of application is biochromatography, with a diversity of complex compounds such as peptides, proteins, mAbs, fragments, VLPs, and even mRNA vaccines. Aside from molecular diversity, separation mechanisms range from selective affinity ligands to hydrophobic interaction, ion exchange, and mixed modes. Biochromatography is operated at scales from a few kilograms to 100,000 tons annually, with column diameters of about 20 to 250 cm. Hence, a versatile and fast tool is needed for process design as well as operation optimization and process control. Existing process modeling approaches are hampered by the sophisticated laboratory-scale experimental setups required for model parameter determination and model validation. For broader application in daily project work, the approach has to be faster and require less effort from non-chromatography experts. Extensive advances in the field of artificial intelligence have produced new methods to address this need. This paper proposes an artificial neural network (ANN)-based approach that identifies competitive Langmuir-isotherm parameters of arbitrary three-component mixtures on a previously specified column. This is realized by training an ANN with simulated chromatograms varying in isotherm parameters. In contrast to traditional parameter estimation techniques, the estimation time is reduced to milliseconds, and the need for expert or prior knowledge to obtain feasible estimates is reduced.
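The competitive Langmuir isotherm whose parameters the ANN estimates is q_i = q_max,i · b_i · c_i / (1 + Σ_j b_j · c_j). A minimal sketch for a three-component mixture follows; the parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

def competitive_langmuir(c, q_max, b):
    """Competitive Langmuir loading for an n-component mixture.

    q_i = q_max_i * b_i * c_i / (1 + sum_j b_j * c_j)

    c      : liquid-phase concentrations of each component
    q_max  : saturation capacities
    b      : equilibrium (affinity) constants
    Returns the stationary-phase loading per component.
    """
    c, q_max, b = map(np.asarray, (c, q_max, b))
    denom = 1.0 + np.sum(b * c)   # shared denominator couples the components
    return q_max * b * c / denom

# Illustrative three-component mixture (made-up parameter values):
q = competitive_langmuir(c=[1.0, 2.0, 0.5],
                         q_max=[10.0, 8.0, 12.0],
                         b=[0.3, 0.1, 0.5])
```

A full workflow in the spirit of the paper would simulate chromatograms from many such parameter sets and train the network to invert the map from chromatogram to parameters.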

Entropy, 2020, Vol 23 (1), pp. 18
Author(s): Pantelis Linardatos, Vasilis Papastefanopoulos, Sotiris Kotsiantis

Recent advances in artificial intelligence (AI) have led to its widespread industrial adoption, with machine learning systems demonstrating superhuman performance in a significant number of tasks. However, this surge in performance has often been achieved through increased model complexity, turning such systems into “black box” approaches and causing uncertainty regarding the way they operate and, ultimately, the way they come to decisions. This ambiguity has made it problematic for machine learning systems to be adopted in sensitive yet critical domains where their value could be immense, such as healthcare. As a result, scientific interest in Explainable Artificial Intelligence (XAI), a field concerned with the development of new methods that explain and interpret machine learning models, has been tremendously reignited over recent years. This study focuses on machine learning interpretability methods; more specifically, a literature review and taxonomy of these methods are presented, along with links to their programming implementations, in the hope that this survey will serve as a reference point for both theorists and practitioners.
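One concrete model-agnostic interpretability method of the kind such taxonomies cover is permutation feature importance: the error increase observed when a single feature column is shuffled. A minimal sketch with a toy model; all names and data here are illustrative:

```python
import numpy as np

def permutation_importance(model, X, y, n_repeats=10, rng=None):
    """Model-agnostic importance: mean MSE increase when one column is shuffled."""
    rng = np.random.default_rng(rng)
    mse = lambda pred: float(np.mean((pred - y) ** 2))
    base = mse(model(X))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])                 # break the feature/target link
            drops.append(mse(model(Xp)) - base)   # error increase = importance
        importances[j] = np.mean(drops)
    return importances

# Toy setup: y depends only on feature 0; the "model" reproduces that exactly.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0]
model = lambda X: 3.0 * X[:, 0]
imp = permutation_importance(model, X, y, rng=1)
```

Shuffling the informative feature inflates the error sharply, while shuffling the ignored feature leaves it unchanged, which is exactly the signal the method reads off.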


Geophysics, 1999, Vol 64 (6), pp. 1730-1734
Author(s): Beatriz Martín‐Atienza, Juan García‐Abdeslem

New methods for 2-D modeling of gravity anomaly data are developed following an approach that uses both analytic and numerical methods of integration. The forward‐model solution developed here is suitable for calculating the gravity effect caused by a 2-D source body bounded either laterally or vertically by continuous functions. In our models, the density contrast is defined by a second‐order polynomial function of depth and distance along the profile. We present several examples to show that our models are capable of accommodating a broad variety of geologic structures.
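As a rough numerical counterpart to such a forward model (a cell-summation sketch, not the authors' analytic solution), the vertical gravity of a 2-D body infinite along strike can be approximated by summing the line-element kernel 2GΔρ·z/(x² + z²) over cells; the density argument may be a constant or, as in the paper, a polynomial in depth and distance. The geometry and contrast below are made up for illustration:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity_2d(x_obs, body_x, body_z, density, n=200):
    """Vertical gravity (m/s^2) of a 2-D body, infinite along strike,
    by midpoint-rule summation over an n-by-n grid of cells.

    body_x=(x1, x2), body_z=(z1, z2): rectangular cross-section, z positive down.
    density: constant contrast (kg/m^3) or a function rho(x, z), e.g. a
    second-order polynomial in depth and distance along the profile.
    """
    dx = (body_x[1] - body_x[0]) / n
    dz = (body_z[1] - body_z[0]) / n
    xs = body_x[0] + (np.arange(n) + 0.5) * dx   # cell midpoints
    zs = body_z[0] + (np.arange(n) + 0.5) * dz
    X, Z = np.meshgrid(xs, zs)
    rho = density(X, Z) if callable(density) else density
    # 2-D line-element kernel: dg_z = 2*G*rho*z / ((x - x_obs)^2 + z^2) dA
    r2 = (X - x_obs) ** 2 + Z ** 2
    return float(np.sum(2.0 * G * rho * Z / r2) * dx * dz)

# Constant 300 kg/m^3 contrast; body from x = -500..500 m, z = 100..600 m:
g_center = gravity_2d(0.0, (-500.0, 500.0), (100.0, 600.0), 300.0)
g_far = gravity_2d(5000.0, (-500.0, 500.0), (100.0, 600.0), 300.0)
```

The anomaly peaks above the body and decays along the profile, as expected for a positive density contrast.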


Mathematics, 2021, Vol 9 (17), pp. 2048
Author(s): Ileana Ruxandra Badea, Carmen Elena Mocanu, Florin F. Nichita, Ovidiu Păsărescu

The purpose of this paper is to promote new methods in mathematical modeling inspired by neuroscience, namely consciousness and subconsciousness, with an eye toward artificial intelligence as part of the global brain. As mathematical models, we propose topoi and their non-standard enlargements, because their logic corresponds well to human thinking. For this reason, we build non-standard analysis in a special class of topoi; until now, it existed only in the topos of sets (A. Robinson). We then arrive at the pseudo-particles of the title and at a new axiomatics denoted Intuitionistic Internal Set Theory (IIST); a class of models for it is provided, namely non-standard enlargements of the previous topoi. We also consider the genetic–epigenetic interplay, with a mathematical introduction consisting of a study of the Yang–Baxter equations and new mathematical results.


2020, pp. 799-810
Author(s): Matthew Nagy, Nathan Radakovich, Aziz Nazha

The volume and complexity of scientific and clinical data in oncology have grown markedly over recent years, including but not limited to the realms of electronic health data, radiographic and histologic data, and genomics. This growth holds promise for a deeper understanding of malignancy and, accordingly, more personalized and effective oncologic care. Such goals require, however, the development of new methods to fully make use of the wealth of available data. Improvements in computer processing power and algorithm development have positioned machine learning, a branch of artificial intelligence, to play a prominent role in oncology research and practice. This review provides an overview of the basics of machine learning and highlights current progress and challenges in applying this technology to cancer diagnosis, prognosis, and treatment recommendations, including a discussion of current takeaways for clinicians.


Cyber security is a constantly evolving area of interest. Many solutions are currently available, and new methods and technologies keep emerging. Although some solutions already exist in extended reality, they tend to lack engagement and storytelling, which reduces the likelihood of disseminating and raising awareness of the risks involved in cybersecurity. This chapter gives an overview of an extended reality platform that can potentially be used to simulate security threats and that combines artificial intelligence and game design principles. The main goal of this research is to develop an extended reality solution that simulates a story involving virtual characters and objects for the entertainment industry, with possible applications in other sectors such as education and training. After an introduction to extended reality, the chapter focuses on an overview of the available extended reality technologies in the context of cybersecurity.


Data, 2021, Vol 6 (4), pp. 42
Author(s): Diana Kafkes, Jason St. John

The Booster Operation Optimization Sequential Time-series for Regression (BOOSTR) dataset was created to provide a cycle-by-cycle time series of readings and settings from instruments and controllable devices of the Booster, Fermilab’s Rapid-Cycling Synchrotron (RCS) operating at 15 Hz. BOOSTR provides a time series from 55 device readings and settings that pertain most directly to the high-precision regulation of the Booster’s gradient magnet power supply (GMPS). To our knowledge, this is one of the first well-documented datasets of accelerator device parameters made publicly available. We are releasing it in the hopes that it can be used to demonstrate aspects of artificial intelligence for advanced control systems, such as reinforcement learning and autonomous anomaly detection.
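One way such a cycle-by-cycle dataset might be framed for the regression task its name alludes to is a sliding window over a single device reading: predict the next cycle from the previous few. The signal below is a synthetic stand-in, not an actual BOOSTR device column:

```python
import numpy as np

def make_windows(series, width):
    """Frame a cycle-by-cycle time series as supervised regression pairs:
    each row of X holds `width` consecutive readings, y is the next reading."""
    series = np.asarray(series, dtype=float)
    n = len(series) - width
    X = np.stack([series[i:i + width] for i in range(n)])
    y = series[width:]
    return X, y

# Synthetic stand-in for a 15 Hz cycle-by-cycle reading (one period per 15 cycles):
t = np.arange(300)
reading = np.sin(2 * np.pi * t / 15)
X, y = make_windows(reading, width=10)
```

Any regressor, from a linear model to the reinforcement-learning agents the dataset is meant to enable, can then be trained on the (X, y) pairs.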


2007, Vol 539-543, pp. 3130-3135
Author(s): R.H. Wu, K.C. Pang

The deformation behavior of a titanium alloy and a superalloy during isothermal/hot-die forging is analyzed, and suitable finite element models with appropriate parameter values are established. On the DEFORM software platform, the forging processes of a vane-integrated disk (titanium alloy) and a compressor disk (superalloy) are simulated and analyzed. Based on the simulation results, important suggestions for process design and parameter determination are put forward, which have been considered or adopted in practice. As a result, production yield is improved, and considerable expense for testing and trial die manufacture is saved.


2019, Vol 5 (11), pp. 176-196
Author(s): N. Romanchuk, P. Romanchuk

Physician and neurophysiologist: a modern solution to the rehabilitation of the ‘cognitive brain’ of Homo sapiens draws, on the one hand, on the tools and technologies of artificial intelligence and, on the other, on multidisciplinary collaboration with a clinical neurophysiologist, a ‘universal’ specialist in neurology, psychiatry, psychotherapy, psychoanalysis, and geriatrics. Modern artificial intelligence technologies are capable of much, including predicting Alzheimer’s disease with the help of combined and hybrid neuroimaging, next-generation sequencing, etc., so that timely and effective rehabilitation of the H. sapiens brain can begin. The H. sapiens brain is the next frontier for health care. By fusing combined and hybrid neuroimaging techniques with artificial intelligence technologies, it will be possible to understand and diagnose neurological disorders and to find new methods of rehabilitation and medical and social support that lead to improved mental health. To restore circadian neuroplasticity of the brain, a multimodal scheme is proposed: circadian glasses, functional nutrition, and physical activity. A combined and hybrid cluster for the diagnosis, treatment, prevention, and rehabilitation of cognitive impairments and cognitive disorders has been developed and implemented.


2019, Vol 141, pp. 229-271
Author(s): Vincent Gerbaud, Ivonne Rodriguez-Donis, Laszlo Hegely, Peter Lang, Ferenc Denes, ...

2005, Vol 52 (10-11), pp. 461-468
Author(s): R.M. Jones, C.M. Bye, P.L. Dold

Nitrification kinetics are important for process design, optimization, and capacity rating of activated sludge wastewater treatment plants. A Water Environment Research Foundation (WERF) project on Methods for Wastewater Characterization in Activated Sludge Modeling (WERF, 2003) focused significantly on the development of procedures for measuring the nitrifier maximum specific growth rate, μAUT. In addition, the importance of (and lack of data for) the nitrifier decay rate, bAUT, was identified. This paper describes three bench-scale methods for measuring μAUT: the Low F/M SBR, Washout and High F/M methods. During the WERF project, the importance of pH and temperature control was investigated briefly; this paper summarizes further experimental work performed to address these issues. A summary of μAUT measurements in a number of locations and using the different measurement techniques is provided.
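To first order, the nitrification-rate response in a High F/M (unrestricted growth) test is exponential in time, so μAUT can be recovered as the slope of ln(rate) versus time; the decay rate bAUT must still be accounted for separately. A sketch on noise-free synthetic data; the parameter values are illustrative, not measurements from the paper:

```python
import numpy as np

def fit_exponential_rate(t, rate):
    """Least-squares slope of ln(rate) vs t: the apparent exponential rate
    constant (1/d). In a high F/M test this slope approximates mu_AUT,
    net of decay (b_AUT), which must be added back for a true maximum rate."""
    slope, _intercept = np.polyfit(t, np.log(rate), 1)
    return slope

# Synthetic nitrate-production-rate data with mu_AUT = 0.9 1/d:
t = np.linspace(0.0, 5.0, 20)        # days
rate = 2.0 * np.exp(0.9 * t)         # mgN/L/d, noise-free for clarity
mu_apparent = fit_exponential_rate(t, rate)
```

With real reactor data the same fit would be applied to a window of the rate record where growth is genuinely unrestricted, and pH and temperature would be held at the controlled values the paper discusses.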

