Modeling and Algorithmic Approaches to Constitutively-Complex, Microstructured Fluids

2014 ◽  
Author(s):  
Gregory H. Miller ◽  
Gregory Forest


Author(s):  
Yeshayahu Talmon

To achieve complete microstructural characterization of self-aggregating systems, one needs direct images in addition to quantitative information from non-imaging techniques, e.g., scattering or rheological measurements. Cryo-TEM enables us to image fluid microstructures at better than one-nanometer resolution, with minimal specimen-preparation artifacts. Direct images are used to determine the “building blocks” of the fluid microstructure; these are used to build reliable physical models with which quantitative information from techniques such as small-angle x-ray or neutron scattering can be analyzed. To prepare vitrified specimens of microstructured fluids, we have developed the Controlled Environment Vitrification System (CEVS), which enables us to prepare samples under controlled temperature and humidity conditions, thus minimizing microstructural rearrangement due to volatile evaporation or temperature changes. The CEVS may be used to trigger on-the-grid processes that induce the formation of new phases, or to study intermediate, transient structures during a phase change (“time-resolved cryo-TEM”). Recently we have developed a new CEVS in which temperature and humidity are controlled by a continuous flow of mixed humidified and dry air streams.


Author(s):  
Jordan Musser ◽  
Ann S Almgren ◽  
William D Fullmer ◽  
Oscar Antepara ◽  
John B Bell ◽  
...  

MFIX-Exa is a computational fluid dynamics–discrete element model (CFD-DEM) code designed to run efficiently on current and next-generation supercomputing architectures. MFIX-Exa combines the CFD-DEM expertise embodied in the MFIX code—which was developed at NETL and is used widely in academia and industry—with the modern software framework, AMReX, developed at LBNL. The fundamental physics models follow those of the original MFIX, but the combination of new algorithmic approaches and a new software infrastructure will enable MFIX-Exa to leverage future exascale machines to optimize the modeling and design of multiphase chemical reactors.


2008 ◽  
Vol 36 (1) ◽  
pp. 467-468 ◽  
Author(s):  
Chad R. Meiners ◽  
Alex X. Liu ◽  
Eric Torng

2013 ◽  
Vol 23 (1) ◽  
pp. 3-17 ◽  
Author(s):  
Angelo Sifaleras

We present a wide range of problems concerning minimum cost network flows, and give an overview of the classic linear single-commodity Minimum Cost Network Flow Problem (MCNFP) and some other closely related problems, either tractable or intractable. We also discuss state-of-the-art algorithmic approaches and recent advances in the solution methods for the MCNFP. Finally, optimization software packages for the MCNFP are presented.
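As a concrete illustration of the classic single-commodity MCNFP the survey discusses, the sketch below implements the textbook successive-shortest-paths method on a toy four-node network. The graph, capacities, and costs are invented for illustration; they do not come from the survey, and production use would rely on the optimization packages the survey presents.

```python
# A minimal sketch of successive shortest paths for the Minimum Cost
# Network Flow Problem (MCNFP); toy data, illustrative only.

class MinCostFlow:
    def __init__(self, n):
        self.n = n
        self.g = [[] for _ in range(n)]  # adjacency lists of residual edges

    def add_edge(self, u, v, cap, cost):
        # forward edge and zero-capacity residual (reverse) edge
        self.g[u].append([v, cap, cost, len(self.g[v])])
        self.g[v].append([u, 0, -cost, len(self.g[u]) - 1])

    def min_cost_flow(self, s, t, flow):
        total_cost = 0
        while flow > 0:
            # Bellman-Ford by cost in the residual network
            INF = float("inf")
            dist = [INF] * self.n
            prev = [None] * self.n  # (node, edge index) on the cheapest path
            dist[s] = 0
            updated = True
            while updated:
                updated = False
                for u in range(self.n):
                    if dist[u] == INF:
                        continue
                    for i, (v, cap, cost, _) in enumerate(self.g[u]):
                        if cap > 0 and dist[u] + cost < dist[v]:
                            dist[v] = dist[u] + cost
                            prev[v] = (u, i)
                            updated = True
            if dist[t] == INF:
                raise ValueError("demand exceeds network capacity")
            # push as much flow as the bottleneck on that path allows
            push, v = flow, t
            while v != s:
                u, i = prev[v]
                push = min(push, self.g[u][i][1])
                v = u
            v = t
            while v != s:
                u, i = prev[v]
                self.g[u][i][1] -= push
                self.g[v][self.g[u][i][3]][1] += push
                v = u
            flow -= push
            total_cost += push * dist[t]
        return total_cost

# Ship 2 units from node 0 to node 3 at minimum cost.
net = MinCostFlow(4)
net.add_edge(0, 1, 2, 1)
net.add_edge(0, 2, 1, 2)
net.add_edge(1, 2, 1, 1)
net.add_edge(1, 3, 1, 3)
net.add_edge(2, 3, 2, 1)
cost = net.min_cost_flow(0, 3, 2)  # routes 0-1-2-3 and 0-2-3, each at cost 3
```

Each iteration augments along a cheapest residual path, so the flow found is cost-optimal for the requested demand.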


2019 ◽  
Vol 2 ◽  
pp. 1-8
Author(s):  
Stefan Fuest ◽  
Monika Sester

Abstract. Due to increasing traffic density in urban environments, both traffic management and society face various problems, including congestion, air pollution, and a higher probability of accidents. It is therefore increasingly important to make road users aware of efficient route alternatives so as to obtain a better distribution of the traffic flow. Since the time for making route decisions is usually limited, the information to be conveyed must be visualized in a clear and easily understandable format. We propose a framework for automatically visualizing route efficiency under various environmentally relevant scenarios. The methods we use to create the map visualizations are based on human perception of space, in order to communicate routes and traffic-related situations more intuitively. That is, humans are assumed to mentally abstract geographic space through various types of distortions rather than perceiving the environment in its actual shape. Based on these concepts, we argue that a perception-based representation of a route, together with the visual communication of temporary disturbances, may not only simplify the navigation process but also support awareness of current traffic dynamics, which may in turn shift route choice toward more altruistic behavior. We further present two algorithmic approaches for automatically abstracting the geometry of a route using cartographic generalization techniques, so as to present the road network as it might be perceived under a given traffic situation.
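The abstract does not spell out which generalization operators the two algorithms use, but a standard building block for abstracting route geometry is Douglas-Peucker line simplification. The sketch below is a generic illustration of that operator with an invented polyline and tolerance; it is not the authors' algorithm.

```python
# A minimal sketch of Douglas-Peucker line simplification, a common
# cartographic generalization step for route geometry; the sample
# polyline and tolerance are illustrative, not taken from the paper.
import math

def point_segment_distance(p, a, b):
    # distance from point p to the line segment a-b
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def douglas_peucker(points, tolerance):
    # keep the endpoints; recurse on the point farthest from the chord
    if len(points) < 3:
        return list(points)
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = point_segment_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right  # drop the duplicated split point

route = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
simplified = douglas_peucker(route, 1.0)
```

A larger tolerance yields a more strongly abstracted polyline, which matches the paper's idea of exaggerating or suppressing detail depending on the traffic situation.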


2022 ◽  
Vol 13 (1) ◽  
pp. 1-21
Author(s):  
Hui Luo ◽  
Zhifeng Bao ◽  
Gao Cong ◽  
J. Shane Culpepper ◽  
Nguyen Lu Dang Khoa

Traffic bottlenecks are road segments with an unacceptable level of traffic caused by a poor balance between road capacity and traffic volume. The huge volume of trajectory data that captures real-time traffic conditions in road networks provides promising new opportunities to identify such bottlenecks. In this paper, we define this problem as trajectory-driven traffic bottleneck identification: given a road network R and a trajectory database T, find a representative set of K seed edges (traffic bottlenecks) that influence the highest number of road segments not in the seed set. We show that this problem is NP-hard and propose a framework to find the traffic bottlenecks as follows. First, a traffic spread model is defined that represents changes in traffic volume for each road segment over time. Then, the traffic diffusion probability between two connected segments and the residual ratio of traffic volume for each segment are computed from historical trajectory data. We then propose two algorithmic approaches to solve the problem. The first is a best-first algorithm, BF, with an approximation ratio of 1 − 1/e. To further accelerate identification on larger datasets, we also propose a sampling-based greedy algorithm, SG. Finally, comprehensive experiments on three different datasets compare and contrast the various solutions and provide insights into important efficiency and effectiveness trade-offs among the respective methods.
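The 1 − 1/e guarantee comes from greedily maximizing a monotone submodular coverage objective. The sketch below shows that selection step in isolation: repeatedly pick the candidate seed edge that influences the most not-yet-covered segments. The influence sets here are a toy stand-in; in the paper they would come from the traffic spread model learned from trajectories.

```python
# A minimal sketch of greedy seed selection for maximum coverage,
# the core of a 1 - 1/e approximation; toy influence sets, not the
# paper's traffic spread model.

def greedy_seed_selection(influence, k):
    """influence: dict mapping candidate edge -> set of segments it influences."""
    seeds, covered = [], set()
    for _ in range(k):
        # pick the candidate with the largest marginal coverage gain
        best = max(
            (e for e in influence if e not in seeds),
            key=lambda e: len(influence[e] - covered),
        )
        if not influence[best] - covered:
            break  # no remaining candidate adds new coverage
        seeds.append(best)
        covered |= influence[best]
    return seeds, covered

# Toy road network: candidate edges A-D and the segments each influences.
influence = {
    "A": {1, 2, 3, 4},
    "B": {3, 4, 5},
    "C": {5, 6},
    "D": {1, 2},
}
seeds, covered = greedy_seed_selection(influence, 2)
```

Here the greedy picks A first (gain 4), then C (gain 2, since B's segments 3 and 4 are already covered), illustrating why marginal rather than absolute gain drives the selection.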


1997 ◽  
Vol 06 (02) ◽  
pp. 95-149 ◽  
Author(s):  
Parke Godfrey

When a query fails, it is more cooperative to identify the cause of failure rather than just report the empty answer set. When there is no cause per se for the query's failure, it is then worthwhile to report the part of the query that failed. The best way to do this is to identify a Minimal Failing Subquery (MFS) of the query. (An MFS is not necessarily unique; there may be many of them.) Likewise, identifying a Maximal Succeeding Subquery (XSS) can help a user recast a new query that leads to a non-empty answer set. Database systems do not provide these types of cooperative responses. This may be, in part, because algorithmic approaches to finding the MFSs and XSSs of a failing query are not obvious: the search space of subqueries is large. Despite past work on MFSs, the algorithmic complexity of these identification problems had remained uncharted. This paper shows the complexity profile of MFS and XSS identification. It is shown that there exists a simple algorithm for finding an MFS or an XSS by asking N successive queries, where N is the length of the query. Finding more MFSs (or XSSs) can be hard: it is shown that finding N MFSs (or XSSs) is NP-hard, while finding k, for fixed k, remains polynomial. An optimal algorithm for enumerating MFSs and XSSs, ISHMAEL, is developed and presented. The algorithm has ideal performance in enumeration, finding the first answers quickly and only decaying toward intractability in a predictable manner as further answers are found. The complexity results and algorithmic approaches given in this paper should allow the construction of cooperative facilities that identify MFSs and XSSs for database systems. These results are also relevant to a number of problems outside databases, and may find further application.
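The N-query scan for a single MFS can be sketched as follows: walk the query's N conjuncts once, discarding any conjunct whose removal leaves the subquery still failing; what survives is minimal. The `fails` predicate below is a toy stand-in for running a subquery against the database and checking for an empty answer set, and the example conjuncts are invented.

```python
# A minimal sketch of finding one Minimal Failing Subquery (MFS) with
# len(conjuncts) probes; `fails` is a toy stand-in for query execution.

def find_one_mfs(conjuncts, fails):
    """Return one MFS of a failing conjunctive query."""
    assert fails(conjuncts), "algorithm assumes the full query fails"
    mfs = list(conjuncts)
    for c in list(conjuncts):
        trial = [x for x in mfs if x != c]
        if fails(trial):  # c is not needed for failure; discard it
            mfs = trial
    return mfs

# Toy example: the query fails exactly when it contains both 'p' and 'r'.
def fails(query):
    return "p" in query and "r" in query

mfs = find_one_mfs(["p", "q", "r", "s"], fails)  # the surviving core is ["p", "r"]
```

Every conjunct kept is one whose removal makes the query succeed, so the result is failing and minimal; this matches the paper's point that one MFS is cheap while enumerating many is where the NP-hardness sets in.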


2018 ◽  
Vol 81 (4) ◽  
pp. 297-304 ◽  
Author(s):  
Chia-Hung Wu ◽  
Yukun Luo ◽  
Xiang Fei ◽  
Yi-Hong Chou ◽  
Hong-Jen Chiou ◽  
...  

Author(s):  
Patrick Chedmail ◽  
Christophe Le Roy

Abstract The validation of accessibility, maintainability, and mounting/dismantling simulation in a cluttered environment is a key problem in the design process of a mechanical system. On the one hand, research in path planning leads to automatic trajectory definition; such systems are efficient for simple problems. On the other hand, direct manipulation is possible with robotic CAD systems. Direct manipulation is also possible with common virtual reality tools, which allow the designer to be immersed in a complete mechanical environment; there, the designer can handle an object in order to check its accessibility. Thanks to a multi-agent architecture, we greatly improve the effectiveness of virtual reality tools by coupling algorithmic approaches with direct manipulation. This original method is a solution to a multi-criteria constrained optimisation problem. Theoretical and practical aspects are presented.

