An Analysis Technique of Evacuation Simulation Using an Array DBMS

2018 · Vol 13 (2) · pp. 338-346
Author(s): Yusuke Kawai, Jing Zhao, Kento Sugiura, Yoshiharu Ishikawa, Yukiko Wakita, et al.

Today, large-scale simulations are thriving because of increases in computing performance and storage capacity. Understanding the results of such simulations is not easy, so support for interactive and exploratory analysis is becoming more important. This study focuses on spatio-temporal simulations and develops an analysis technology to support them, using a database system to enable interactive analysis of large-scale data. Since data obtained from spatio-temporal simulations is not well suited to management in a relational DBMS (RDBMS), this study uses an array DBMS, a type of DBMS that has garnered increasing attention in recent years. An array DBMS is designed to manage large-scale array data: it provides a logical model for arrays while supporting efficient query processing. SciDB is used as the specific array DBMS in this paper. This study targets disaster evacuation simulation data and demonstrates experimentally that the query-processing functions offered by an array DBMS provide effective analysis support.
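As a hedged illustration (not taken from the paper), the sketch below shows the kind of slice-and-aggregate query an array DBMS such as SciDB supports, submitted through the SciDB-Py client. The array name, its schema, and the shim URL are assumptions made for the example.

```python
# A minimal sketch, assuming a SciDB instance whose shim is reachable at
# localhost:8080 and a hypothetical array
#   evac <density:double> [t, x, y]
# holding pedestrian density per simulation time step and grid cell.
from scidbpy import connect

db = connect('http://localhost:8080')  # assumed shim endpoint

# Average density over a spatial window for the first 51 time steps,
# grouped by time step: between() trims the array by dimension bounds
# (lower bounds for t, x, y, then upper bounds), aggregate() reduces it.
afl = "aggregate(between(evac, 0, 50, 50, 50, 200, 200), avg(density), t)"
result = db.iquery(afl, fetch=True)  # fetch the result back to the client
print(result)
```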

2017 · Vol 4 (9) · pp. 170413
Author(s): Xiao Zhou, Desislava Hristova, Anastasios Noulas, Cecilia Mascolo, Max Sklar

Being able to assess the impact of government-led investment on socio-economic indicators in cities has long been an important target of urban planning. However, owing to the lack of large-scale data with fine spatio-temporal resolution, planners have been limited in how they can track the impact and measure the effectiveness of cultural investment in small urban areas. Taking advantage of nearly 4 million transition records collected over 3 years in London from a popular location-based social network service, Foursquare, we study how the socio-economic impact of government cultural expenditure can be detected and predicted. Our analysis shows that network indicators such as the average clustering coefficient or centrality can be exploited to estimate the likelihood of local growth in response to cultural investment. We subsequently integrate these features in supervised learning models to infer socio-economic deprivation changes for London's neighbourhoods. This research shows how geosocial and mobile services can serve as a proxy to track and predict socio-economic deprivation changes as government funding is invested in developing urban areas, and thus provides evidence and suggestions for further policymaking and investment optimization.
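As a hedged sketch of this kind of pipeline (our construction, not the authors' code), the following derives the network indicators named above with networkx and feeds them to a supervised model from scikit-learn; the transition data and labels are synthetic.

```python
# A minimal sketch, assuming per-neighbourhood venue-transition networks;
# all data below is synthetic.
import networkx as nx
from sklearn.ensemble import RandomForestClassifier

def network_features(transitions):
    """Average clustering coefficient and mean degree centrality of the
    venue-transition network, the kind of indicators the study exploits."""
    G = nx.DiGraph()
    G.add_edges_from(transitions)
    centrality = nx.degree_centrality(G)
    return [nx.average_clustering(G.to_undirected()),
            sum(centrality.values()) / len(centrality)]

# Two hypothetical neighbourhoods: (transitions, deprivation improved?)
neighbourhoods = [
    ([("cafe", "gallery"), ("gallery", "theatre"), ("theatre", "cafe")], 1),
    ([("shop", "station"), ("station", "shop")], 0),
]

X = [network_features(t) for t, _ in neighbourhoods]
y = [label for _, label in neighbourhoods]
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict(X))
```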


Soft Matter · 2015 · Vol 11 (32) · pp. 6393-6402
Author(s): M. Gregory Forest, Qi Wang, Ruhai Zhou

The authors' large-scale simulations of the kinetic-hydrodynamic equations for active polar nematics revealed a variety of spatio-temporal attractors, including steady and unsteady, banded (1D) and cellular (2D) spatial patterns.


Author(s): M Asch, T Moore, R Badia, M Beck, P Beckman, et al.

Over the past four years, the Big Data and Exascale Computing (BDEC) project organized a series of five international workshops that aimed to explore the ways in which the new forms of data-centric discovery introduced by the ongoing revolution in high-end data analysis (HDA) might be integrated with the established, simulation-centric paradigm of the high-performance computing (HPC) community. Based on those meetings, we argue that the rapid proliferation of digital data generators, the unprecedented growth in the volume and diversity of the data they generate, and the intense evolution of the methods for analyzing and using that data are radically reshaping the landscape of scientific computing. The most critical problems involve the logistics of wide-area, multistage workflows that will move back and forth across the computing continuum, between the multitude of distributed sensors, instruments, and other devices at the network's edge and the centralized resources of commercial clouds and HPC centers. We suggest that the prospects for the future integration of technological infrastructures and research ecosystems need to be considered at three different levels. First, we discuss the convergence of research applications and workflows that establish a research paradigm combining both HPC and HDA, where ongoing progress is already motivating efforts at the other two levels. Second, we offer an account of some of the problems involved in creating a converged infrastructure for peripheral environments, that is, a shared infrastructure that can be deployed throughout the network in a scalable manner to meet the highly diverse requirements for the processing, communication, and buffering/storage of massive data workflows from many different scientific domains. Third, we focus on some opportunities for software-ecosystem convergence in big, logically centralized facilities that execute large-scale simulations and models and/or perform large-scale data analytics. We close by offering some conclusions and recommendations for future investment and policy review.


2019 · Vol 8 (9) · pp. 389
Author(s): Xinliang Liu, Yi Wang, Yong Li, Jinshui Wu

The integrated recognition of spatio-temporal characteristics of urbanization (e.g., speed, interaction with surrounding areas, and driving forces) facilitates regional comprehensive development. In this study, a large-scale data-driven approach was developed for exploring the township urbanization process. The approach integrated logistic models to quantify urbanization speed, partial triadic analysis to reveal dynamic relationships between rural population migration and urbanization, and random forest analysis to identify the response of urbanization to spatial driving forces. A typical subtropical town was chosen to verify the approach by quantifying the spatio-temporal process of township urbanization from 1933 to 2012. The results showed that (i) urbanization speed was well reflected by the time course of urban-core area fitted by a four-parameter logistic equation (R² = 0.95–1.00, p < 0.001), which also successfully identified the periods of relatively fast and steady development; (ii) the spatio-temporal sprawl of urban cores and their interactions with the surrounding rural residential areas were clearly revealed, indicating that the town experienced distinct historical aggregation and splitting trajectories; and (iii) the key drivers of the spatial sprawl of urban cores were identified (township merger, elevation, distance to roads, and population migration). Our findings demonstrate that such a comprehensive approach is powerful for quantifying the spatio-temporal characteristics of the urbanization process at the township level and emphasize the importance of applying long-term historical data when researching the urbanization process.
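As a hedged illustration of the logistic component (synthetic data, not the paper's measurements), a four-parameter logistic curve can be fitted to urban-core area over time with scipy:

```python
# A minimal sketch of a four-parameter logistic fit of urban-core area
# against time; the data and parameter names are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def logistic4(t, lower, upper, midpoint, rate):
    """Area grows from `lower` to `upper`, steepest around `midpoint`."""
    return lower + (upper - lower) / (1.0 + np.exp(-rate * (t - midpoint)))

years = np.arange(1933, 2013)
# Synthetic 'urban core area' (km^2) with noise, standing in for areas
# digitized from historical maps.
rng = np.random.default_rng(0)
area = logistic4(years, 0.5, 12.0, 1985.0, 0.15) + rng.normal(0, 0.2, years.size)

params, _ = curve_fit(logistic4, years, area, p0=[0.5, 10.0, 1980.0, 0.1])
print(dict(zip(["lower", "upper", "midpoint", "rate"], params)))
```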


2016 · Vol 13 (117) · pp. 20160112
Author(s): Patrick Smadbeck, Michael P. H. Stumpf

Development is a process that needs to be tightly coordinated in both space and time. Cell tracking and lineage tracing have become important experimental techniques in developmental biology and allow us to map the fate of cells and their progeny. A generic feature of developing and homeostatic tissues that these analyses have revealed is that relatively few cells give rise to the bulk of the cells in a tissue; the lineages of most cells come to an end quickly. Computational and theoretical biologists/physicists have, in response, developed a range of modelling approaches, most notably agent-based modelling. These models seem to capture features observed in experiments, but can also become computationally expensive. Here, we develop complementary genealogical models of tissue development that trace the ancestry of cells in a tissue back to their most recent common ancestors. We show that, for both bounded and unbounded growth, simple but universal scaling relationships allow us to connect coalescent theory with the fractal growth models extensively used in developmental biology. Using our genealogical perspective, it is possible to study bulk statistical properties of the processes that give rise to tissues of cells, without the need for large-scale simulations.
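As a hedged sketch of the genealogical idea (our construction under simplified assumptions, not the authors' model), one can grow a cell population by random division, record parentage, and trace two sampled cells back to their most recent common ancestor:

```python
# A minimal sketch: unbounded growth by random division, then coalescence
# of two sampled lineages at their most recent common ancestor (MRCA).
import random

random.seed(1)
parent = {0: None}            # cell id -> parent id; cell 0 founds the tissue
alive = [0]
next_id = 1

for _ in range(200):          # 200 division events
    mother = random.choice(alive)
    for _ in range(2):        # mother is replaced by two daughters
        parent[next_id] = mother
        alive.append(next_id)
        next_id += 1
    alive.remove(mother)

def ancestors(cell):
    """Ancestor line of a cell, from the cell itself back to the founder."""
    line = []
    while cell is not None:
        line.append(cell)
        cell = parent[cell]
    return line

# The MRCA of two sampled cells is the first shared entry of their lines.
a, b = random.sample(alive, 2)
line_a = set(ancestors(a))
mrca = next(c for c in ancestors(b) if c in line_a)
print(f"cells {a} and {b} coalesce at ancestor {mrca}")
```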


Author(s): P. Baumann, V. Merticariu, A. Dumitru, D. Misev

With the unprecedented availability of continuously updated measured and generated data, there is immense potential for gaining new and timely insights; yet this value is not fully leveraged today. The quest is on for high-level service interfaces that let users dissect datasets and rejoin them with other datasets: ultimately, to allow users to ask "any question, anytime, on any size" and to "build their own product on the go".

With OGC Coverages, a concrete, interoperable data model has been established that unifies n-D spatio-temporal regular and irregular grids, point clouds, and meshes. The Web Coverage Service (WCS) suite provides versatile, streamlined coverage functionality ranging from simple access to flexible spatio-temporal analytics. The flexibility and scalability of the WCS suite have been demonstrated in practice through massive services run by large-scale data centers. We present the current status of the OGC Coverage data and service models, contrast them with related work, and describe a scalable implementation based on the rasdaman array engine.
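As a hedged example of the service interface (the endpoint, coverage id, and axis names are placeholders, not a real deployment), a WCS 2.0 GetCoverage request with spatio-temporal subsetting can be issued over plain HTTP:

```python
# A minimal sketch of a WCS 2.0 GetCoverage request against a
# rasdaman-style endpoint; URL and coverage id are hypothetical.
import requests

params = {
    "SERVICE": "WCS",
    "VERSION": "2.0.1",
    "REQUEST": "GetCoverage",
    "COVERAGEID": "MyCoverage",                  # placeholder coverage
    "SUBSET": [                                  # repeated KVP parameter
        'Lat(51.0,52.0)',                        # spatial trim
        'Long(8.0,9.0)',
        'ansi("2015-01-01","2015-12-31")',       # temporal trim
    ],
    "FORMAT": "image/tiff",
}
resp = requests.get("https://example.org/rasdaman/ows", params=params)
resp.raise_for_status()
with open("subset.tif", "wb") as f:
    f.write(resp.content)
```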


2019 · Vol 9 (1)
Author(s): John Carter, Gokul Pathikonda, Naibo Jiang, Josef J. Felver, Sukesh Roy, et al.

Abstract: Recent developments in burst-mode lasers and imaging systems have opened new realms of simultaneous diagnostics of velocity and density fields at rates of 1 kHz–1 MHz. These enable the exploration of previously unimaginable shock-driven turbulent flow fields that are of significant importance to problems in high-energy-density physics. The current work presents novel simultaneous measurements of velocity and scalar fields at 60 kHz to investigate the Richtmyer-Meshkov instability (RMI) in a spatio-temporal approach. The evolution of the scalar fields and the vorticity dynamics responsible for it are shown, including the interaction of the shock with the interface. This temporal information is used to validate two vorticity-deposition models commonly used to initialize large-scale simulations, which had previously been validated only via simulations or integral measures of circulation. Additionally, these measurements enable tracking of the evolution and mode merging of individual flow structures, which was previously not possible owing to inherently random variations in the interface at the smallest scales. A temporal evolution of symmetric vortex merging and the induced mixing prevalent in these problems is presented, with implications for vortex paradigms in accelerated inhomogeneous flows.
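For context only: the abstract does not name the two vorticity-deposition models, so as a hedged illustration the sketch below evaluates the classic Richtmyer impulsive model, a common baseline for initializing RMI growth in simulations; all numbers are invented.

```python
# Richtmyer's impulsive model for early RMI growth:
#   da/dt = k * a0 * A * du
# with perturbation wavenumber k, post-shock amplitude a0, post-shock
# Atwood number A, and interface velocity jump du. Values are made up.
import math

wavelength = 4.0e-3             # perturbation wavelength [m]
k = 2.0 * math.pi / wavelength  # wavenumber [1/m]
a0 = 0.2e-3                     # post-shock amplitude [m]
atwood = 0.6                    # post-shock Atwood number
delta_u = 100.0                 # interface velocity jump [m/s]

growth_rate = k * a0 * atwood * delta_u  # [m/s]
print(f"impulsive-model growth rate: {growth_rate:.2f} m/s")
```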


2015 · Vol 42 (4) · pp. 504-511
Author(s): HyunJo Lee, TaeHoon Kim, JaeWoo Chang


