Cultural investment and urban socio-economic development: a geosocial network approach

2017 ◽  
Vol 4 (9) ◽  
pp. 170413 ◽  
Author(s):  
Xiao Zhou ◽  
Desislava Hristova ◽  
Anastasios Noulas ◽  
Cecilia Mascolo ◽  
Max Sklar

Being able to assess the impact of government-led investment on socio-economic indicators in cities has long been an important goal of urban planning. However, owing to the lack of large-scale data with fine spatio-temporal resolution, planners have been limited in how they can track the impact and measure the effectiveness of cultural investment in small urban areas. Taking advantage of nearly 4 million transition records spanning 3 years in London from a popular location-based social network service, Foursquare, we study how the socio-economic impact of government cultural expenditure can be detected and predicted. Our analysis shows that network indicators such as the average clustering coefficient or centrality can be exploited to estimate the likelihood of local growth in response to cultural investment. We subsequently integrate these features into supervised learning models to infer socio-economic deprivation changes for London’s neighbourhoods. This research shows how geosocial and mobile services can be used as a proxy to track and predict changes in socio-economic deprivation as government funds are invested in developing urban areas, providing evidence and suggestions for further policymaking and investment optimization.
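As a minimal sketch of the kind of pipeline the abstract describes, the snippet below derives network indicators (average clustering coefficient, degree centrality, density) from a venue-transition network and feeds them into a supervised model; the file name, column names, and the choice of a random forest are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch: derive network indicators from a place-transition network
# and use them as features in a supervised model of deprivation change.
# File names, column names, and the model choice are illustrative assumptions.
import networkx as nx
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical edge list: transitions between venues within one neighbourhood
edges = pd.read_csv("neighbourhood_transitions.csv")  # columns: src, dst, weight
G = nx.from_pandas_edgelist(edges, "src", "dst", edge_attr="weight",
                            create_using=nx.DiGraph)

features = {
    "avg_clustering": nx.average_clustering(G.to_undirected()),
    "avg_degree_centrality": sum(nx.degree_centrality(G).values()) / G.number_of_nodes(),
    "density": nx.density(G),
}

# One feature row per neighbourhood would be assembled this way, then paired
# with the observed change in a deprivation index to train the model.
X = pd.DataFrame([features])
y = [0.0]  # placeholder target: change in deprivation index
model = RandomForestRegressor(n_estimators=100).fit(X, y)
```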

Author(s):  
Krzysztof Jurczuk ◽  
Marcin Czajkowski ◽  
Marek Kretowski

Abstract This paper concerns the evolutionary induction of decision trees (DTs) for large-scale data. Such a global approach is one of the alternatives to top-down inducers. It searches for the tree structure and the tests simultaneously, and in many situations this improves the prediction and reduces the size of the resulting classifiers. However, this population-based, iterative approach can be too computationally demanding to apply directly to big data mining. The paper demonstrates that this barrier can be overcome by smart distributed/parallel processing. Moreover, we ask whether the global approach can truly compete with greedy systems for large-scale data. For this purpose, we propose a novel multi-GPU approach. It combines knowledge of global DT induction and evolutionary algorithm parallelization with efficient utilization of GPU memory and computing resources. The search for the tree structure and the tests is performed on a CPU, while the fitness calculations are delegated to GPUs. A data-parallel decomposition strategy and the CUDA framework are applied. Experimental validation is performed on both artificial and real-life datasets. In both cases, the obtained acceleration is very satisfactory. The solution is able to process even billions of instances in a few hours on a single workstation equipped with 4 GPUs. The impact of data characteristics (size and dimension) on the convergence and speedup of the evolutionary search is also shown. When the number of GPUs grows, nearly linear scalability is observed, which suggests that the data size boundaries for evolutionary DT mining are fading.
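The following toy sketch illustrates the data-parallel decomposition idea behind the fitness evaluation: the dataset is split into shards, each GPU scores a candidate on its shard, and the CPU combines the partial results. It assumes CuPy and multiple CUDA devices are available, and a single (feature, threshold) split stands in for the authors' full CUDA tree-evaluation kernels.

```python
# Toy sketch of data-parallel fitness evaluation: shards are scored on GPUs,
# partial counts are reduced on the CPU. A single split stands in for a tree.
import numpy as np
import cupy as cp  # assumes CuPy and at least one CUDA device are available

def shard_accuracy(device_id, X_shard, y_shard, feature, threshold, left_cls, right_cls):
    with cp.cuda.Device(device_id):
        X = cp.asarray(X_shard)
        y = cp.asarray(y_shard)
        preds = cp.where(X[:, feature] <= threshold, left_cls, right_cls)
        return int((preds == y).sum())      # partial count of correct predictions

# CPU side: split the data, dispatch shards, reduce partial fitness values.
rng = np.random.default_rng(0)
X_all, y_all = rng.random((1_000_000, 8)), rng.integers(0, 2, 1_000_000)
n_gpus = 2  # e.g. a workstation with several GPUs
correct = sum(
    shard_accuracy(d, Xs, ys, feature=3, threshold=0.5, left_cls=0, right_cls=1)
    for d, (Xs, ys) in enumerate(zip(np.array_split(X_all, n_gpus),
                                     np.array_split(y_all, n_gpus)))
)
fitness = correct / len(y_all)  # accuracy term of the evolutionary fitness
```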


2021 ◽  
Author(s):  
Andreas Baas

Sand transport by wind over granular beds displays dynamic structure and organisation in the form of streamers (aka ‘sand snakes’) that appear, meander and intertwine, and then dissipate as they are advected downwind. These patterns of saltating grain populations are thought to be initiated and controlled by coherent flow structures in the turbulent boundary layer wind that scrape over the bed surface, raking sand up into entrainment. Streamer behaviour is thus fundamental to understanding sand transport dynamics, in particular its strong spatio-temporal variability, and is equally relevant to granular transport in other geophysical flows (fluvial, submarine).

This paper presents findings on streamer dynamics and associated wind turbulence observed in a field experiment on a beach, with measurements from 30 Hz video imagery using Large-Scale Particle Image Velocimetry (LS-PIV), combined with 50 Hz wind measurements from 3D sonic anemometry and co-located sand transport rate monitoring using an array of laser particle counters (‘Wenglors’), all taking place over an area of ~10 m² and over periods of several minutes. The video imagery was used to identify when and where streamers advected past the sonic anemometer and laser sensors, so that relationships could be detected between the passage of turbulence structures in the airflow and the length- and time-scales, propagation speeds, and sand transport intensities of the associated streamers. The findings form the basis for a phenomenological model of streamer dynamics under turbulent boundary-layer flows that predicts the impact of spatio-temporal variability on local measurement of sand transport.
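To make the kind of relationship analysis described above concrete, the hedged sketch below estimates the lag between a streamwise wind record and sand-transport counts by cross-correlation; the sampling rate and the synthetic series are stand-ins, not the experiment's data or analysis code.

```python
# Illustrative sketch (not the authors' analysis code): estimating the lag
# between streamwise wind speed and sand transport counts by cross-correlation,
# after resampling both series to a common rate. Data here are synthetic.
import numpy as np

fs = 25.0                                   # common sampling rate, Hz (assumption)
t = np.arange(0, 120, 1 / fs)               # two minutes of record
wind = 8 + np.random.randn(t.size)          # stand-in for sonic anemometer u (m/s)
transport = np.roll(wind, 15) + 0.5 * np.random.randn(t.size)  # lagged response

wind_z = (wind - wind.mean()) / wind.std()
trans_z = (transport - transport.mean()) / transport.std()
xcorr = np.correlate(trans_z, wind_z, mode="full") / t.size
lags = np.arange(-t.size + 1, t.size) / fs  # lag axis in seconds

best_lag = lags[np.argmax(xcorr)]
print(f"transport lags wind by ~{best_lag:.2f} s")
```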


Energies ◽  
2019 ◽  
Vol 12 (5) ◽  
pp. 810 ◽  
Author(s):  
Antonio Barragán-Escandón ◽  
Esteban Zalamea-León ◽  
Julio Terrados-Cepeda

Previous research has assessed the potential of solar energy against possible demand; however, the sustainability issues associated with large-scale photovoltaic deployment in urban areas have not been jointly established. In this paper, the impact of photovoltaic energy on the total urban energy mix is estimated using a series of indicators that consider the economic, environmental and social dimensions. These indicators have previously been applied at the country level; the main contribution of this research is applying them at the urban level, to the city of Cuenca, Ecuador. Cuenca is close to the equator and at a high altitude, enabling this area to reach the maximum self-supply index because of the high irradiation levels and reduced demand. The solar potential was estimated using a simple methodology that applies several indexes proven reliable in the local context, considering this particular sun path. The results demonstrate that the solar potential can meet the electric power demand of the city; of the indicators, only the one related to employment is substantially and positively affected. The indicators related to the price of energy, emissions and fossil fuel dependency do not change significantly unless a fuel-to-electricity conversion of the transport system takes place.
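As an illustration of how a self-supply indicator of this kind can be computed, the sketch below divides the potential rooftop PV generation by the annual electricity demand; every figure is a placeholder, not a value from the Cuenca study.

```python
# Illustrative back-of-envelope calculation of an urban PV self-supply index:
# potential rooftop PV generation divided by annual electricity demand.
# All figures are placeholders, not the paper's values for Cuenca.
usable_roof_area_m2 = 2_500_000        # assumed usable rooftop area
irradiation_kwh_m2_yr = 1_800          # high-irradiation, near-equatorial site
module_efficiency = 0.18
performance_ratio = 0.80               # losses from temperature, wiring, soiling

pv_generation_gwh = (usable_roof_area_m2 * irradiation_kwh_m2_yr
                     * module_efficiency * performance_ratio) / 1e6
annual_demand_gwh = 550                # assumed urban electricity demand

self_supply_index = pv_generation_gwh / annual_demand_gwh
print(f"PV potential: {pv_generation_gwh:.0f} GWh/yr, "
      f"self-supply index: {self_supply_index:.2f}")
```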


2010 ◽  
Vol 1 (3) ◽  
pp. 66-76
Author(s):  
Nenad Jukic ◽  
Miguel Velasco

Defining data warehouse requirements is widely recognized as one of the most important steps in the larger data warehouse system development process. This paper examines the potential risks and pitfalls within the data warehouse requirement collection and definition process. A real scenario of a large-scale data warehouse implementation is given, and details of this project, which ultimately failed due to an inadequate requirement collection and definition process, are described. The presented case underscores and illustrates the impact of the requirement collection and definition process on data warehouse implementation, and the case is analyzed within the context of existing approaches, methodologies, and best practices for preventing and avoiding typical data warehouse requirement errors and oversights.


Author(s):  
Anne H Klein ◽  
Kaylene R Ballard ◽  
Kenneth B Storey ◽  
Cherie A Motti ◽  
Min Zhao ◽  
...  

Abstract Gastropods are the largest and most diverse class of molluscs and include species that are well studied within the areas of taxonomy, aquaculture, biomineralization, ecology, microbiome and health. Gastropod research has been expanding since the mid-2000s, largely due to large-scale data integration from next-generation sequencing and mass spectrometry, through which transcripts, proteins and metabolites can be explored systematically. Correspondingly, these huge datasets have added a great deal of complexity to data organization, visualization and interpretation. Here, we review recent advances in gastropod omics (‘gastropodomics’) research from hundreds of publications and online genomics databases. By summarizing the currently available public data, we provide insights for the design of useful data integration tools and strategies for future comparative omics studies. Additionally, we discuss the future of omics applications in aquaculture, natural pharmaceutical biodiscovery and pest management, as well as in monitoring the impact of environmental stressors.


2018 ◽  
Vol 13 (2) ◽  
pp. 338-346
Author(s):  
Yusuke Kawai ◽  
Jing Zhao ◽  
Kento Sugiura ◽  
Yoshiharu Ishikawa ◽  
Yukiko Wakita ◽  
...  

Today, large-scale simulations are thriving because of increases in computing performance and storage capacity. Understanding the results of these simulations is not easy, and hence support for interactive and exploratory analysis is becoming more important. This study focuses on spatio-temporal simulations and attempts to develop an analysis technology to support them. It uses a database system to support interactive analysis of large-scale data. Since the data obtained from spatio-temporal simulations are not well suited to management in a relational DBMS (RDBMS), this study uses an array DBMS, a type of DBMS that has been garnering increased attention in recent years. An array DBMS is designed for the management of large-scale array data; it provides a logical model for array data while also supporting efficient query processing. SciDB is used as our specific array DBMS in this paper. This study targets disaster evacuation simulation data and demonstrates experimentally that the query-processing functions offered by an array DBMS provide effective analysis support.
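The array data model the abstract refers to can be illustrated conceptually as follows, using plain NumPy rather than SciDB's query language: simulation output is stored as a dense (time, x, y) array, and an analysis query slices a spatio-temporal window and aggregates over it. The grid size and queries are invented for illustration.

```python
# Conceptual illustration (plain NumPy, not SciDB syntax) of the array data
# model: evacuation-simulation output stored as a dense (time, x, y) array,
# queried by slicing a spatio-temporal window and aggregating over it.
import numpy as np

# Hypothetical simulation output: evacuee density on a 200 x 200 grid, 600 steps
density = np.random.default_rng(1).random((600, 200, 200))

# "Window" query: steps 100-199, sub-region x in [50, 100), y in [80, 120)
window = density[100:200, 50:100, 80:120]

# Aggregations an array DBMS would push down into the storage layer
per_step_total = window.sum(axis=(1, 2))      # evacuees in the region per step
peak_step = int(per_step_total.argmax()) + 100
print(f"peak load in window at simulation step {peak_step}")
```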


Author(s):  
Gordon Bell ◽  
David H Bailey ◽  
Jack Dongarra ◽  
Alan H Karp ◽  
Kevin Walsh

The Gordon Bell Prize is awarded each year by the Association for Computing Machinery to recognize outstanding achievement in high-performance computing (HPC). The purpose of the award is to track the progress of parallel computing with particular emphasis on rewarding innovation in applying HPC to applications in science, engineering, and large-scale data analytics. Prizes may be awarded for peak performance or special achievements in scalability and time-to-solution on important science and engineering problems. Financial support for the US$10,000 award is provided through an endowment by Gordon Bell, a pioneer in high-performance and parallel computing. This article examines the evolution of the Gordon Bell Prize and the impact it has had on the field.


2020 ◽  
Vol 20 (17) ◽  
pp. 10667-10686
Author(s):  
Martin O. P. Ramacher ◽  
Lin Tang ◽  
Jana Moldanová ◽  
Volker Matthias ◽  
Matthias Karl ◽  
...  

Abstract. Shipping is an important source of air pollutants, from the global to the local scale. Ships emit substantial amounts of sulfur dioxides, nitrogen dioxides, and particulate matter in the vicinity of coasts, threatening the health of the coastal population, especially in harbour cities. Reductions in shipping emissions have been targeted by several regulations. Nevertheless, the effects of these regulations take hold with temporal delays, global ship traffic is expected to grow in the future, and other land-based anthropogenic emissions might decrease. Thus, it is necessary to investigate these combined developments to identify the impact of shipping activities on air quality, population exposure, and health effects in the future. We investigated the future effect of shipping emissions on air quality and related health effects in the Gothenburg urban area in 2040, considering different scenarios for the development of shipping under current regional trends of economic growth and regulations that have already been adopted. Additionally, we investigated the impact of a large-scale implementation of shore electricity in the Port of Gothenburg. For this purpose, we established a one-way nested chemistry transport modelling (CTM) system from the global to the urban scale to calculate pollutant concentrations, population-weighted concentrations, and health effects related to NO2, PM2.5, and O3. The simulated concentrations of NO2 and PM2.5 in the future scenarios for 2040 are in general very low, with up to 4 ppb for NO2 and up to 3.5 µg m−3 for PM2.5 in urban areas that are not close to the port. From 2012 the simulated overall exposure to PM2.5 decreases by approximately 30 % in the future scenarios; for NO2 the decrease is over 60 %. The simulated concentrations of O3 increase from 2012 to 2040 by about 20 %. In general, the contributions of local shipping emissions in 2040 are concentrated in the harbour area but to some extent also influence the rest of the city domain. The simulated implementation of onshore electricity for shipping in 2040 reduces NO2 in the port by up to 30 %, while increasing O3 by up to 3 %. Implementation of onshore electricity for ships at berth leads to additional local reduction potentials of up to 3 % for PM2.5 and 12 % for SO2 in the port area. All future scenarios show substantial decreases in population-weighted exposure and health-effect impacts.
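A population-weighted concentration of the kind used here for exposure estimates weights each grid cell's concentration by the population living in it; the short sketch below shows the calculation on synthetic stand-ins for CTM output and census data.

```python
# Sketch of a population-weighted concentration for exposure estimates:
# each grid cell's concentration is weighted by the population living there.
# Arrays are synthetic stand-ins for CTM output and census data.
import numpy as np

rng = np.random.default_rng(42)
conc_no2 = rng.uniform(0.5, 4.0, size=(100, 100))     # ppb, per grid cell
population = rng.integers(0, 500, size=(100, 100))    # inhabitants per cell

pop_weighted_conc = (conc_no2 * population).sum() / population.sum()
print(f"population-weighted NO2: {pop_weighted_conc:.2f} ppb")
```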


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Lingbo Li ◽  
Ying Fan ◽  
An Zeng ◽  
Zengru Di

The novel coronavirus (COVID-19) pandemic is intensifying all over the world, but some countries, including China, have developed extensive and successful experience in controlling it. In this context, some questions arise naturally: What can countries caught up in the epidemic learn from China’s experience? In regions where the outbreak is under control, what would lead to a resurgence of the epidemic? To address these issues, we investigate China’s experience with anticontagion interventions and the reopening process, focusing on the coevolution of the epidemic and awareness during the COVID-19 outbreak. Through an empirical analysis based on large-scale data and simulations based on a metapopulation, multilayer network model, we ascertain the impact of human movements and awareness diffusion on the epidemic, elucidate the inherent patterns and effective interventions of different epidemic prevention methods, and highlight the critical timing of each measure. The results are also employed to analyze the evolution of COVID-19 in other countries, so as to find unified rules across complex situations around the world and provide advice on anticontagion and reopening policies. Our findings explain some key mechanisms of epidemic prevention and may help epidemic analysis and decision-making in countries suffering from COVID-19.
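The sketch below is a toy two-patch SIR model with mobility coupling and an awareness feedback that damps transmission as local infections grow; it is a conceptual stand-in for, not a reproduction of, the paper's calibrated metapopulation and multilayer network model, and all parameters are invented.

```python
# Toy two-patch SIR sketch with mobility coupling and an awareness feedback
# that lowers the effective transmission rate as local prevalence rises.
import numpy as np

beta0, gamma, alpha = 0.35, 0.1, 50.0     # base transmission, recovery, awareness strength
mobility = np.array([[0.95, 0.05],        # row i: fraction of patch i's residents
                     [0.02, 0.98]])       # spending time in each patch per day
N = np.array([1_000_000.0, 500_000.0])
S, I, R = N - np.array([10.0, 0.0]), np.array([10.0, 0.0]), np.zeros(2)

for day in range(200):
    # Infectious prevalence experienced in each destination patch
    prev = (mobility.T @ I) / (mobility.T @ N)
    # Awareness diffusion damps transmission where prevalence is high
    beta_eff = beta0 / (1.0 + alpha * prev)
    force = mobility @ (beta_eff * prev)   # force of infection felt by residents
    new_inf = force * S
    new_rec = gamma * I
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec

print(f"final attack rates: {(R / N).round(3)}")
```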

