Analyzing the Impact of Traffic Congestion Mitigation: From an Explainable Neural Network Learning Framework to Marginal Effect Analyses

Sensors ◽  
2019 ◽  
Vol 19 (10) ◽  
pp. 2254 ◽  
Author(s):  
Jianping Sun ◽  
Jifu Guo ◽  
Xin Wu ◽  
Qian Zhu ◽  
Danting Wu ◽  
...  

Computational graphs (CGs) have been widely utilized in numerical analysis and deep learning to represent directed forward networks of data flows between operations. This paper aims to develop an explainable learning framework that can fully integrate three major steps of decision support: synthesis of diverse traffic data, multilayered traffic demand estimation, and marginal effect analyses for transport policies. Following the big data-driven transportation computational graph (BTCG) framework, which is an emerging framework for explainable neural networks, we map different external traffic measurements collected from household survey data, mobile phone data, floating car data, and sensor networks to multilayered demand variables in a CG. Furthermore, we extend the CG-based framework by mapping different congestion mitigation strategies to CG layers individually or in combination, allowing the marginal effects and potential migration magnitudes of the strategies to be reliably quantified. Using the TensorFlow architecture, we evaluate our framework on the Sioux Falls network and present a large-scale case study based on a subnetwork of Beijing using a data set from the metropolitan planning organization.
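The central idea, treating a policy variable as a graph input and its marginal effect as a gradient of a congestion measure, can be sketched in miniature. The BPR link-cost function, all numbers, and the finite-difference gradient (standing in for TensorFlow autodiff) below are illustrative assumptions, not taken from the paper:

```python
# Illustrative sketch only: a congestion measure is a node in a
# computational graph, and a marginal effect is its gradient with
# respect to a policy variable (here, link capacity).

def bpr_travel_time(volume, capacity, free_flow_time=10.0):
    """Bureau of Public Roads link travel time (minutes); assumed form."""
    return free_flow_time * (1.0 + 0.15 * (volume / capacity) ** 4)

def marginal_effect(f, x, eps=1e-4):
    """Central-difference gradient, standing in for autodiff on the CG."""
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

# Marginal effect of adding capacity at a volume of 1800 veh/h:
effect = marginal_effect(lambda c: bpr_travel_time(1800.0, c), 1500.0)
print(effect)  # negative: added capacity reduces travel time
```

In the actual framework this gradient would be produced by backpropagation through the full multilayered demand graph rather than by finite differences.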

2015 ◽  
Vol 8 (1) ◽  
pp. 421-434 ◽  
Author(s):  
M. P. Jensen ◽  
T. Toto ◽  
D. Troyan ◽  
P. E. Ciesielski ◽  
D. Holdridge ◽  
...  

Abstract. The Midlatitude Continental Convective Clouds Experiment (MC3E) took place during the spring of 2011 centered in north-central Oklahoma, USA. The main goal of this field campaign was to capture the dynamical and microphysical characteristics of precipitating convective systems in the US Central Plains. A major component of the campaign was a six-site radiosonde array designed to capture the large-scale variability of the atmospheric state with the intent of deriving model forcing data sets. Over the course of the 46-day MC3E campaign, a total of 1362 radiosondes were launched from the enhanced sonde network. This manuscript provides details on the instrumentation used as part of the sounding array, the data processing activities, including quality checks and humidity bias corrections, and an analysis of the impacts of bias correction and algorithm assumptions on the determination of convective levels and indices. It is found that corrections for known radiosonde humidity biases and assumptions regarding the characteristics of the surface convective parcel result in significant differences in the derived values of convective levels and indices in many soundings. In addition, the impact of including the humidity corrections and quality controls on the thermodynamic profiles that are used in the derivation of a large-scale model forcing data set is investigated. The results show a significant impact on the derived large-scale vertical velocity field, illustrating the importance of addressing these humidity biases.
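Why a humidity bias propagates into convective levels can be seen with Espy's well-known approximation, which places the lifted condensation level (LCL) near 125 m per degree of dewpoint depression. This is a textbook formula used here for illustration, not the campaign's processing code, and the parcel values and 2 °C bias are assumed:

```python
# A dry-biased humidity sensor lowers the reported dewpoint, which
# directly raises the derived LCL of the surface parcel.

def lcl_height_m(temp_c, dewpoint_c):
    """Approximate LCL height above ground (m), Espy's formula."""
    return 125.0 * (temp_c - dewpoint_c)

surface_t, surface_td = 30.0, 20.0   # assumed surface parcel, deg C
biased_td = surface_td - 2.0         # hypothetical 2 C dry bias

print(lcl_height_m(surface_t, surface_td))  # 1250.0 m
print(lcl_height_m(surface_t, biased_td))   # 1500.0 m: bias lifts the LCL
```

A 250 m shift of this magnitude in the parcel's condensation level is the kind of sensitivity the manuscript quantifies for the full set of convective indices.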


2017 ◽  
Vol 10 (5) ◽  
pp. 2031-2055 ◽  
Author(s):  
Thomas Schwitalla ◽  
Hans-Stefan Bauer ◽  
Volker Wulfmeyer ◽  
Kirsten Warrach-Sagi

Abstract. Increasing computational resources and the demands of impact modelers, stakeholders, and society envision seasonal and climate simulations at convection-permitting resolution. So far, such a resolution is only achieved with a limited-area model, whose results are impacted by the zonal and meridional boundaries. Here, we present the setup of a latitude-belt domain that reduces disturbances originating from the western and eastern boundaries and therefore allows for studying the impact of model resolution and physical parameterization. The Weather Research and Forecasting (WRF) model coupled to the NOAH land–surface model was operated during July and August 2013 at two different horizontal resolutions, namely 0.03° (HIRES) and 0.12° (LOWRES). Both simulations were forced by the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis data at the northern and southern domain boundaries, and by the high-resolution Operational Sea Surface Temperature and Sea Ice Analysis (OSTIA) data at the sea surface. The simulations are compared to the operational ECMWF analysis for the representation of large-scale features. To analyze the simulated precipitation, the operational ECMWF forecast, the CPC MORPHing technique (CMORPH), and the ENSEMBLES gridded observation precipitation data set (E-OBS) were used as references. Analyzing pressure, geopotential height, wind, and temperature fields as well as precipitation revealed (1) a benefit from the higher resolution in the form of reduced monthly biases and root mean square errors and an improved Pearson skill score, and (2) deficiencies in the physical parameterizations leading to notable biases in distinct regions, such as the polar Atlantic for the LOWRES simulation and the North Pacific and Inner Mongolia for both resolutions. In summary, the application of a latitude belt at a convection-permitting resolution shows promising results that are beneficial for future seasonal forecasting.
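The verification metrics named in the abstract (monthly bias, root mean square error, Pearson correlation) have standard definitions, sketched below. The sample values are invented, not WRF or ECMWF output:

```python
# Standard model-versus-reference verification metrics.
import math

def bias(sim, obs):
    """Mean difference simulation minus reference."""
    return sum(s - o for s, o in zip(sim, obs)) / len(obs)

def rmse(sim, obs):
    """Root mean square error."""
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))

def pearson(sim, obs):
    """Pearson correlation coefficient."""
    n = len(obs)
    ms, mo = sum(sim) / n, sum(obs) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(sim, obs))
    vs = math.sqrt(sum((s - ms) ** 2 for s in sim))
    vo = math.sqrt(sum((o - mo) ** 2 for o in obs))
    return cov / (vs * vo)

sim = [2.0, 3.5, 1.0, 4.0]   # invented daily precipitation, mm
obs = [1.8, 3.0, 1.2, 4.4]
print(bias(sim, obs), rmse(sim, obs), pearson(sim, obs))
```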


2009 ◽  
Vol 2 (1) ◽  
pp. 87-98 ◽  
Author(s):  
C. Lerot ◽  
M. Van Roozendael ◽  
J. van Geffen ◽  
J. van Gent ◽  
C. Fayt ◽  
...  

Abstract. Total O3 columns have been retrieved from six years of SCIAMACHY nadir UV radiance measurements using SDOAS, an adaptation of the GDOAS algorithm previously developed at BIRA-IASB for the GOME instrument. GDOAS and SDOAS have been implemented by the German Aerospace Center (DLR) in version 4 of the GOME Data Processor (GDP) and in version 3 of the SCIAMACHY Ground Processor (SGP), respectively. The processors are being run at the DLR processing centre on behalf of the European Space Agency (ESA). We first focus on the description of the SDOAS algorithm with particular attention to the impact of uncertainties on the reference O3 absorption cross-sections. Second, the resulting SCIAMACHY total ozone data set is globally evaluated through large-scale comparisons with results from GOME and OMI as well as with ground-based correlative measurements. The various total ozone data sets are found to agree within 2% on average. However, a negative trend of 0.2–0.4%/year has been identified in the SCIAMACHY O3 columns; this probably originates from instrumental degradation effects that have not yet been fully characterized.


2014 ◽  
Vol 7 (4) ◽  
pp. 5087-5139 ◽  
Author(s):  
R. Pommrich ◽  
R. Müller ◽  
J.-U. Grooß ◽  
P. Konopka ◽  
F. Ploeger ◽  
...  

Abstract. Variations in the mixing ratio of trace gases of tropospheric origin entering the stratosphere in the tropics are of interest for assessing both troposphere-to-stratosphere transport fluxes in the tropics and the impact of these transport fluxes on the composition of the tropical lower stratosphere. Anomaly patterns of carbon monoxide (CO) and long-lived tracers in the lower tropical stratosphere allow conclusions about the rate and the variability of tropical upwelling to be drawn. Here, we present a simplified chemistry scheme for the Chemical Lagrangian Model of the Stratosphere (CLaMS) for the simulation, at comparatively low numerical cost, of CO, ozone, and long-lived trace substances (CH4, N2O, CCl3F (CFC-11), CCl2F2 (CFC-12), and CO2) in the lower tropical stratosphere. For the long-lived trace substances, the boundary conditions are prescribed in the lowest model level based on ground-based measurements. The boundary condition for CO in the free troposphere is deduced from MOPITT measurements (at ≈ 700–200 hPa). Because this model version lacks a specific representation of mixing and convective uplift in the troposphere, enhanced CO values, in particular those resulting from convective outflow, are underestimated. However, in the tropical tropopause layer and the lower tropical stratosphere, simulated CO agrees relatively well with in situ measurements (with the exception of the TROCCINOX campaign, where CO in the simulation is biased low by ≈ 10–20 ppbv). Further, the model results are of sufficient quality to describe large-scale anomaly patterns of CO in the lower stratosphere. In particular, the zonally averaged tropical CO anomaly patterns (the so-called "tape recorder" patterns) simulated by this model version of CLaMS are in good agreement with observations.
The simulations show upwelling that is too rapid compared to observations, a consequence of overestimated vertical velocities in the ERA-Interim reanalysis data set. Moreover, the simulated tropical anomaly patterns of N2O are in good agreement with observations. In the simulations, anomaly patterns for CH4 and CFC-11 were found to be consistent with those of N2O; for all long-lived tracers, positive anomalies are simulated because of the enhanced tropical upwelling in the easterly phase of the quasi-biennial oscillation.
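The anomaly construction behind "tape recorder" diagrams is simple in principle: at each altitude level, subtract that level's time mean so only departures remain. The tiny level-by-time array below is invented for illustration:

```python
# Deseasonalized-style anomalies: subtract each level's time mean.

def anomalies(field):
    """field[level][time] -> same-shape anomalies about each level's mean."""
    out = []
    for level in field:
        mean = sum(level) / len(level)
        out.append([v - mean for v in level])
    return out

co_ppbv = [
    [60.0, 70.0, 50.0, 60.0],   # lower level: larger swings
    [55.0, 60.0, 50.0, 55.0],   # upper level: damped signal
]
print(anomalies(co_ppbv))
```

In a real tape-recorder plot, the upward phase lag of these anomalies with altitude is what yields the upwelling rate compared against observations.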


2018 ◽  
Vol 11 (3) ◽  
pp. 57
Author(s):  
Xiao-Yan Cao ◽  
Bing-Qian Liu ◽  
Bao-Ru Pan ◽  
Yuan-Biao Zhang

With the accelerating development of urbanization in China, increasing traffic demand and large-scale gated communities have aggravated urban traffic congestion. This paper studies the impact of opening gated communities on road network structure and on the capacity of surrounding roads. Firstly, we select four indicators, namely average speed, vehicle flow, average delay time, and queue length, to measure traffic capacity. Secondly, we establish the Wiedemann car-following model and then use the VISSIM software to simulate the traffic conditions on the roads surrounding communities. Finally, we take Shenzhen as an example to simulate and compare four kinds of gated communities, including axis, centripetal, and intensive layouts, and we also analyze the feasibility of opening communities.
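One of the four indicators, average delay time, is conventionally computed as actual travel time minus free-flow travel time, averaged over vehicles. The records below are invented; actual VISSIM output formats differ:

```python
# Average delay per vehicle on a link: travel time above free-flow time.

def average_delay(travel_times_s, free_flow_s):
    """Mean per-vehicle delay in seconds, floored at zero."""
    delays = [max(t - free_flow_s, 0.0) for t in travel_times_s]
    return sum(delays) / len(delays)

travel_times = [95.0, 120.0, 80.0, 150.0]   # invented, seconds per vehicle
print(average_delay(travel_times, 80.0))    # 31.25 s mean delay
```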


2018 ◽  
Vol 40 ◽  
pp. 05010
Author(s):  
Brian Perry ◽  
Colin Rennie ◽  
Andrew Cornett ◽  
Paul Knox

Due to excessive rainfall in June of 2013, several rivers located in and near the City of Calgary, Canada experienced significant flooding. These events caused severe damage to infrastructure throughout the city, precipitating a renewed interest in flood control and mitigation strategies for the area. A major potential strategy involves partial diversion of Elbow River flood water to the proposed Springbank Off-Stream Storage Reservoir. A large-scale physical model study was conducted to optimize and validate the design of a portion of the new project. The goals of the physical model were to investigate diversion system behaviors such as flow rates, water levels, sediment transport, and debris accumulation, and to optimize the design of new flow control structures to be constructed on the Elbow River. In order to accurately represent the behavior of debris within the system during flooding, large woody debris created from natural sources was utilized in the physical model, and its performance was compared to that of debris of the same size fabricated from pressed cylindrical wood dowels. In addition to comparing the performance of these two debris types, the impact of root wads on debris damming was also investigated. Significant differences in damming behavior were shown to exist between the natural debris and the fabricated debris, while the presence of root wads affected dam structure and formation. The results of this experiment indicate that natural debris is preferred for studies involving debris accumulation.
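River physical models of this kind are normally scaled by Froude similitude, under which velocities scale with the square root of the length scale and discharges with its 5/2 power. The 1:40 length scale and flume measurement below are assumptions for illustration, not the scale of the Springbank study:

```python
# Froude similitude: convert a model flume discharge to prototype scale.
import math

LENGTH_SCALE = 40.0                         # prototype : model, assumed

velocity_scale = math.sqrt(LENGTH_SCALE)    # L^(1/2)
discharge_scale = LENGTH_SCALE ** 2.5       # L^(5/2)

model_flow_m3s = 0.05                       # hypothetical flume measurement
prototype_flow_m3s = model_flow_m3s * discharge_scale
print(prototype_flow_m3s)                   # ~506 m^3/s at full scale
```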


2021 ◽  
Author(s):  
Bengt Ljungquist ◽  
Masood A Akram ◽  
Giorgio A Ascoli

Most functions of the nervous system depend on neuronal and glial morphology. Continuous advances in microscopic imaging and tracing software have provided an increasingly abundant supply of 3D reconstructions of arborizing dendrites, axons, and processes, allowing their detailed study. However, efficient, large-scale methods to rank neural morphologies by similarity to an archetype are still lacking. Using the NeuroMorpho.Org database, we present similarity search software enabling fast morphological comparison of hundreds of thousands of neural reconstructions from any species, brain region, cell type, and preparation protocol. We compared the performance of different morphological measurements: 1) summary morphometrics calculated by L-Measure; 2) persistence vectors, a vectorized descriptor of branching structure; and 3) the combination of the two. In all cases, we also investigated the impact of applying dimensionality reduction using principal component analysis (PCA). We assessed qualitative performance by gauging the ability to rank neurons in order of visual similarity. Moreover, we quantified information content by examining explained variance and benchmarked the ability to identify occasional duplicate reconstructions of the same specimen. The results indicate that combining summary morphometrics and persistence vectors with applied PCA provides an information-rich characterization that enables efficient and precise comparison of neural morphology. The execution time scaled linearly with data set size, allowing seamless live searching through the entire NeuroMorpho.Org content in fractions of a second. We have deployed the similarity search function as an open-source online software tool, both through a user-friendly graphical interface and as an API for programmatic access.
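The core ranking step can be sketched as standardizing feature vectors (summary morphometrics, persistence vectors, or both concatenated) and sorting database entries by Euclidean distance to a query. The feature values are invented, and the PCA stage the paper describes is omitted here for brevity:

```python
# Nearest-neighbor similarity ranking on standardized feature vectors.
import math

def standardize(rows):
    """Z-score each column; a zero-variance column is left unscaled."""
    cols = list(zip(*rows))
    means = [sum(c) / len(c) for c in cols]
    sds = [math.sqrt(sum((v - m) ** 2 for v in c) / len(c)) or 1.0
           for c, m in zip(cols, means)]
    return [[(v - m) / s for v, m, s in zip(r, means, sds)] for r in rows]

def rank_by_similarity(query, database):
    """Indices of database entries, nearest to the query first."""
    feats = standardize([query] + database)
    q, rest = feats[0], feats[1:]
    dists = [math.sqrt(sum((a - b) ** 2 for a, b in zip(q, r)))
             for r in rest]
    return sorted(range(len(database)), key=lambda i: dists[i])

query = [120.0, 14.0, 3.2]   # invented: e.g. total length, branches, depth
database = [[500.0, 40.0, 7.0], [118.0, 15.0, 3.0], [300.0, 25.0, 5.0]]
print(rank_by_similarity(query, database))  # nearest entry first
```

Linear scaling with database size follows because each query is a single pass over the stored vectors.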


Trials ◽  
2019 ◽  
Vol 20 (1) ◽  
Author(s):  
Jessica E. Lockery ◽  
Taya A. Collyer ◽  
Christopher M. Reid ◽  
Michael E. Ernst ◽  
...  

Abstract Background Large-scale studies risk generating inaccurate and missing data due to the complexity of data collection. Technology has the potential to improve data quality by providing operational support to data collectors. However, this potential is under-explored in community-based trials. The Aspirin in Reducing Events in the Elderly (ASPREE) trial developed a data suite that was specifically designed to support data collectors: the ASPREE Web Accessible Relational Database (AWARD). This paper describes AWARD and the impact of system design on data quality. Methods AWARD’s operational requirements, conceptual design, key challenges and design solutions for data quality are presented. Impact of design features is assessed through comparison of baseline data collected prior to implementation of key functionality (n = 1000) with data collected post implementation (n = 18,114). Overall data quality is assessed according to data category. Results At baseline, implementation of user-driven functionality reduced staff error (from 0.3% to 0.01%), out-of-range data entry (from 0.14% to 0.04%) and protocol deviations (from 0.4% to 0.08%). In the longitudinal data set, which contained more than 39 million data values collected within AWARD, 96.6% of data values were entered within specified query range or found to be accurate upon querying. The remaining data were missing (3.4%). Participant non-attendance at scheduled study activity was the most common cause of missing data. Costs associated with cleaning data in ASPREE were lower than expected compared with reports from other trials. Conclusions Clinical trials undertake complex operational activity in order to collect data, but technology rarely provides sufficient support. We find the AWARD suite provides proof of principle that designing technology to support data collectors can mitigate known causes of poor data quality and produce higher-quality data.
Health information technology (IT) products that support the conduct of scheduled activity in addition to traditional data entry will enhance community-based clinical trials. A standardised framework for reporting data quality would aid comparisons across clinical trials. Trial registration International Standard Randomized Controlled Trial Number Register, ISRCTN83772183. Registered on 3 March 2005.
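The out-of-range reductions reported above come from entry-time range checks. A minimal sketch of such a check follows; the field names and limits are invented for illustration, not the AWARD schema:

```python
# Entry-time range validation: flag fields outside expected limits
# so the data collector can query them immediately.

RANGES = {"sbp_mmhg": (70, 250), "weight_kg": (30, 250)}  # assumed limits

def query_flags(record):
    """Return the fields whose values fall outside the expected range."""
    flags = []
    for field, (lo, hi) in RANGES.items():
        value = record.get(field)
        if value is not None and not (lo <= value <= hi):
            flags.append(field)
    return flags

print(query_flags({"sbp_mmhg": 300, "weight_kg": 72}))  # ['sbp_mmhg']
```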


2016 ◽  
Vol 40 (6) ◽  
pp. 536-543 ◽  
Author(s):  
Theodore D. Wachs ◽  
Santiago Cueto ◽  
Haogen Yao

Studies from both high-income and low- and middle-income (LAMI) countries have documented how being reared in poverty is linked to compromised child development. Links between poverty and development are mediated by the timing and extent of exposure both to risk factors nested under poverty and to protective influences which can attenuate the impact of risk. While children from high-, middle-, and low-income countries are exposed to similar types of developmental risks, children from low- and middle-income countries are exposed to a greater number of more varied and more intense risks. Given these contextual differences, cumulative risk models may provide a better fit than mediated models for understanding the nature of pathways linking economic insufficiency and developmental inequality in low- and middle-income countries, and for designing interventions to promote the development of children from these countries. New evidence from a large-scale UNICEF data set illustrates the application of a cumulative risk/protective perspective in low- and middle-income countries.


Author(s):  
Ryosuke Abe ◽  
Kay W. Axhausen

This study estimates the impact of major road supply on individual travel time expenditures (TTEs) using data that cover 30 years of variation in transportation infrastructure and travel behavior. The impacts of the supply of road and rail infrastructure are estimated with a data set that combines records of large-scale household travel surveys in the Tokyo metropolitan area conducted in 1978, 1988, 1998, and 2008. Linear and Tobit models of individual TTEs are estimated by following the behavior of birth cohorts over the 30-year period. The models incorporate the changes in transportation infrastructure, measured as lane kilometers of two levels of major road stock and vehicle kilometers of urban rail service. The results show significant negative effects of lane kilometers of higher-level and lower-level major roads on TTEs for all travel purposes and for commuting, after controlling for socioeconomic backgrounds and generations of individuals. This study argues that, in Tokyo, the estimated effect is more likely to reflect the effect of the major road network per se on individual TTEs than the (indirect) effect of major road supply working through land development activities (i.e., induced car travel demand). One caveat is that actual road investment decisions still need to consider the induced component of road traffic in addition to the (direct) effect estimated in this study.

