Cloud-Based Scalable Software for Optimal Long-Range, Network-Level Bridge Improvement Programming

2017 ◽  
Vol 2612 (1) ◽  
pp. 132-140
Author(s):  
Mahmoud R. Halfawy

The current state of the practice in bridge management highlights a growing need to develop scalable optimization software tools to support the development of truly optimal bridge improvement programs and ensure that limited financial resources are optimally allocated. The heuristic project selection approaches employed in today’s bridge management systems are not capable of generating optimal programs. Agencies that rely on suboptimal programs may inadvertently direct a significant portion of their budget to the wrong projects, leading to an increase in maintenance backlogs and overall system risk levels. Subjective project selection criteria may also hinder the ability to quantify project benefits or to justify projects to funding agencies and stakeholders. This paper presents a novel dynamic programming–based multiobjective optimization approach that is capable of generating globally optimal network-level, long-range bridge improvement programs. The algorithm considers three objectives: the minimization of system-level risk, the maximization of system-level condition, and the minimization of life-cycle costs, subject to agency-defined constraints and planning scenarios. The algorithm efficiently explores the enormous search space to find optimal project lists for each year in the planning horizon under any given scenario. Alternative planning scenarios are defined to quantify the impact of different investment levels on system-level performance metrics and to determine the investment required to achieve the desired performance and risk targets.
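The year-by-year project selection described above can be illustrated with a deliberately simplified sketch: a single-year, single-objective 0/1 knapsack DP that picks the project list maximizing a combined benefit score under a budget cap. All bridge names, costs, and benefit figures below are hypothetical, and the paper's actual algorithm handles three objectives and multi-year constraints not shown here.

```python
# Hypothetical single-year, single-objective slice of a bridge-program DP.
def plan_year(projects, budget):
    """0/1 knapsack DP: maximize total benefit within the budget.

    projects: list of (name, cost, benefit) with integer costs.
    Returns (best_benefit, selected_names).
    """
    best = {0: (0.0, [])}  # budget spent -> (best benefit, chosen projects)
    for name, cost, benefit in projects:
        # Iterate over a snapshot so each project is selected at most once.
        for spent, (val, names) in sorted(best.items(), reverse=True):
            new_spent = spent + cost
            if new_spent > budget:
                continue
            cand = (val + benefit, names + [name])
            if new_spent not in best or cand[0] > best[new_spent][0]:
                best[new_spent] = cand
    return max(best.values(), key=lambda v: v[0])

# Illustrative candidate projects: (bridge id, cost, benefit score).
projects = [("B1", 3, 7.0), ("B2", 4, 6.0), ("B3", 5, 9.0), ("B4", 2, 3.0)]
value, chosen = plan_year(projects, budget=9)
```

A full network-level formulation would repeat such a stage for every year of the horizon and carry condition and risk states forward between stages.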

Author(s):  
K. K. Botros ◽  
C. Foy ◽  
B. Chmilar

Dynamic programming (DP) inherently provides a methodology for evaluating a series of decisions in order to determine an optimal policy or path forward. The methodology enumerates and evaluates alternative states over the planning horizon in formulating the optimum strategy. In the present work, the concept of DP has been applied to pipeline long-range facility planning problems and further extended to allow evaluation of the nth optimum pipeline facility deployments based on cost and/or probabilities of constraints. The best four options were further analyzed considering uncertainties in the cost elements and the resulting economic risk associated with each optimum path. This paper presents the theory behind the extension of the DP methodology to pipeline long-range facility-planning problems over a planning horizon that considers inherent uncertainties in gas supply and demand as well as a range of available facility options. Uncertainties in the size and location of the facilities required to handle the forecast volumes, and the associated variances in the respective costs to build and operate the various facilities, are all accounted for. The problem is further complicated by possible changes in the expected flow from the forecast during design and the resulting penalties associated with under- or over-sizing of facilities. It was demonstrated that the off-design flow forecast should be evaluated to determine the impact of future variability or changes. The value that the organization can derive from being able to quantify the benefit (or penalty) of forecast uncertainty and of over- or under-building long-range facilities is significant.
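The nth-optimum idea can be sketched as ranking complete deployment paths by total cost rather than keeping only the single cheapest one. The facility names and costs below are invented for illustration; the paper's method uses a proper DP recursion, whereas this brute-force enumeration only suits tiny examples.

```python
import itertools

# Toy ranking of deployment paths over a two-year horizon (assumed data).
def rank_paths(stage_options, n):
    """stage_options: per-year list of (facility, cost) choices.
    Returns the n cheapest deployment paths as (total_cost, path)."""
    paths = []
    for combo in itertools.product(*stage_options):
        cost = sum(c for _, c in combo)
        paths.append((cost, [f for f, _ in combo]))
    return sorted(paths)[:n]

options = [[("compressor", 5.0), ("loop", 3.0)],   # year 1 choices
           [("none", 0.0), ("loop", 4.0)]]         # year 2 choices
best_four = rank_paths(options, 4)  # mirrors the paper's "best four options"
```

With ranked paths in hand, cost uncertainties can then be attached to each near-optimal option to compare the economic risk of the alternatives.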


Author(s):  
Basil Ezeribe

Abstract: Network providers of LTE networks can achieve maximum gain and meet the Quality of Service (QoS) requirements of their users by employing a radio resource management technique that can allocate resource blocks to users fairly without compromising network capacity. This implies that a better-performing LTE network needs fair scheduling and balanced QoS delivery for various forms of traffic. In this paper, an improved proportional fair scheduling algorithm for the downlink of an LTE cellular network is developed. The algorithm was implemented using a MATLAB-based system-level simulator from the Vienna University of Technology. The developed algorithm was compared with other scheduling methods: the Proportional Fair (PF) algorithm, Best Channel Quality Indicator (CQI), and Round Robin (RR). System performance was also analyzed under different scenarios using several performance metrics. The results showed that the developed algorithm achieved better throughput than the Round Robin and Proportional Fair schedulers, with cell-edge throughput improvements of about 19.2% (at 20 users) without mobility and 9.1% with mobility. The Best CQI algorithm achieved higher peak throughput values, but its fairness was highly compromised; the developed algorithm outperforms Best CQI by 136.6% without the impact of mobility. Finally, in dense conditions, the developed algorithm still outperforms the other algorithms, with a QoS metric 4.6% higher than that of the PF algorithm, its closest competitor. Keywords: UE, eNodeB, Scheduling, Proportional Fair, LTE
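The baseline PF rule that the paper improves upon can be stated compactly: schedule the user with the largest ratio of instantaneous achievable rate to exponentially averaged throughput. The sketch below shows that classic rule only; the rate and averaging values are illustrative, and the paper's QoS-aware extensions are not reproduced.

```python
# Classic proportional fair (PF) scheduling for one resource block (sketch).
def pf_schedule(inst_rates, avg_thpt, beta=0.1):
    """Pick the user maximizing rate / average throughput; update averages."""
    winner = max(range(len(inst_rates)),
                 key=lambda i: inst_rates[i] / max(avg_thpt[i], 1e-9))
    # Exponential moving average: only the scheduled user adds new rate.
    new_avg = [(1 - beta) * t + beta * (inst_rates[i] if i == winner else 0.0)
               for i, t in enumerate(avg_thpt)]
    return winner, new_avg

# User 0 has the better channel, but user 1 has been starved, so PF picks 1.
winner, avg = pf_schedule([5.0, 2.0], [10.0, 1.0])
```

Best CQI would instead pick the user with the highest instantaneous rate (maximizing peak throughput at the expense of fairness), and Round Robin ignores the channel entirely.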


2010 ◽  
Vol 26 (04) ◽  
pp. 273-289 ◽  
Author(s):  
N. Vlahopoulos ◽  
C. G. Hart

A multidisciplinary design optimization (MDO) framework is used for a conceptual submarine design study. Four discipline-level performances—internal deck area, powering, maneuvering, and structural analysis—are optimized simultaneously. The four discipline-level optimizations are driven by a system-level optimization that minimizes the manufacturing cost while coordinating the exchange of information and the interaction among the discipline-level optimizations. Thus, the interaction among individual optimizations is captured along with the impact of the physical characteristics of the design on the manufacturing cost. A geometric model for the internal deck area of a submarine is created, and resistance, structural design, and maneuvering models are adapted from theoretical information available in the literature. These models are employed as simulation drivers in the discipline-level optimizations. Commercial cost-estimating software is leveraged to create a sophisticated, automated affordability model for the fabrication of a submarine pressure hull at the system level. First, each of the four discipline optimizations and the cost-related top-level optimization are performed independently. As expected, five different design configurations result, one from each analysis. These results represent the "best" solution from each individual discipline optimization, and they are used as a reference for comparison with the MDO solution. The deck area, resistance, structural, maneuvering, and affordability models are then synthesized into a multidisciplinary optimization statement reflecting a conceptual submarine design problem. The results from this coordinated MDO capture the interaction among disciplines and demonstrate the value of the MDO system in consolidating the results into a single design that improves the discipline-level objective functions while producing the highest possible improvement at the system level.
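The coordination pattern described, discipline analyses feeding a system-level objective, can be caricatured in a few lines. Everything below (the linear deck-area model, quadratic drag model, and cost weights) is invented purely to show the shape of a system-level search over a shared design variable; it bears no relation to the paper's actual submarine models.

```python
# Toy system-level coordination of two assumed discipline models.
def discipline_area(diameter):
    return 10.0 * diameter          # bigger hull -> more deck area (toy)

def discipline_drag(diameter):
    return 2.0 * diameter ** 2      # bigger hull -> more resistance (toy)

def system_cost(diameter):
    # System objective trades drag against area; weights are illustrative.
    return 1.5 * discipline_drag(diameter) - discipline_area(diameter)

# Crude system-level search over candidate hull diameters.
candidates = [i / 10 for i in range(1, 51)]
best_diameter = min(candidates, key=system_cost)
```

A real MDO framework would replace the grid search with a proper optimizer and let each discipline run its own constrained suboptimization, exchanging coupling variables with the system level.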


Author(s):  
Kyungsoo Jeong ◽  
Venu Garikapati ◽  
Yi Hou ◽  
Alicia Birky ◽  
Kevin Walkowicz

Freight travel accounts for a major share of the energy consumed in the transportation sector in any country, and the United States is no exception. Understanding and modeling freight movement are critical, particularly in the context of capturing the impact of emerging technologies on freight travel and its externalities. The domain of freight modeling and forecasting has been gaining pace in recent years, but advancement in comprehensive freight performance metrics still lags. Conventional freight performance metrics such as truck-miles, ton-miles, or value-miles are unidimensional and aggregate in nature, making them unsuitable for accurately capturing the impact of emerging transportation trends on the performance or productivity of freight systems. Addressing this research need, this paper presents the “Freight Mobility Energy Productivity” metric to quantify the freight productivity of current as well as future freight systems, accounting for the various costs associated with freight transport. The proposed metric was implemented using data from the Freight Analysis Framework along with other published sources, and shows intuitive results in quantifying freight productivity. Further, a scenario analysis exercise was conducted to test the capability of the metric in tracking improvements in system-level freight productivity as a result of vehicle electrification. The relative differences in Freight Mobility Energy Productivity scores help identify which zones benefit from the vehicle powertrain technology improvement. The results of the scenario analysis reinforce confidence that the proposed metric can be used as a decision support tool in assessing the productivity of existing as well as future freight trends and technologies.
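The abstract does not spell out the metric's functional form, so the following is a purely illustrative stand-in: score each zone by freight value moved per unit energy cost, then compare a baseline against an electrified scenario to see which zones gain, mimicking the scenario analysis described above. All zone names and numbers are invented.

```python
# Hypothetical zone-level productivity comparison (illustration only).
def productivity(zones):
    """zones: {name: (value_moved, energy_cost)} -> {name: score}."""
    return {z: v / e for z, (v, e) in zones.items()}

baseline = {"A": (100.0, 20.0), "B": (80.0, 10.0)}
electrified = {"A": (100.0, 12.0), "B": (80.0, 9.0)}  # cheaper energy per zone
gain = {z: productivity(electrified)[z] / productivity(baseline)[z] - 1.0
        for z in baseline}  # relative score improvement per zone
```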


2021 ◽  
Vol 2021 ◽  
pp. 1-18
Author(s):  
Rony Kumer Saha

In this paper, we first give an overview of the coexistence of cellular with IEEE 802.11 technologies in the unlicensed bands. We then present a coexistence mechanism for Fifth-Generation (5G) New Radio on Unlicensed (NR-U) small cells located within buildings to coexist with IEEE 802.11ad/ay, also termed Wireless Gigabit (WiGig). The small cells are dual-band enabled, operating in the 60 GHz unlicensed and 28 GHz licensed millimeter-wave (mmW) bands. We develop an interference avoidance scheme in the time domain to avoid cochannel interference (CCI) between in-building NR-U small cells and WiGig access points (APs). We then derive average capacity, spectral efficiency (SE), and energy efficiency (EE) performance metrics of in-building small cells. Extensive system-level numerical and simulation results and analyses are carried out for a number of variants of NR-U, including NR standalone, NR-U standalone, and NR-U anchored. We also analyze the impact of the spatial reuse of both mmW spectra by multiple NR-U anchored operators alongside a WiGig operator. It is shown that NR-U anchored provides the best average capacity and EE performance, whereas NR-U standalone provides the best SE performance. Moreover, both vertical (intrabuilding) and horizontal (interbuilding) spatial reuse of mmW spectra in the small cells of an NR-U anchored network can improve its SE and EE performance. Finally, we show that by choosing appropriate values of the vertical and horizontal spatial reuse factors, the proposed coexistence mechanism can achieve the expected SE and EE requirements for future Sixth-Generation (6G) mobile networks.
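The time-domain interference avoidance idea can be illustrated with a first-order capacity estimate: if the NR-U small cell and the WiGig AP transmit in disjoint time fractions, neither sees cochannel interference, and each pays only a linear capacity scaling under the Shannon bound. The bandwidth, SNR, and split figures below are illustrative assumptions, not values from the paper.

```python
import math

# First-order capacity of a link that transmits only a fraction of the time
# (time-domain sharing avoids CCI at the price of reduced airtime).
def shared_capacity(bw_hz, snr_linear, time_share):
    """Capacity in bit/s for the given airtime fraction (Shannon bound)."""
    return time_share * bw_hz * math.log2(1.0 + snr_linear)

# Assumed 2.16 GHz channel at 60 GHz, 20 dB SNR (factor 100), 50/50 split.
nru_bps = shared_capacity(2.16e9, 100.0, 0.5)
wigig_bps = shared_capacity(2.16e9, 100.0, 0.5)
```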


Author(s):  
Nan Zheng ◽  
Takao Dantsuji ◽  
Pengfei Wang ◽  
Nikolas Geroliminis

Although multimodality has been widely studied in the literature, planning and operating bus lanes in congested urban city centers remain challenging topics for researchers and policy makers. Most existing approaches lack quantitative methods for estimating the impact of bus lanes or for optimizing the operation of bus lanes at a system level. This paper proposes a novel optimization approach for allocating road space to bus lanes in cities. The approach determines the optimal space share between the modes in service and allocates the bus lanes by integrating strategies that lead to a lower total travel cost. By relying on recent advances in network-level traffic flow modeling, namely the multimodal macroscopic fundamental diagram (mMFD), the approach captures multimodal traffic dynamics and travel costs by mode. The impact of a bus lane on mode usage is taken into account to capture aggregate mode shift phenomena under changes in the layout of dedicated bus lanes. Simulation was performed on a Swiss city network to test the proposed optimization approach. The research found that (a) the mMFD could be properly integrated to guide road space optimization of large-scale multimodal urban networks, (b) an optimal and efficient space share minimized the total travel cost for all users, and (c) the best strategy for the studied network was to implement the allocated space on connected links along a corridor rather than to assign it sparsely to the links that are heavily congested.
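The space-allocation question can be sketched with toy speed–share relations: as the bus-lane share grows, bus speed rises and car speed falls, and total passenger-hours has an interior minimum. The functional forms and demand figures below are invented for illustration and are far simpler than a calibrated mMFD.

```python
# Toy total-travel-cost model over the bus-lane space share (assumed forms).
def total_cost(bus_share, car_pax=8000.0, bus_pax=4000.0):
    car_speed = 30.0 * (1.0 - bus_share)   # cars slow as their space shrinks
    bus_speed = 10.0 + 20.0 * bus_share    # buses speed up with more lanes
    trip_km = 5.0
    # Passenger-hours summed over both modes for an average trip.
    return car_pax * trip_km / car_speed + bus_pax * trip_km / bus_speed

shares = [i / 100 for i in range(5, 96)]
best_share = min(shares, key=total_cost)   # interior optimum, not 0 or 1
```

Even this toy model shows why all-or-nothing allocations lose: the optimum balances the marginal delay imposed on cars against the marginal time saved by bus passengers.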


Author(s):  
M. S. Bugaeva ◽  
O. I. Bondarev ◽  
N. N. Mikhailova ◽  
L. G. Gorokhova

Introduction. The impact on the body of production-environment factors such as coal-rock dust and fluorine compounds leads to certain shifts in strict indicators of homeostasis at the system level. Maintaining the relative constancy of the internal environment of the body is provided by the functional consistency of all organs and systems, the leading of which is the liver. Organ repair plays a crucial role in restoring the structure of genetic material and maintaining normal cell viability. When this mechanism is damaged, the compensatory capabilities of the organ are disrupted, homeostasis is disturbed at the cellular and organ levels, and the main pathological processes develop.

The aim of the study is to compare the morphological mechanisms of maintaining structural homeostasis of the liver in the dynamics of exposure of the body to coal-rock dust and sodium fluoride.

Materials and methods. Experimental studies were conducted on adult white male laboratory rats. Features of the morphological mechanisms for maintaining structural homeostasis of the liver in the dynamics of exposure to coal-rock dust and sodium fluoride were studied on experimental models of pneumoconiosis and fluoride intoxication. For histological examination, liver sampling in the experimental animals was performed after 1, 3, 6, 9, and 12 weeks of the experiment.

Results. The specificity of morphological changes in the liver depending on the harmful production factor was revealed. It is shown that chronic exposure to coal-rock dust and sodium fluoride is characterized by the development of similar morphological changes in the liver and its vessels, ranging from a predominance of initial compensatory-adaptive responses to pronounced violations of the stromal and parenchymal components. Long-term inhalation of coal-rock dust triggers, at 1–3 weeks of exposure, adaptive mechanisms in the liver in the form of increased functional activity of cells, formation of binuclear hepatocytes, and activation of immunocompetent cells and endotheliocytes, ensuring preservation of the parenchyma and the general morphostructure of the organ until the 12th week of the experiment. Exposure to sodium fluoride leads to early disruption of liver compensatory mechanisms and the development of dystrophic changes in the parenchyma, with the formation of necrosis foci as early as the 6th week of the experiment.

Conclusions. The study of the mechanisms that compensate the liver structure under long-term exposure to coal-rock dust and sodium fluoride, of the processes that indicate their failure, and of the timing of their occurrence is of theoretical and practical importance for developing recommendations for the timely prevention and correction of pathological conditions developing in employees of the aluminum and coal industries.

The authors declare no conflict of interest.


2017 ◽  
Vol 21 (3) ◽  
pp. 1573-1591 ◽  
Author(s):  
Louise Crochemore ◽  
Maria-Helena Ramos ◽  
Florian Pappenberger ◽  
Charles Perrin

Abstract. Many fields, such as drought-risk assessment or reservoir management, can benefit from long-range streamflow forecasts. Climatology has long been used in long-range streamflow forecasting. Conditioning methods have been proposed to select or weight relevant historical time series from climatology. They are often based on general circulation model (GCM) outputs that are specific to the forecast date due to the initialisation of GCMs on current conditions. This study investigates the impact of conditioning methods on the performance of seasonal streamflow forecasts. Four conditioning statistics based on seasonal forecasts of cumulative precipitation and the standardised precipitation index were used to select relevant traces within historical streamflows and precipitation respectively. This resulted in eight conditioned streamflow forecast scenarios. These scenarios were compared to the climatology of historical streamflows, the ensemble streamflow prediction approach and the streamflow forecasts obtained from ECMWF System 4 precipitation forecasts. The impact of conditioning was assessed in terms of forecast sharpness (spread), reliability, overall performance and low-flow event detection. Results showed that conditioning past observations on seasonal precipitation indices generally improves forecast sharpness, but may reduce reliability, with respect to climatology. Conversely, conditioned ensembles were more reliable but less sharp than streamflow forecasts derived from System 4 precipitation. Forecast attributes from conditioned and unconditioned ensembles are illustrated for a case of drought-risk forecasting: the 2003 drought in France. In the case of low-flow forecasting, conditioning results in ensembles that can better assess weekly deficit volumes and durations over a wider range of lead times.
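Conditioning a climatology-based ensemble can be sketched as trace selection: rank historical years by how close their seasonal precipitation statistic is to the forecast value and keep only the closest traces. The statistic, the number kept, and the totals below are illustrative; the paper evaluates four different conditioning statistics.

```python
# Sketch of selecting historical traces conditioned on a precipitation forecast.
def condition_traces(hist_precip, forecast_precip, n_keep):
    """hist_precip: {year: seasonal total (mm)}. Keep the n_keep closest years."""
    ranked = sorted(hist_precip,
                    key=lambda y: abs(hist_precip[y] - forecast_precip))
    return ranked[:n_keep]

hist = {2001: 120.0, 2002: 95.0, 2003: 60.0, 2004: 110.0}  # assumed totals
kept = condition_traces(hist, forecast_precip=100.0, n_keep=2)
```

Keeping fewer, closer traces is what sharpens the ensemble; the reliability cost noted above comes from discarding legitimate historical variability.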


2021 ◽  
Vol 17 (4) ◽  
pp. 1-26
Author(s):  
Md Musabbir Adnan ◽  
Sagarvarma Sayyaparaju ◽  
Samuel D. Brown ◽  
Mst Shamim Ara Shawkat ◽  
Catherine D. Schuman ◽  
...  

Spiking neural networks (SNN) offer a power-efficient, biologically plausible learning paradigm by encoding information into spikes. The discovery of the memristor has accelerated the progress of spiking neuromorphic systems, as the intrinsic plasticity of the device makes it an ideal candidate to mimic a biological synapse. Despite providing a nanoscale form factor, non-volatility, and low-power operation, memristors suffer from device-level non-idealities, which impact system-level performance. To address these issues, this article presents a memristive crossbar-based neuromorphic system using unsupervised learning with twin-memristor synapses, fully digital pulse-width-modulated spike-timing-dependent plasticity, and homeostasis neurons. The implemented single-layer SNN was applied to a pattern-recognition task of classifying handwritten digits. The performance of the system was analyzed by varying design parameters such as the number of training epochs, neurons, and capacitors. Furthermore, the impact of memristor device non-idealities, such as device-switching mismatch, aging, failure, and process variations, was investigated, and the resilience of the proposed system was demonstrated.
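The plasticity rule named above, spike-timing-dependent plasticity, is commonly written as an exponential pair-based update, sketched below with illustrative parameters (the article's digital pulse-width-modulated realization of the rule is not reproduced here).

```python
import math

# Pair-based STDP: weight change as a function of the spike-time difference.
def stdp_dw(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """dt_ms = t_post - t_pre. Potentiate if pre precedes post, else depress."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_ms)    # potentiation
    return -a_minus * math.exp(dt_ms / tau_ms)       # depression

dw_pot = stdp_dw(10.0)    # pre fired 10 ms before post -> positive change
dw_dep = stdp_dw(-10.0)   # post fired first -> negative change
```

In a crossbar realization, the sign and magnitude of such an update are mapped onto programming pulses applied across the twin-memristor synapse.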


2021 ◽  
Vol 2 (1) ◽  
Author(s):  
Agnes T. Black ◽  
Marla Steinberg ◽  
Amanda E. Chisholm ◽  
Kristi Coldwell ◽  
Alison M. Hoens ◽  
...  

Abstract Background The KT Challenge program supports health care professionals to effectively implement evidence-based practices. Unlike other knowledge translation (KT) programs, this program is grounded in capacity building, focuses on health care professionals (HCPs), and uses a multi-component intervention. This study presents the evaluation of the KT Challenge program, assessing its impact on uptake, KT capacity, and practice change. Methods The evaluation used a mixed-methods retrospective pre-post design involving surveys and a review of documents such as teams’ final reports. Online surveys collecting both quantitative and qualitative data were deployed at four time points (after each of the two workshops, 6 months into implementation, and at the end of the 2-year funded projects) to measure KT capacity (knowledge, skills, and confidence) and impact on practice change. Qualitative data were analyzed using a general inductive approach, and quantitative data were analyzed using non-parametric statistics. Results Participants reported statistically significant increases in knowledge and confidence across both workshops, at the 6-month mark of their projects, and at the end of their projects. In addition, at the 6-month check-in, practitioners reported statistically significant improvements in their ability to implement practice changes. In the program’s first cohort, half of the teams that completed their projects showed demonstrable practice changes. Conclusions The KT Challenge was successful in improving the capacity of HCPs to implement evidence-based practice changes and has begun to show demonstrable improvements in a number of practice areas. The program is relevant to a variety of HCPs working in diverse practice settings and is relatively inexpensive to implement. Like all practice improvement programs in health care settings, a number of challenges emerged, stemming from high staff turnover and the limited capacity of some practitioners to take on anything beyond direct patient care. Efforts to address these challenges have been added to subsequent cohorts of the program, and ongoing evaluation will examine whether they are successful. The KT Challenge program has continued to garner great interest among practitioners, even in the midst of the COVID-19 pandemic, and shows promise for organizations looking for better ways to mobilize knowledge to improve patient care and empower staff. This study contributes to the implementation science literature by providing a description and evaluation of a new model for embedding KT practice skills in health care settings.

