Fortran Coarray Implementation of Semi-Lagrangian Convected Air Particles within an Atmospheric Model

2021 · Vol 5 (2) · pp. 21
Author(s): Soren Rasmussen, Ethan D. Gutmann, Irene Moulitsas, Salvatore Filippone

This work added semi-Lagrangian convected air particles to the Intermediate Complexity Atmospheric Research (ICAR) model. ICAR is a simplified atmospheric model that uses quasi-dynamical downscaling to gain performance over more traditional atmospheric models. It uses Fortran coarrays to split the domain among images and to handle halo communication between each image's boundary regions. The newly implemented convected air particles use trilinear interpolation to compute their initial properties from the Eulerian domain and then calculate humidity and buoyancy forces as the model runs. This paper investigated the performance cost and scaling behaviour of running unsaturated and saturated air particles versus the original particle-free model. An in-depth analysis was done of the communication patterns and performance of the semi-Lagrangian air particles, as well as of the performance cost of a variety of initial conditions such as wind speed and saturation mixing ratios. The study found that, given a linear increase in the number of particles communicated, performance initially decreases but then levels out: over the runtime of the model there is an initial cost of particle communication, but the computational benefits quickly offset it. The study provided insight into the number of processors required to amortize the additional computational cost of the air particles.
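For readers unfamiliar with the interpolation step, here is a minimal sketch of trilinear interpolation in Python (ICAR itself is written in Fortran; the function name, index-space coordinates, and in-bounds assumption are illustrative):

```python
import numpy as np

def trilinear_interpolate(field, x, y, z):
    """Interpolate a gridded 3-D field at the fractional index (x, y, z).

    Assumes (x, y, z) lies strictly inside the grid so that all eight
    surrounding corner points exist.
    """
    x0, y0, z0 = int(x), int(y), int(z)      # lower corner of the cell
    x1, y1, z1 = x0 + 1, y0 + 1, z0 + 1
    xd, yd, zd = x - x0, y - y0, z - z0      # fractional offsets in cell
    # Interpolate along x on the four x-aligned edges of the cell...
    c00 = field[x0, y0, z0] * (1 - xd) + field[x1, y0, z0] * xd
    c10 = field[x0, y1, z0] * (1 - xd) + field[x1, y1, z0] * xd
    c01 = field[x0, y0, z1] * (1 - xd) + field[x1, y0, z1] * xd
    c11 = field[x0, y1, z1] * (1 - xd) + field[x1, y1, z1] * xd
    # ...then along y, then along z.
    c0 = c00 * (1 - yd) + c10 * yd
    c1 = c01 * (1 - yd) + c11 * yd
    return c0 * (1 - zd) + c1 * zd

field = np.arange(27, dtype=float).reshape(3, 3, 3)
print(trilinear_interpolate(field, 0.5, 1.25, 1.75))  # 10.0 for this field
```

A particle would evaluate this against each Eulerian field (wind, pressure, humidity, ...) at its current position to set its initial properties.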

Sensors · 2021 · Vol 21 (13) · pp. 4496
Author(s): Vlad Pandelea, Edoardo Ragusa, Tommaso Apicella, Paolo Gastaldo, Erik Cambria

Emotion recognition, among other natural language processing tasks, has greatly benefited from the use of large transformer models. Deploying these models on resource-constrained devices, however, is a major challenge due to their computational cost. In this paper, we show that combining large transformers, used as high-quality feature extractors, with simple hardware-friendly classifiers based on linear separators can achieve competitive performance while allowing real-time inference and fast training. Various solutions, including batch and online sequential learning, are analyzed. Additionally, our experiments show that latency and performance can be further improved via dimensionality reduction and pre-training, respectively. The resulting system is implemented on two types of edge device, namely an edge accelerator and two smartphones.
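A minimal sketch of the pattern the paper describes, assuming sentence embeddings have already been extracted by a frozen transformer; the 768-dimensional features, PCA size, and ridge classifier are illustrative choices, not the authors' exact configuration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import RidgeClassifier
from sklearn.pipeline import make_pipeline

# Stand-in for embeddings produced offline by a frozen transformer encoder
# (e.g. 768-dim vectors) with emotion labels; real features are assumed.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(512, 768))
y_train = rng.integers(0, 4, size=512)

# Dimensionality reduction (for latency) feeding a linear separator; only
# this lightweight head runs, and can be retrained, on the edge device.
clf = make_pipeline(PCA(n_components=64), RidgeClassifier())
clf.fit(X_train, y_train)
print(clf.predict(X_train[:5]))
```

An online sequential variant would replace the batch `fit` with an incremental least-squares update so the linear head can continue learning on-device.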


2020 · Vol 38 (3-4) · pp. 1-31
Author(s): Burcu Canakci, Robbert Van Renesse

Scaling Byzantine Fault Tolerant (BFT) systems in terms of membership is important for secure applications with large participation, such as blockchains. While traditional protocols have low latency, they cannot handle many processors. Conversely, blockchains often have hundreds to thousands of processors to increase robustness, but they typically have high latency or energy costs. We describe various sources of unscalability in BFT consensus protocols. To improve performance, many BFT protocols optimize the "normal case," where there are no failures. This can be done in a modular fashion by wrapping existing BFT protocols with a building block that we call alliance. In normal-case executions, alliance can scalably determine whether the initial conditions of a BFT consensus protocol predetermine the outcome, obviating the need to run the consensus protocol at all. We give examples of existing protocols that solve alliance. We show that a solution based on hypercubes and MACs has desirable scalability and performance in normal-case executions, with only a modest overhead otherwise. We provide important optimizations. Finally, we evaluate our solution using the ns-3 simulator, show that it scales up to thousands of processors, and compare it with prior work in various network topologies.
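As a toy illustration of the normal-case idea, the sketch below adopts an outcome without running consensus whenever all MAC-authenticated reports agree; the coordinator role, key handling, and message format are invented for the example and are not the paper's protocol:

```python
import hashlib
import hmac

# Toy pairwise secrets; a real deployment would provision per-link MAC keys.
KEYS = {i: bytes([i]) * 16 for i in range(4)}

def tag(replica_id: int, value: bytes) -> bytes:
    return hmac.new(KEYS[replica_id], value, hashlib.sha256).digest()

def alliance_fast_path(reports):
    """reports: list of (replica_id, value, mac). Returns the outcome if the
    initial conditions already predetermine it, else None (run consensus)."""
    values = set()
    for rid, value, mac in reports:
        if not hmac.compare_digest(mac, tag(rid, value)):
            return None              # bad authenticator: fall back
        values.add(value)
    return values.pop() if len(values) == 1 else None

reports = [(i, b"block-42", tag(i, b"block-42")) for i in range(4)]
print(alliance_fast_path(reports))   # b'block-42': consensus can be skipped
```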


Author(s): Fábio Ytoshi Shibao, Geraldo Cardoso de Oliveira Neto, Flavia Cristina da Silva, Eduardo Cabrini Pompone

ABSTRACT Purpose: To evaluate the universe of published articles proposing frameworks for the relationship between green supply chain management (GSCM) and performance in the period from 1995 to 2014, in order to propose a conceptual model for future studies that considers the green profile alongside GSCM practices and performance. Originality/gap/relevance/implications: The investigation revealed a lack of relationship among organizations' profiles, their environmental, economic and operational performance, and GSCM practices. Key methodological aspects: The relationships among constructs were established through bibliometric analysis of the GSCM practice-and-performance models/frameworks extracted from the ProQuest, EBSCO, JSTOR, Web of Science and Scopus databases; content analysis and network analysis were then performed. Summary of key results: Internal and external GSCM practices, environmental performance, economic performance and operational performance emerged as the main topics addressed in GSCM, and studies on internal practices prevailed over those addressing other practices. Key considerations/conclusions: The models studied did not consider whether the corporate green profile could improve organizational performance, and therefore did not simultaneously measure environmental, economic and operational performance. It was concluded that adding the green profile alongside GSCM practices and performance allows a more in-depth analysis of the degree of a company's involvement with GSCM, as well as of its intended objectives and the results achieved.


2010 · Vol 10 (9) · pp. 21697-21720
Author(s): T. Nieminen, P. Paasonen, H. E. Manninen, V.-M. Kerminen, M. Kulmala

Abstract. Atmospheric ions participate in the formation of new atmospheric aerosol particles, yet their exact role in this process has remained unclear. Here we derive a new simple parameterization for ion-induced nucleation or, more precisely, for the formation rate of charged 2-nm particles. The parameterization is semi-empirical in the sense that it is based on comprehensive results of one-year-long atmospheric cluster and particle measurements in the size range ∼1–42 nm within the EUCAARI (European Integrated project on Aerosol Cloud Climate and Air Quality interactions) project. Data from 12 field sites across Europe, measured with different types of air ion and cluster mobility spectrometers, were used in our analysis, with more in-depth analysis made using data from four stations with concomitant sulphuric acid measurements. The parameterization is given in two slightly different forms: a more accurate one that requires information on sulphuric acid and nucleating organic vapour concentrations, and a simpler one in which this information is replaced with the global radiation intensity. In principle, these new parameterizations are applicable to all large-scale atmospheric models containing size-resolved aerosol microphysics.
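The abstract does not state the functional form, so the following is a purely hypothetical sketch of how a two-form, semi-empirical parameterization of this kind might be structured; every coefficient, exponent, and name below is invented for illustration:

```python
def formation_rate_2nm(h2so4, org_vapour, k=1.0e-14, a=1.0, b=1.0):
    """Hypothetical formation rate J (cm^-3 s^-1) of charged 2-nm particles
    from sulphuric acid and nucleating organic vapour concentrations
    (cm^-3). Coefficients are placeholders, not the paper's fitted values."""
    return k * h2so4 ** a * org_vapour ** b

def formation_rate_2nm_simple(global_radiation, k=1.0e-7, c=1.0):
    """Simpler variant in which the vapour concentrations are replaced by
    the global radiation intensity (W m^-2), mirroring the paper's second
    form. Again, placeholder coefficients only."""
    return k * global_radiation ** c

print(formation_rate_2nm(1e7, 5e6), formation_rate_2nm_simple(400.0))
```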


2020 · Vol 12 (6) · pp. 2121
Author(s): Rosiberto Salustiano Silva Junior, Bruno César Teixeira Cardoso, Hugo Cainã Ferreira Monteiro, Ewerton Hallan de Lima Silva

Study of the Efficiency of the Short-Term Numerical Forecast for the City of Maceió/AL, Using the WRF Model

Abstract: Since different economic activities are strongly influenced by weather conditions, it is necessary to know days in advance whether the meteorological situation will be favourable for society's daily activities. Atmospheric models are widely used tools for assessing the future state of the atmosphere and, in this context, evaluating the accuracy of the forecasts produced by these tools has become increasingly common. In this work, the WRF (Weather Research and Forecasting) atmospheric model was used to produce daily 72 h forecasts for the city of Maceió/AL over the period 10–19 July 2017. Observations from the INMET (National Institute of Meteorology) automatic weather station were used to validate the forecasts. The study also proposed updating the topography and land use of the study area, which improved the comparisons for all variables analysed; for atmospheric pressure in particular, the updated topography yielded noticeable improvements in the statistical indicators relative to the tests run without the update. In addition, the statistical analyses and plots presented show that the model forecast better at 24 h than at 48 h, and better at 48 h than at 72 h; that is, forecast quality degraded as forecast lead time increased. Keywords: Weather Forecast, Atmospheric Model, Topography, Land Use.
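The abstract does not name the statistical indicators used; below is a minimal sketch of standard point-verification scores (bias, RMSE, correlation) computed per lead time, with synthetic numbers standing in for the WRF output and INMET station observations:

```python
import numpy as np

def verify(forecast: np.ndarray, observed: np.ndarray) -> dict:
    """Standard verification scores for one variable at one lead time."""
    err = forecast - observed
    return {
        "bias": float(err.mean()),
        "rmse": float(np.sqrt((err ** 2).mean())),
        "corr": float(np.corrcoef(forecast, observed)[0, 1]),
    }

# Synthetic surface-pressure series; forecast error grows with lead time.
rng = np.random.default_rng(1)
obs = 1013.0 + rng.normal(0.0, 2.0, size=240)
for lead_h, spread in [(24, 0.5), (48, 1.0), (72, 2.0)]:
    fcst = obs + rng.normal(0.0, spread, size=obs.size)
    print(f"{lead_h:2d} h: {verify(fcst, obs)}")
```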


Author(s): Sheikh Usman Yousaf, Bushra Usman, Muhammad Akram

Stress may hinder the efficiency and performance of individuals. However, little attention has been given to academic stress, especially the stress experienced by doctoral-level university students. Understanding the causes of their stress and the relevant coping strategies is essential for their better performance. Hence, to address this gap, the purpose of this study was to explore the stressors produced by the academic environment and the stress-coping strategies adopted by doctoral scholars. The unit of analysis was individuals enrolled in doctoral studies at the Business School of University Kebangsaan, Malaysia. In-depth analysis of eight doctoral-level students revealed that they, in general, share the same experiences and adopt coping strategies similar to those reported for students of other disciplines (i.e., nursing or psychology students). However, a lack of ability to manage information, information ambiguity, and ambiguity regarding the quality of one's own work emerged as the major stressors in this study, which past research has not commonly highlighted. This study therefore reveals that information collection, scarcity of information resources, information ambiguity, and work-related ambiguity are major stressors for doctoral students. It also identifies social support, problem diversion, effective information management, and time management as significant stress-coping techniques. The implications and future recommendations are also discussed in the paper.


2021
Author(s): Peter T. La Follette, Adriaan J. Teuling, Nans Addor, Martyn Clark, Koen Jansen, ...

Abstract. Hydrological models are usually systems of nonlinear differential equations for which no analytical solutions exist, and thus they rely on approximate numerical solutions. While some studies have investigated the relationship between the choice of numerical method and model error, the extent to which extreme precipitation, like that observed during hurricanes Harvey and Katrina, impacts the numerical error of hydrological models is still unknown. This knowledge is relevant in light of climate change, as many regions will likely experience more intense precipitation events. In this experiment, a large number of hydrographs are generated with the modular modeling framework FUSE, using eight numerical techniques across a variety of forcing datasets. Multiple model structures, parameter sets, and initial conditions are incorporated for generality. The computational expense and numerical error associated with each hydrograph were recorded. Numerical error (root mean square error) was found to usually increase with precipitation intensity and decrease with event duration. Some numerical methods constrain errors much more effectively than others, sometimes by many orders of magnitude. Of the tested numerical methods, a second-order adaptive explicit method is found to be the most efficient because it has both low numerical error and low computational cost. A basic literature review indicates that many popular modeling codes use numerical techniques that this experiment suggests are sub-optimal. We conclude that relatively large numerical errors might be common in current models, and because these will likely grow as the climate changes, we advocate the use of low-cost, low-error numerical methods.
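As a sketch of the class of method the study found most efficient, here is a second-order adaptive explicit integrator (Heun's method with an embedded Euler error estimate) applied to a toy linear-reservoir ODE; FUSE's actual implementation and step-size controller may differ:

```python
import numpy as np

def adaptive_heun(f, t0, t_end, y0, h=1.0, tol=1e-6):
    """Second-order explicit Heun step with an embedded first-order Euler
    error estimate; the step shrinks when the estimate exceeds tol."""
    t, y = t0, y0
    ts, ys = [t], [y]
    while t < t_end:
        h = min(h, t_end - t)
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y_new = y + 0.5 * h * (k1 + k2)        # 2nd-order (Heun) solution
        err = abs(y_new - (y + h * k1))        # gap to 1st-order Euler
        if err <= tol or h <= 1e-8:            # accept the step
            t, y = t + h, y_new
            ts.append(t)
            ys.append(y)
        # Grow/shrink the step from the (order-1) error estimate.
        h *= min(2.0, max(0.2, 0.9 * (tol / max(err, 1e-15)) ** 0.5))
    return np.array(ts), np.array(ys)

# Toy linear reservoir dS/dt = P(t) - k*S with a one-hour burst of intense
# precipitation; names and parameter values are invented for the example.
precip = lambda t: 50.0 if 10.0 <= t < 11.0 else 0.1
ts, storage = adaptive_heun(lambda t, s: precip(t) - 0.05 * s, 0.0, 48.0, 1.0)
print(f"{len(ts)} accepted steps, final storage {storage[-1]:.2f}")
```

The step-doubling controller concentrates small steps around the precipitation burst, which is exactly where fixed-step explicit methods accumulate their largest errors.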


2018 · Vol 18 (4) · pp. 997-1012
Author(s): Émilie Bresson, Philippe Arbogast, Lotfi Aouf, Denis Paradis, Anna Kortcheva, ...

Abstract. Winds, waves and storm surges can inflict severe damage in coastal areas. In order to improve preparedness for such events, a better understanding of storm-induced coastal flooding episodes is necessary. To this end, this paper highlights the use of atmospheric downscaling techniques to improve wave and storm surge hindcasts. The downscaling techniques used here are based on existing European Centre for Medium-Range Weather Forecasts reanalyses (ERA-20C, ERA-40 and ERA-Interim). The results show that 10 km resolution forcing from a downscaled atmospheric model gives a better wave and surge hindcast than using data directly from the reanalysis. Furthermore, the analysis of the most extreme mid-latitude cyclones indicates that a four-dimensional blending approach improves the whole process, as it assimilates more small-scale processes into the initial conditions. Our approach has been successfully applied to ERA-20C (the 20th century reanalysis).


2021 · Vol 15 (4) · pp. 118-131
Author(s): Sadiq A. Mehdi

In this paper, a novel four-dimensional chaotic system is presented, characterized by high sensitivity to initial conditions and parameters. It also has two positive Lyapunov exponents, which means the system is hyperchaotic. In addition, a new algorithm is proposed on which an image cryptosystem is constructed. In the permutation stage, pixel positions are scrambled via chaotic sequence sorting. In the substitution stage, pixel values are mixed with a pseudorandom sequence generated from the 4D chaotic system using the XOR operation. A simulation was conducted to evaluate the algorithm using standard tests such as information entropy, histogram analysis, number of pixels change rate, unified average changing intensity, and key space. Experimental results and performance analyses demonstrate that the proposed encryption algorithm achieves high security and efficiency.
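A minimal sketch of the two-stage scheme just described; a logistic map stands in for the 4-D hyperchaotic system, whose equations and key schedule the abstract does not give:

```python
import numpy as np

def chaotic_sequence(n, x=0.3141, r=3.99):
    """Keystream stand-in: logistic map iterates in (0, 1)."""
    seq = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def encrypt(image: np.ndarray) -> np.ndarray:
    flat = image.ravel()
    n = flat.size
    # Permutation stage: scramble pixel positions by sorting a chaotic sequence.
    perm = np.argsort(chaotic_sequence(n))
    scrambled = flat[perm]
    # Substitution stage: XOR pixel values with a pseudorandom byte stream.
    keystream = (chaotic_sequence(n, x=0.2718) * 256).astype(np.uint8)
    return (scrambled ^ keystream).reshape(image.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
print(encrypt(img))
```

Decryption reverses the stages: XOR with the same keystream, then apply the inverse permutation (`np.argsort(perm)`).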


2021 · Vol ahead-of-print (ahead-of-print)
Author(s): Ana Vitória Lachowski Volochtchuk, Higor Leite

Purpose: The healthcare system has been under pressure to provide timely and quality healthcare. The influx of patients in emergency departments (EDs) is testing the capacity of the system to its limit. In order to increase EDs' capacity and performance, healthcare managers and practitioners are adopting process improvement (PI) approaches in their operations. Thus, this study aims to identify the main PI approaches implemented in EDs, as well as the benefits of and barriers to implementing these approaches. Design/methodology/approach: The study is based on a rigorous systematic literature review of 115 papers. Furthermore, under the lens of thematic analysis, the authors present the descriptive and prescriptive findings. Findings: The descriptive analysis found copious information related to PI approaches implemented in EDs, such as the main PIs used in EDs and the types of methodological procedures applied, as well as a set of barriers and benefits. Aiming to provide an in-depth analysis and prescriptive results, the authors carried out a thematic analysis that found underlying barriers (e.g. organisational, technical and behavioural) and benefits (e.g. for patients, the organisation and processes) of PI implementation in EDs. Originality/value: The authors contribute to knowledge by providing a comprehensive review of the main PI methodologies applied in EDs, underscoring the most prominent ones. This study goes beyond descriptive studies that identify lists of barriers and benefits; instead, the authors categorize the prescriptive elements that influence these barriers and benefits. Finally, this study raises discussions about the behavioural influence of patients and medical staff on the implementation of PI approaches.

