The Analyst ◽
Multi-Tier Model Performance for Increased Virtual Machine Efficiency in a Moodle Application

Author(s):  
Raka Yusuf ◽  
Harni Kusniyati

The use of a multi-tier schema has become common in web-based Moodle applications. Typically, the presentation layer sits on the first tier, the application layer on the second, and the data layer on the third. This paper models the performance of a web-based application that uses the three-tier model, and examines how the number of servers handling each tier affects that performance. The case used for the modelling is a Moodle application for college courses at XYZ University. The modelling framework is SimPy, a discrete-event simulation framework based on the Python programming language. A SimPy simulation can run for a fixed time interval or without limit (ideally, until all pending work has been serviced). The simulation results show that the best performance is obtained by using a single server on each layer and optimizing the processing speed of that server.
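The tiered-queue setup described above can be sketched without SimPy itself (a third-party package) using only the standard library. The tandem-queue recursion below is an illustrative stand-in, and the arrival and service rates are invented for the example:

```python
import random

random.seed(7)

def simulate(n_requests, arrival_rate, tier_rates):
    """Three single-server FIFO tiers in series. At each tier, a request
    departs at max(its arrival, the previous departure) plus its service time."""
    arrivals, t = [], 0.0
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate)
        arrivals.append(t)
    times = arrivals
    for rate in tier_rates:                      # presentation, application, data
        prev_dep, out = 0.0, []
        for a in times:
            prev_dep = max(a, prev_dep) + random.expovariate(rate)
            out.append(prev_dep)
        times = out
    return sum(d - a for a, d in zip(arrivals, times)) / n_requests

baseline = simulate(2000, 1.0, [2.0, 2.0, 2.0])  # one server per tier
faster   = simulate(2000, 1.0, [4.0, 4.0, 4.0])  # same servers, doubled speed
print(f"mean response time: baseline {baseline:.2f}, faster servers {faster:.2f}")
```

Speeding up the single server on each tier shortens the mean response time, which is the direction of the abstract's conclusion.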

Author(s):  
Amit Sharma

Distributed Denial of Service (DDoS) attacks are a significant threat to today's web applications and web services. These attacks are moving toward the application layer in order to acquire, and then waste, the maximum number of CPU cycles. By requesting resources from web services in huge volumes through rapid-fire requests, automated attacker programs consume all the processing capacity of a single-server application or a distributed application environment. The scheme is executed in two phases: user-behaviour monitoring and detection. In the first phase, information on user behaviour is gathered, each individual user's trust score is computed, and the entropy of that same user is calculated. HTTP Unbearable Load King (HULK) attacks are also evaluated. Building on the first phase, the detection phase watches for variations in entropy and identifies malicious users. A rate limiter is also introduced to stop, or scale down, service to malicious clients. This paper introduces the FAÇADE layer for detecting and blocking unauthorized users from attacking the system.
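The entropy-based detection step might look like the following minimal sketch. The chosen feature (requested resources), the threshold, and the classification rule are illustrative assumptions, not the paper's actual trust-score computation:

```python
import math
from collections import Counter

def request_entropy(resources):
    """Shannon entropy (bits) of the resources a client requested; a flood
    concentrated on very few resources yields entropy near zero."""
    counts = Counter(resources)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def classify(resources, threshold=0.5):
    """Illustrative rule only: flag clients whose request entropy is too low."""
    return "suspicious" if request_entropy(resources) < threshold else "normal"

browsing = ["/home", "/course/1", "/quiz", "/home", "/profile"]
flood = ["/login"] * 50
print(classify(browsing), classify(flood))
```

A rate limiter would then throttle or drop requests from clients flagged this way, as the abstract describes.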


2017 ◽  
Vol 68 (1) ◽  
pp. 175-179
Author(s):  
Oana Roxana Chivu ◽  
Augustin Semenescu ◽  
Claudiu Babis ◽  
Catalin Amza ◽  
Gabriel Iacobescu ◽  
...  

Rainfall is a major component of the environment and the main source of air purification, because many pollutants come from highly varied sources: various human activities, including industry, agriculture, and some household duties. Air purification by means of precipitation is achieved by numerous highly complex mechanisms. The final degradation products of pollutants in the air, which are generally harmless, can react with each other in the presence of water, giving rise to final compounds of high toxicity. Thus, exhaust from mobile sources of pollutants, almost identical to those specific to industrial oil-processing activities, contains lead, which is the ideal catalyst for converting SO2 to sulfuric acid in the presence of rainwater, with all the disadvantages this creates. This paper presents experimental research on how rainfall water quality is influenced by industrial oil-processing activity at a chemical plant in Constanta County.


2012 ◽  
Vol 46 (3) ◽  
pp. 220-224 ◽  
Author(s):  
Sherry L Walters ◽  
Cristobal Jose Torres-Urbano ◽  
Lee Chichester ◽  
Robert E Rose

The ideal animal model would contribute no confounding variables in experimental science. Variables affect experimental design, resulting in increased animal use or repeated studies. We demonstrated a simple refinement which may reduce the number of animals used experimentally while simultaneously improving animal welfare. The objective of this study was to determine whether the presence of a hut had an impact on physiological stress levels, as determined by faecal cortisol concentration, during a routine four-day acclimatization period of newly received male Hartley-outbred guinea pigs. We hypothesized that animals provided with huts would have decreased physiological stress compared with animals not provided with huts. We examined this effect within both pair-housed and single-housed animals. A between-subjects one-way analysis of variance revealed that pair-housed animals with a hut had significantly lower faecal cortisol concentrations than pair-housed animals without a hut, whereas the presence or absence of a hut had no significant impact on faecal cortisol concentration in single-housed animals. These findings show that the presence of a hut is beneficial in reducing physiological stress when pair housing male guinea pigs and does not appear to have an impact when single housing them. In addition, we have shown that faecal cortisol, and therefore physiological stress, is still increasing on study day 4, suggesting that a longer acclimatization period is necessary. A simple refinement in housing environment and acclimatization time can both reduce the number of animals used experimentally and improve animal welfare.


Geosciences ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 322
Author(s):  
Evelina Volpe ◽  
Luca Ciabatta ◽  
Diana Salciarini ◽  
Stefania Camici ◽  
Elisabetta Cattoni ◽  
...  

The development of forecasting models for the evaluation of potential slope instability after rainfall events represents an important issue for the scientific community. This topic has received considerable impetus due to the climate change effect on territories, as several studies demonstrate that an increase in global warming can significantly influence the landslide activity and stability conditions of natural and artificial slopes. A consolidated approach in evaluating rainfall-induced landslide hazard is based on the integration of rainfall forecasts and physically based (PB) predictive models through deterministic laws. However, considering the complex nature of the processes and the high variability of the random quantities involved, probabilistic approaches are recommended in order to obtain reliable predictions. A crucial aspect of the stochastic approach is represented by the definition of appropriate probability density functions (pdfs) to model the uncertainty of the input variables as this may have an important effect on the evaluation of the probability of failure (PoF). The role of the pdf definition on reliability analysis is discussed through a comparison of PoF maps generated using Monte Carlo (MC) simulations performed over a study area located in the Umbria region of central Italy. The study revealed that the use of uniform pdfs for the random input variables, often considered when a detailed geotechnical characterization for the soil is not available, could be inappropriate.
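A stripped-down version of the Monte Carlo comparison can be sketched as follows. The dry infinite-slope factor of safety and all parameter values are illustrative assumptions, far simpler than the paper's physically based model, but they show how the choice of input pdf changes the computed probability of failure:

```python
import math
import random

random.seed(0)

def prob_failure(sample_phi_deg, slope_deg=30.0, n=20000):
    """Monte Carlo probability of failure for a dry infinite slope:
    FS = tan(phi) / tan(beta); the slope fails when FS < 1."""
    tan_b = math.tan(math.radians(slope_deg))
    fails = sum(math.tan(math.radians(sample_phi_deg())) < tan_b
                for _ in range(n))
    return fails / n

# Two candidate pdfs for the friction angle, both centred on 35 degrees:
pof_uniform = prob_failure(lambda: random.uniform(25.0, 45.0))
pof_normal  = prob_failure(lambda: random.gauss(35.0, 2.0))
print(f"PoF with uniform pdf: {pof_uniform:.3f}; with normal pdf: {pof_normal:.3f}")
```

With the same mean friction angle, the wide uniform pdf yields a far larger probability of failure than the narrow normal pdf, which is the kind of sensitivity the study examines.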


2021 ◽  
Vol 444 ◽  
pp. 109453
Author(s):  
Camille Van Eupen ◽  
Dirk Maes ◽  
Marc Herremans ◽  
Kristijn R.R. Swinnen ◽  
Ben Somers ◽  
...  

2021 ◽  
Vol 22 (1) ◽  
Author(s):  
An Zheng ◽  
Michael Lamkin ◽  
Yutong Qiu ◽  
Kevin Ren ◽  
Alon Goren ◽  
...  

Abstract Background A major challenge in evaluating quantitative ChIP-seq analyses, such as peak calling and differential binding, is a lack of reliable ground truth data. Accurate simulation of ChIP-seq data can mitigate this challenge, but existing frameworks are either too cumbersome to apply genome-wide or unable to model a number of important experimental conditions in ChIP-seq. Results We present ChIPs, a toolkit for rapidly simulating ChIP-seq data using statistical models of key experimental steps. We demonstrate how ChIPs can be used for a range of applications, including benchmarking analysis tools and evaluating the impact of various experimental parameters. ChIPs is implemented as a standalone command-line program written in C++ and is available from https://github.com/gymreklab/chips. Conclusions ChIPs is an efficient ChIP-seq simulation framework that generates realistic datasets over a flexible range of experimental conditions. It can serve as an important component in various ChIP-seq analyses where ground truth data are needed.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Steve Kanters ◽  
Mohammad Ehsanul Karim ◽  
Kristian Thorlund ◽  
Aslam Anis ◽  
Nick Bansback

Abstract Background The use of individual patient data (IPD) in network meta-analyses (NMA) is rapidly growing. This study aimed to determine, through simulations, the impact of select factors on the validity and precision of NMA estimates when combining IPD and aggregate data (AgD) relative to using AgD only. Methods Three analysis strategies were compared via simulations: 1) AgD NMA without adjustments (AgD-NMA); 2) AgD NMA with meta-regression (AgD-NMA-MR); and 3) IPD-AgD NMA with meta-regression (IPD-NMA). We compared 108 parameter permutations: number of network nodes (3, 5 or 10); proportion of treatment comparisons informed by IPD (low, medium or high); equal size trials (2-armed with 200 patients per arm) or larger IPD trials (500 patients per arm); sparse or well-populated networks; and type of effect-modification (none, constant across treatment comparisons, or exchangeable). Data were generated over 200 simulations for each combination of parameters, each using linear regression with Normal distributions. To assess model performance and estimate validity, the mean squared error (MSE) and bias of treatment-effect and covariate estimates were collected. Standard errors (SE) and percentiles were used to compare estimate precision. Results Overall, IPD-NMA performed best in terms of validity and precision. The median MSE was lower in the IPD-NMA in 88 of 108 scenarios (similar results otherwise). On average, the IPD-NMA median MSE was 0.54 times the median using AgD-NMA-MR. Similarly, the SEs of the IPD-NMA treatment-effect estimates were 1/5 the size of AgD-NMA-MR SEs. The magnitude of superior validity and precision of using IPD-NMA varied across scenarios and was associated with the amount of IPD. Using IPD in small or sparse networks consistently led to improved validity and precision; however, in large/dense networks IPD tended to have negligible impact if too few IPD were included. 
Similar results also apply to the meta-regression coefficient estimates. Conclusions Our simulation study suggests that the use of IPD in NMA will considerably improve the validity and precision of treatment-effect and regression-coefficient estimates in most NMA IPD data scenarios. However, IPD may not add meaningful validity and precision to NMAs of large and dense treatment networks when negligible IPD are used.
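The gist of the validity comparison can be illustrated with a toy version of the strategy contrast: an unadjusted difference in means versus a covariate-adjusted (ANCOVA-style) estimate, with MSE collected over 200 simulations as in the study design. Everything below (a single two-arm trial, one covariate, the parameter values) is a deliberate simplification of the NMA setting:

```python
import random

random.seed(11)

def simulate_once(n_per_arm, beta_cov, true_te):
    """Linear outcome with Normal noise, matching the abstract's data-generating idea."""
    t = [0] * n_per_arm + [1] * n_per_arm
    x = [random.gauss(0, 1) for _ in range(2 * n_per_arm)]
    y = [true_te * ti + beta_cov * xi + random.gauss(0, 1) for ti, xi in zip(t, x)]
    return t, x, y

def diff_means(t, x, y):
    y1 = [yi for ti, yi in zip(t, y) if ti == 1]
    y0 = [yi for ti, yi in zip(t, y) if ti == 0]
    return sum(y1) / len(y1) - sum(y0) / len(y0)

def ancova(t, x, y):
    """Adjusted effect: difference in means minus the pooled within-group
    slope times the covariate imbalance (a stand-in for full meta-regression)."""
    def within(g):
        xs = [xi for ti, xi in zip(t, x) if ti == g]
        ys = [yi for ti, yi in zip(t, y) if ti == g]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        sxx = sum((a - mx) ** 2 for a in xs)
        return mx, my, sxy, sxx
    mx0, my0, sxy0, sxx0 = within(0)
    mx1, my1, sxy1, sxx1 = within(1)
    slope = (sxy0 + sxy1) / (sxx0 + sxx1)
    return (my1 - my0) - slope * (mx1 - mx0)

true_te = 0.5
mse = {"unadjusted": 0.0, "adjusted": 0.0}
for _ in range(200):                       # 200 simulations, as in the study
    t, x, y = simulate_once(200, 1.0, true_te)
    mse["unadjusted"] += (diff_means(t, x, y) - true_te) ** 2 / 200
    mse["adjusted"]   += (ancova(t, x, y) - true_te) ** 2 / 200
print({k: round(v, 4) for k, v in mse.items()})
```

Using the covariate roughly halves the MSE here, echoing (in a much simpler setting) the study's finding that richer individual-level information improves validity and precision.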


Sensors ◽  
2021 ◽  
Vol 21 (15) ◽  
pp. 4994
Author(s):  
You Wang ◽  
Hengyang Wang ◽  
Huiyan Li ◽  
Asif Ullah ◽  
Ming Zhang ◽  
...  

Based on surface electromyography (sEMG), a novel recognition method to distinguish six types of human primary taste sensations was developed, and the recognition accuracy was 74.46%. The sEMG signals were acquired under the stimuli of no taste substance, distilled vinegar, white granulated sugar, instant coffee powder, refined salt, and Ajinomoto. The signals were then preprocessed with the following steps: sample augmentation, removal of trend items, high-pass filtering, and an adaptive power-frequency notch filter. The signals were classified with a random forest, and the classifier gave a five-fold cross-validation accuracy of 74.46%, which demonstrates the feasibility of the recognition task. To further improve model performance, we explored the impact of feature dimension, electrode distribution, and subject diversity. Accordingly, we provide an optimized feature combination that reduces the number of feature types from 21 to 4, a preferable selection of electrode positions that reduces the number of channels from 6 to 4, and an analysis of the relation between subject diversity and model performance. This study provides guidance for further research on taste sensation recognition with sEMG.
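The five-fold evaluation protocol behind the 74.46% figure can be sketched as follows. The toy one-dimensional feature, the two-class labels, and the nearest-class-mean classifier standing in for the random forest are all illustrative assumptions:

```python
import random

random.seed(3)

def five_fold_accuracy(samples, labels, train_and_predict):
    """Plain five-fold cross-validation: shuffle, split into five folds,
    train on four, score on the held-out fold, and average the accuracies."""
    idx = list(range(len(samples)))
    random.shuffle(idx)
    folds = [idx[i::5] for i in range(5)]
    accs = []
    for k in range(5):
        held_out = set(folds[k])
        tr = [i for i in idx if i not in held_out]
        preds = train_and_predict([samples[i] for i in tr],
                                  [labels[i] for i in tr],
                                  [samples[i] for i in folds[k]])
        accs.append(sum(p == labels[i] for p, i in zip(preds, folds[k])) / len(folds[k]))
    return sum(accs) / 5

def nearest_mean(train_x, train_y, test_x):
    """Toy stand-in for the random forest: assign each sample to the class
    whose training mean is closest."""
    means = {c: sum(x for x, y in zip(train_x, train_y) if y == c) /
                sum(1 for y in train_y if y == c) for c in set(train_y)}
    return [min(means, key=lambda c: abs(x - means[c])) for x in test_x]

xs = [random.gauss(0, 1) for _ in range(50)] + [random.gauss(3, 1) for _ in range(50)]
ys = ["salt"] * 50 + ["sugar"] * 50
acc = five_fold_accuracy(xs, ys, nearest_mean)
print(f"five-fold accuracy: {acc:.2f}")
```

In practice one would substitute multi-channel sEMG features and a random forest for the toy pieces; the cross-validation scaffolding is unchanged.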


2011 ◽  
Vol 57 (202) ◽  
pp. 367-381 ◽  
Author(s):  
Francesca Pellicciotti ◽  
Thomas Raschle ◽  
Thomas Huerlimann ◽  
Marco Carenzo ◽  
Paolo Burlando

Abstract We explore the robustness and transferability of parameterizations of cloud radiative forcing used in glacier melt models at two sites in the Swiss Alps. We also look at the rationale behind some of the most commonly used approaches, and explore the relationship between cloud transmittance and several standard meteorological variables. The 2 m air-temperature diurnal range is the best predictor of variations in cloud transmittance. However, linear and exponential parameterizations can only explain 30–50% of the observed variance in computed cloud transmittance factors. We examine the impact of modelled cloud transmittance factors on both solar radiation and ablation rates computed with an enhanced temperature-index model. The melt model performance decreases when modelled radiation is used, the reduction being due to an underestimation of incoming solar radiation on clear-sky days. The model works well under overcast conditions. We also seek alternatives to the use of in situ ground data. However, outputs from an atmospheric model (2.2 km horizontal resolution) do not seem to provide an alternative to the parameterizations of cloud radiative forcing based on observations of air temperature at glacier automatic weather stations. Conversely, the correct definition of overcast conditions is important.
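The linear parameterization and its explained variance can be illustrated with ordinary least squares on synthetic data. The coefficients and noise level below are invented, so the resulting R² only loosely mirrors the 30–50% the abstract reports:

```python
import random

random.seed(5)

# Synthetic data in the spirit of the abstract: a cloud transmittance factor
# loosely tied to the 2 m air-temperature diurnal range (all values invented).
dtr = [random.uniform(2.0, 14.0) for _ in range(200)]
tau = [min(1.0, max(0.2, 0.3 + 0.05 * d + random.gauss(0, 0.15))) for d in dtr]

def linfit_r2(x, y):
    """Ordinary least squares y = a + b*x, returning (a, b, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

a, b, r2 = linfit_r2(dtr, tau)
print(f"tau fit: {a:.2f} + {b:.3f}*DTR, R2 = {r2:.2f}")
```

An R² well below 1 even with a clean synthetic signal makes the abstract's point concrete: a single predictor such as the diurnal temperature range leaves much of the transmittance variance unexplained.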


2008 ◽  
Vol 45 (02) ◽  
pp. 77-88
Author(s):  
Erlend Hovland

Simplified technical and economic analyses of a total of 12 different ship designs have been performed and compared with the operational profiles of typical North Sea diving and construction vessels. An evaluation of the impact that changes to the operational profile have on the best-suited vessel for the operation has also been performed. The paper focuses on optimizing the design to achieve the maximum number of working days in operation.

