Electric Load Forecast Using Combined Models with HP Filter-SARIMA and ARMAX Optimized by Regression Analysis Algorithm

2015 ◽  
Vol 2015 ◽  
pp. 1-14
Author(s):  
Cui Herui ◽  
Peng Xu ◽  
Mu Yupei

Electric load in summer shows a significant cyclical trend driven by temperature effects. In general, the parameters of the SARIMA and SMA models turn out to be nonsignificant in most cases. To address this issue, a hybrid time series model is utilized to extract spectrum sequences with different frequencies. The original electric load series is first decomposed into the trend sequence “G” and the cycle sequence “C.” A revised ARMAX model is then proposed to deal with the two divided sequences. Finally, the combined models are tested by case study. The case study, on electric load forecasting in a Chinese city, shows that the proposed model outperforms the four comparative models in terms of prediction accuracy, proving that the authors' combined model is more accurate than those based on a single forecasting method.
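The trend/cycle decomposition described above can be sketched with the Hodrick-Prescott (HP) filter, which splits a series into the trend "G" and cycle "C" by penalising second differences of the trend. This is a minimal pure-Python illustration, not the paper's implementation; lam = 1600 is the conventional quarterly smoothing value, and the dense toy solver is only suitable for short series.

```python
# Hodrick-Prescott filter: decompose a series y into trend G and cycle C
# by minimizing sum (y_t - g_t)^2 + lam * sum (g_{t+1} - 2 g_t + g_{t-1})^2.
# Closed form: G solves (I + lam * D'D) g = y, with D the second-difference
# operator; C = y - G.

def hp_filter(y, lam=1600.0):
    n = len(y)
    # Build A = I + lam * D'D, where D is the (n-2) x n second-difference matrix.
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = 1.0
    for k in range(n - 2):              # row k of D has the pattern 1, -2, 1
        d = [0.0] * n
        d[k], d[k + 1], d[k + 2] = 1.0, -2.0, 1.0
        for i in range(k, k + 3):
            for j in range(k, k + 3):
                A[i][j] += lam * d[i] * d[j]
    g = solve(A, list(y))                            # trend sequence "G"
    c = [yi - gi for yi, gi in zip(y, g)]            # cycle sequence "C"
    return g, c

def solve(A, b):
    # Gaussian elimination with partial pivoting (toy dense solver).
    n = len(b)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for j in range(col, n):
                A[r][j] -= f * A[col][j]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][j] * x[j] for j in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x
```

A purely linear series has zero second differences, so the filter returns it unchanged as trend with a zero cycle; in practice one would hand "G" and "C" to the downstream SARIMA/ARMAX stages.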

Forecasting ◽  
2021 ◽  
Vol 3 (1) ◽  
pp. 91-101
Author(s):  
Alfredo Nespoli ◽  
Emanuele Ogliari ◽  
Silvia Pretto ◽  
Michele Gavazzeni ◽  
Sonia Vigani ◽  
...  

Accurate forecasting of aggregate end-user electric load profiles is becoming a hot research topic because of its importance in many fields, such as the electricity services market. Load forecasting is therefore an important task that should be understood in greater depth. In this research paper, the dependence of day-ahead load forecast accuracy on the type of data used to train an LSTM network is inspected. A real case study of an Italian industrial load, with samples recorded every 15 min over the years 2017 and 2018, is presented. The effect of different dataset-cleaning approaches on load forecast accuracy is investigated. In addition, Generalised Extreme Studentized Deviate hypothesis testing is introduced to identify outliers present in the dataset. The test populations are constructed on the basis of an autocorrelation analysis, which identified a weekly correlation among the samples. The accuracy of predictions obtained from the different input datasets is then assessed by calculating the most commonly used error metrics, showing the importance of processing data before employing them for load forecasting.


2019 ◽  
Vol 9 (21) ◽  
pp. 4604 ◽  
Author(s):  
Larabi-Marie-Sainte ◽  
Aburahmah ◽  
Almohaini ◽  
Saba

Diabetes is one of the most common diseases worldwide. Many Machine Learning (ML) techniques have been utilized to predict diabetes over the last few years, and the increasing complexity of the problem has inspired researchers to explore the robust set of Deep Learning (DL) algorithms. The highest accuracy achieved so far is 95.1%, by a combined CNN-LSTM model. Even though numerous ML algorithms have been applied to this problem, a set of classifiers remains rarely used or entirely unused, so it is of interest to determine how these classifiers perform in predicting diabetes. Moreover, no recent survey has reviewed and compared the performance of all the proposed ML and DL techniques together with combined models. This article surveys all ML- and DL-based diabetes-prediction studies published in the last six years. In addition, a study was developed to implement those rarely used and unused ML classifiers on the Pima Indian Dataset and analyze their performance. The classifiers obtained accuracies of 68%–74%. The recommendation is to use these classifiers in diabetes prediction and to enhance them by developing combined models.
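The evaluation workflow described above (fit a classifier, hold out a test split, report accuracy) can be sketched in a self-contained way. This is a hedged illustration only: the data here are synthetic (two Gaussian clusters standing in for the Pima features), and the nearest-centroid rule is merely a placeholder for whichever rarely used classifier one wishes to benchmark.

```python
# Train/test accuracy workflow on synthetic two-class data.
import random

def nearest_centroid_fit(X, y):
    # One centroid (feature-wise mean) per class label.
    cents = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        cents[label] = [sum(col) / len(pts) for col in zip(*pts)]
    return cents

def predict(cents, x):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda lab: dist2(cents[lab], x))

random.seed(1)
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(100)] + \
    [[random.gauss(3, 1), random.gauss(3, 1)] for _ in range(100)]
y = [0] * 100 + [1] * 100
pairs = list(zip(X, y))
random.shuffle(pairs)
train, test = pairs[:150], pairs[150:]
cents = nearest_centroid_fit(*zip(*train))
acc = sum(predict(cents, x) == lab for x, lab in test) / len(test)
```

Swapping in a real dataset and classifier, and reporting accuracy the same way, reproduces the study design the survey recommends.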


2014 ◽  
Vol 134 (1) ◽  
pp. 9-15 ◽  
Author(s):  
Hisatomo Miyata ◽  
Kazutoshi Miyashita ◽  
Takayuki Endo ◽  
Yuichi Shimasaki ◽  
Tatsuya Iizaka ◽  
...  

Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-15 ◽  
Author(s):  
Tinggui Chen ◽  
Shiwen Wu ◽  
Jianjun Yang ◽  
Guodong Cong ◽  
Gongfa Li

It is common for many roads in disaster areas to be damaged or obstructed after sudden-onset disasters. This often escalates traffic deterioration, raising the time and cost of emergency supply scheduling. Fortunately, repairing the road network shortens in-transit distribution time. In this paper, according to the characteristics of emergency supply distribution, an emergency supply scheduling model based on multiple warehouses and stricken locations is constructed to deal with the partial failure of road networks in the early post-disaster phase. The process is as follows. When part of the road network fails, we first determine whether to repair the damaged roads; a reliable emergency supply scheduling model based on bi-level programming is then proposed. Subsequently, an improved artificial bee colony algorithm is presented to solve the problem. Finally, the effectiveness and efficiency of the proposed model and algorithm are verified through a case study.
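The artificial bee colony metaheuristic mentioned above can be illustrated on a toy problem. This sketches the standard ABC, not the paper's improved variant, and minimises a simple sphere function rather than the bi-level scheduling model, so that the employed, onlooker, and scout phases are visible.

```python
# Minimal artificial bee colony (ABC) for continuous minimisation.
import random

def abc_minimize(f, dim, bounds, n_food=10, limit=20, iters=200, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    fit = [f(x) for x in foods]
    trials = [0] * n_food

    def neighbour(i):
        # Perturb one dimension toward a random partner solution.
        k = rng.randrange(n_food)
        while k == i:
            k = rng.randrange(n_food)
        j = rng.randrange(dim)
        x = foods[i][:]
        x[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        x[j] = min(max(x[j], lo), hi)
        return x

    def try_improve(i):
        cand = neighbour(i)
        fc = f(cand)
        if fc < fit[i]:
            foods[i], fit[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                      # employed bee phase
            try_improve(i)
        weights = [1.0 / (1.0 + v) for v in fit]     # onlooker bee phase
        for _ in range(n_food):
            i = rng.choices(range(n_food), weights=weights)[0]
            try_improve(i)
        for i in range(n_food):                      # scout bee phase
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                fit[i] = f(foods[i])
                trials[i] = 0
    best = min(range(n_food), key=lambda i: fit[i])
    return foods[best], fit[best]
```

In the paper's setting, a candidate food source would encode a scheduling decision and f would evaluate the bi-level objective; the phase structure stays the same.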


Open Physics ◽  
2021 ◽  
Vol 19 (1) ◽  
pp. 360-374
Author(s):  
Yuan Pei ◽  
Lei Zhenglin ◽  
Zeng Qinghui ◽  
Wu Yixiao ◽  
Lu Yanli ◽  
...  

Abstract The load of a refrigerated showcase is nonlinear, unstable time series data to which traditional forecasting methods are not applicable, so deep learning algorithms are introduced to predict it. Based on the combined CEEMD–IPSO–LSTM algorithm, this paper builds a refrigerated display cabinet load forecasting model. Comparison with the forecasts of other models shows that the CEEMD–IPSO–LSTM model achieves the highest load forecasting accuracy, with a coefficient of determination of 0.9105. The model constructed in this paper can therefore predict showcase load and provide a reference for the energy saving and consumption reduction of display cabinets.
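The determination coefficient reported above is the standard R^2 = 1 - SS_res / SS_tot; a short sketch of the metric (values near 1, such as the 0.9105 reported, indicate forecasts close to the observed load):

```python
# Coefficient of determination: fraction of the variance in the actual
# series explained by the predictions.
def r_squared(actual, predicted):
    m = sum(actual) / len(actual)
    ss_tot = sum((a - m) ** 2 for a in actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    return 1.0 - ss_res / ss_tot
```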


2016 ◽  
Vol 27 (2) ◽  
pp. 129-134 ◽  
Author(s):  
Ronald H. Silverman ◽  
Raksha Urs ◽  
Arindam RoyChoudhury ◽  
Timothy J. Archer ◽  
Marine Gobbe ◽  
...  

Purpose Scanning Scheimpflug imaging provides information on corneal thickness and two-surface topography, while arc-scanned high-frequency ultrasound depicts the epithelial and stromal thickness distributions. Both techniques are useful in detecting keratoconus. Our aim was to develop and test a keratoconus classifier combining information from both methods. Methods We scanned 111 normal and 30 clinical keratoconus subjects with the Artemis-1 and Pentacam devices. After selecting one random eye per subject, we performed stepwise linear discriminant analysis on a dataset combining the parameters generated by each method to obtain classification models based on each technique alone and in combination. Results Discriminant analysis yielded a 4-variable model (R2 = 0.740) based on Artemis data alone and a 4-variable model (R2 = 0.734) using Pentacam data alone. The combined model (R2 = 0.828) consisted of 3 Artemis- and 4 Pentacam-derived variables; its R value was significantly higher than that of either model alone (p = 0.031, one-tailed). In cross-validation, Artemis had 100% sensitivity and 99.2% specificity, Pentacam had 97.3% sensitivity and 98.0% specificity, and the combined model had 97.3% sensitivity and 100% specificity. Conclusions The Pentacam, Artemis, and combined models were all effective in distinguishing normal from clinical keratoconus subjects. From the standpoint of variance explained (R2), the combined model was most effective. Application of the model to early and subclinical keratoconus will ultimately be required to assess the effectiveness of the combined approach.
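The cross-validation figures above are sensitivity (true-positive rate over keratoconus eyes) and specificity (true-negative rate over normal eyes); a minimal sketch of the two metrics:

```python
# sensitivity = TP / (TP + FN), specificity = TN / (TN + FP),
# with keratoconus coded as the positive class.
def sensitivity_specificity(y_true, y_pred, positive=1):
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)
```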


2021 ◽  
Vol 13 (11) ◽  
pp. 6109
Author(s):  
Joanne Lee Picknoll ◽  
Pieter Poot ◽  
Michael Renton

Habitat loss has reduced the available resources for apiarists and is a key driver of poor colony health, colony loss, and reduced honey yields. The biggest challenge for apiarists in the future will be meeting increasing demands for pollination services, honey, and other bee products with limited resources. Targeted landscape restoration focusing on high-value or high-yielding forage could ensure adequate floral resources are available to sustain the growing industry. Tools are currently needed to evaluate the likely productivity of potential sites for restoration and inform decisions about plant selections and arrangements and hive stocking rates, movements, and placements. We propose a new approach for designing sites for apiculture, centred on a model of honey production that predicts how changes to plant and hive decisions affect the resource supply, potential for bees to collect resources, consumption of resources by the colonies, and subsequently, amount of honey that may be produced. The proposed model is discussed with reference to existing models, and data input requirements are discussed with reference to an Australian case study area. We conclude that no existing model exactly meets the requirements of our proposed approach, but components of several existing models could be combined to achieve these needs.


Author(s):  
Shorya Awtar ◽  
Edip Sevincer

Over-constraint is an important concern in mechanism design because it can lead to a loss in desired mobility. In distributed-compliance flexure mechanisms, this problem is alleviated due to the phenomenon of elastic averaging, thus enabling performance-enhancing geometric arrangements that are otherwise unrealizable. The principle of elastic averaging is illustrated in this paper by means of a multi-beam parallelogram flexure mechanism. In a lumped-compliance configuration, this mechanism is prone to over-constraint in the presence of nominal manufacturing and assembly errors. However, with an increasing degree of distributed-compliance, the mechanism is shown to become more tolerant to such geometric imperfections. The nonlinear load-stiffening and elasto-kinematic effects in the constituent beams have an important role to play in the over-constraint and elastic averaging characteristics of this mechanism. Therefore, a parametric model that incorporates these nonlinearities is utilized in predicting the influence of a representative geometric imperfection on the primary motion stiffness of the mechanism. The proposed model utilizes a beam generalization so that varying degrees of distributed compliance are captured using a single geometric parameter.
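Elastic averaging as described above can be caricatured with a toy model. This is an assumption-laden sketch, not the paper's parametric beam model: each flexure beam is treated as a linear spring of equal stiffness whose unloaded position carries a random manufacturing error, so a rigid stage attached to all N springs settles at the mean of the errors, and the net geometric error shrinks roughly as 1/sqrt(N), i.e. more parallel compliant members means more tolerance to imperfection.

```python
# Monte Carlo estimate of the stage's residual position error when N
# equal-stiffness springs with random unloaded-position errors act in
# parallel; equilibrium is the mean of the individual errors.
import random, statistics

def stage_error_std(n_beams, sigma=0.01, trials=2000, seed=0):
    rng = random.Random(seed)
    positions = []
    for _ in range(trials):
        errors = [rng.gauss(0.0, sigma) for _ in range(n_beams)]
        positions.append(sum(errors) / n_beams)   # equal-stiffness equilibrium
    return statistics.pstdev(positions)
```

Going from 2 to 16 beams should cut the residual error by roughly a factor of sqrt(8); the paper's nonlinear beam model captures the stiffness effects this linear caricature ignores.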


2018 ◽  
Vol 17 (05) ◽  
pp. 1429-1467 ◽  
Author(s):  
Mohammad Amirkhan ◽  
Hosein Didehkhani ◽  
Kaveh Khalili-Damghani ◽  
Ashkan Hafezalkotob

The efficiency analysis of network and multi-stage systems, one of the most interesting fields in data envelopment analysis (DEA), has attracted much attention in recent years. A pure serial three-stage (PSTS) process is a specific kind of network in which all the outputs of the first stage are used as the only inputs to the second stage and, in addition, all the outputs of the second stage are the only inputs to the third stage. In this paper, a new three-stage DEA model is developed for PSTS processes using the concept of a three-player Nash bargaining game. In this model, all the stages cooperate to improve the overall efficiency of the main decision-making unit (DMU). In contrast to centralized DEA models, the proposed model provides a unique and fair decomposition of the overall efficiency among all three stages and eliminates the confusion centralized models can cause when decomposing the overall efficiency score. Some theoretical aspects of the proposed model, including the convexity and compactness of its feasible region, are discussed. Since the proposed bargaining model is a nonlinear mathematical program, a heuristic linearization approach is also provided. A numerical example and a real-life supply chain case study are provided to check the efficacy and applicability of the proposed model, and the results on both are compared with those of existing centralized DEA models in the literature. The comparison reveals the efficacy and suitability of the proposed model while resolving the pitfalls of the centralized DEA model. A comprehensive sensitivity analysis is also conducted on the breakdown point associated with each stage.
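For readers new to DEA, the notion of an efficiency score can be sketched in the heavily simplified single-input, single-output case, where CCR efficiency reduces to each DMU's output/input ratio normalised by the best ratio in the sample. The paper's three-stage bargaining model generalises far beyond this; the sketch only shows what a score of 1.0 (efficient) versus less than 1.0 means.

```python
# One-input, one-output CCR efficiency: ratio to the best performer.
def ccr_efficiency(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

With multiple inputs, outputs, or chained stages, each score instead comes from solving an optimisation problem per DMU, which is where the paper's bargaining formulation enters.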

