A Study on the Development of Machine-Learning Based Load Transfer Detection Algorithm for Distribution Planning

Energies ◽  
2020 ◽  
Vol 13 (17) ◽  
pp. 4358
Author(s):  
Jun-Hyeok Kim ◽  
Byung-Sung Lee ◽  
Chul-Hwan Kim

Distribution planning refers to estimating the risks that a distribution system may face in the future and establishing investment plans to cope with them. Forecasted loads are among the most common variables used to analyze distribution-system risk, so the efficiency of distribution planning can vary with their accuracy. For this reason, many studies incorporate the latest methods, such as machine learning (ML), into load prediction. However, whatever prediction method is used, the accuracy and reliability of the predicted load depend on the reliability of the underlying data. In particular, detecting temporary load increases due to load transfer, which can occur frequently in a distribution system, is essential for securing high-quality data. Therefore, this study proposes an LSTM (Long Short-Term Memory) based load transfer detection model, and the appropriateness and reliability of the proposed method are analyzed by comparing actual planned load transfer records with the load transfer estimates produced by the model. It is also shown that the proposed model can improve the efficiency and reliability of distribution planning by reasonably removing load variations due to load transfer.
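The detection task described above can be illustrated with a much simpler baseline than the paper's LSTM model: flag days whose load jumps well above a rolling baseline. The window size, threshold, and load values below are invented for illustration only.

```python
# Minimal rolling-median baseline for flagging temporary load increases
# (e.g. load transfers) in a daily load series. This is NOT the paper's
# LSTM model; it only sketches the same detection task.
from statistics import median

def detect_transfer(load, window=7, threshold=1.3):
    """Flag entries exceeding `threshold` times the median of the
    preceding `window` entries."""
    flags = []
    for i, value in enumerate(load):
        if i < window:
            flags.append(False)  # not enough history yet
            continue
        baseline = median(load[i - window:i])
        flags.append(value > threshold * baseline)
    return flags

# A flat 100 MW feeder with a three-day transfer bump around day 10.
series = [100.0] * 20
series[10:13] = [150.0, 155.0, 150.0]
flags = detect_transfer(series)
print([i for i, f in enumerate(flags) if f])  # → [10, 11, 12]
```

Removing the flagged days (or replacing them with the baseline) is the kind of cleaning step the abstract argues improves planning data quality.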

The Internet of Things (IoT) is one of the fastest-growing technology paradigms, used in every sector, and Quality of Service (QoS) is a critical component of such systems from the usage perspective of ProSumers (producers and consumers). Most recent research on QoS in IoT has used Machine Learning (ML) techniques as one of the computing methods for improved performance and solutions. The adoption of ML and its methodologies has become a common trend and need across technologies and domains, through open-source frameworks, task-specific algorithms, and AI techniques. In this work we propose an ML-based prediction model for resource optimization in the IoT environment for QoS provisioning. The proposed methodology is implemented using a multi-layer neural network (MNN) for Long Short-Term Memory (LSTM) learning in a layered IoT environment. The model treats resources such as bandwidth and energy as QoS parameters and provides the required QoS through efficient utilization of these resources in the IoT environment. The performance of the proposed model is evaluated in a real field implementation based on a civil construction project, where real data are collected using video sensors and mobile devices as edge nodes. The prediction model is observed to improve bandwidth and energy utilization, in turn providing the required QoS in the IoT environment.


2017 ◽  
Vol 2017 ◽  
pp. 1-22 ◽  
Author(s):  
Jihyun Kim ◽  
Thi-Thu-Huong Le ◽  
Howon Kim

Monitoring electricity consumption in the home is an important way to help reduce energy usage. Nonintrusive Load Monitoring (NILM) is an existing technique that helps us monitor electricity consumption effectively and economically. NILM is a promising approach to obtaining estimates of the electrical power consumption of individual appliances from aggregate measurements of voltage and/or current in the distribution system. Among previous studies, Hidden Markov Model (HMM) based approaches have been studied extensively. However, the increasing number of appliances, multistate appliances, and appliances with similar power consumption are three major open issues in NILM. In this paper, we address these problems through the following contributions. First, we propose a state-of-the-art energy disaggregation approach based on a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) model combined with advanced deep learning. Second, we propose a novel signature to improve the classification performance of the proposed model in the multistate appliance case. We applied the proposed model to two datasets, UK-DALE and REDD. Our experimental results confirm that our model outperforms the existing advanced models. Thus, we show that combining advanced deep learning with the novel signature can be a robust solution for overcoming NILM's issues and improving the performance of load identification.
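The disaggregation task the abstract's LSTM-RNN addresses can be sketched with the classic combinatorial-optimization baseline: for each aggregate reading, search for the appliance-state combination whose summed power is closest. The appliance names and wattages below are invented; learned sequence models replace this brute-force search in practice, and the multistate issue shows up as appliances with more than one nonzero state.

```python
# Brute-force NILM baseline: match an aggregate power reading to the
# closest combination of per-appliance states. Illustration only; not
# the paper's LSTM-RNN model.
from itertools import product

# Possible power states (watts) per appliance; state 0 is "off".
# A multistate appliance simply lists more than one nonzero state.
appliances = {
    "fridge": [0, 120],
    "kettle": [0, 2000],
    "washer": [0, 300, 500],  # multistate: wash vs. spin
}

def disaggregate(aggregate_w):
    """Return the state assignment whose total power is closest to
    the aggregate measurement."""
    names = list(appliances)
    best = None
    for combo in product(*(appliances[n] for n in names)):
        err = abs(sum(combo) - aggregate_w)
        if best is None or err < best[0]:
            best = (err, dict(zip(names, combo)))
    return best[1]

print(disaggregate(2120))  # → fridge and kettle on, washer off
```

The search is exponential in the number of appliances, which is exactly the "increasing appliances" issue that motivates learned models.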


Information ◽  
2021 ◽  
Vol 12 (9) ◽  
pp. 374
Author(s):  
Babacar Gaye ◽  
Dezheng Zhang ◽  
Aziguli Wulamu

With the extensive availability of social media platforms, Twitter has become a significant tool for acquiring people's views, opinions, attitudes, and emotions towards certain entities. Within this frame of reference, sentiment analysis of tweets has become one of the most fascinating research areas in the field of natural language processing. A variety of techniques have been devised for sentiment analysis, but there is still room for improvement where the accuracy and efficacy of the system are concerned. This study proposes a novel approach that exploits the advantages of a lexical dictionary, machine learning, and deep learning classifiers. We classified the tweets based on the sentiments extracted by TextBlob, using a stacked ensemble of three long short-term memory (LSTM) networks as base classifiers and logistic regression (LR) as a meta classifier. The proposed model proved to be effective and time-saving since it does not require feature extraction, as the LSTM extracts features without any human intervention. We also compared our proposed approach with conventional machine learning models such as logistic regression, AdaBoost, and random forest, and included state-of-the-art deep learning models in the comparison. Experiments were conducted on the sentiment140 dataset and evaluated in terms of accuracy, precision, recall, and F1 score. Empirical results showed that our proposed approach achieved state-of-the-art results with an accuracy score of 99%.
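The stacking scheme described above can be sketched compactly: each base model emits a positive-class probability per tweet, and those probabilities become the feature vector for a logistic-regression meta classifier. The three LSTMs are replaced here by fixed stub outputs, and the toy probabilities and labels are invented, so the ensemble logic stays runnable without any deep-learning dependency.

```python
# Stacking sketch: logistic-regression meta classifier over base-model
# probabilities, trained by plain gradient descent. Illustration only.
import math

def train_meta(stacked, labels, lr=0.5, epochs=2000):
    """Fit logistic regression on stacked base-model probabilities."""
    w = [0.0] * len(stacked[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(stacked, labels):
            p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            g = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def meta_predict(w, b, x):
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# Stand-in base-model outputs: each row holds the three "LSTM"
# positive-class probabilities for one tweet.
stacked = [[0.9, 0.8, 0.7], [0.2, 0.3, 0.1], [0.8, 0.9, 0.6], [0.1, 0.2, 0.3]]
labels = [1, 0, 1, 0]
w, b = train_meta(stacked, labels)
preds = [int(meta_predict(w, b, x) > 0.5) for x in stacked]
print(preds)  # recovers the labels on this toy set
```

In the real pipeline the base probabilities would come from the three trained LSTMs evaluated on held-out folds, not from fixed lists.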


In this paper we propose a novel supervised machine learning model to predict the polarity of sentiments expressed in microblogs. The proposed model has a stacked neural network structure consisting of Long Short-Term Memory (LSTM) and Convolutional Neural Network (CNN) layers. To capture the long-term dependencies of sentiments in the text ordering of a microblog, the proposed model employs an LSTM layer. The encodings produced by the LSTM layer are then fed to a CNN layer, which generates localized patterns with higher accuracy. These patterns capture both local and global long-term dependencies in the text of the microblogs. The proposed model was observed to perform better and give improved prediction accuracy compared with semantic, machine learning, and deep neural network approaches such as SVM, CNN, LSTM, and CNN-LSTM. This paper uses the benchmark Stanford Large Movie Review dataset to show the significance of the new approach. The prediction accuracy of the proposed approach is comparable to other state-of-the-art approaches.


2021 ◽  
Author(s):  
Amitkumar Dadhania

Large-scale integration of Wind Generators (WGs) with distribution systems is underway across the globe in a drive to harness green energy. The Doubly Fed Induction Generator (DFIG) is an important type of WG due to its robustness and versatility. Its accurate and efficient modeling is very important in distribution system planning and analysis studies, as the older approximate representation (the constant PQ model) is no longer sufficient given the scale of WG integration. This thesis proposes a new three-phase model for the DFIG, compatible with unbalanced three-phase distribution systems, by deriving an analytical representation of its three major components, namely the wind turbine, the voltage source converter, and the wound-rotor induction machine. The proposed model is a set of nonlinear equations that yields the total three-phase active and reactive powers injected into the grid by the DFIG as a function of the grid voltage and wind turbine parameters. This model is integrated with a three-phase unbalanced power flow method, which is also reported in this thesis. The proposed method opens up a new way to conduct power flow studies on unbalanced distribution systems with WGs. The proposed DFIG model is verified using Matlab-Simulink. IEEE 37-bus test system data from the IEEE Distribution System sub-committee are used to benchmark the results of the power flow method.
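A DFIG model expressed as a set of nonlinear equations is typically solved inside the power flow by Newton-Raphson iteration. The toy two-equation system below is invented for illustration (it is not the thesis's DFIG equations), but the solver, with a finite-difference Jacobian and Gaussian elimination, is the generic machinery such a power-flow routine plugs into.

```python
# Generic Newton-Raphson solver for a small dense nonlinear system
# F(x) = 0, as used to drive power-flow mismatch equations to zero.
def newton(F, x, tol=1e-9, h=1e-6, max_iter=50):
    n = len(x)
    for _ in range(max_iter):
        fx = F(x)
        if max(abs(v) for v in fx) < tol:
            return x
        # Finite-difference Jacobian, column by column.
        J = [[0.0] * n for _ in range(n)]
        for j in range(n):
            xp = list(x)
            xp[j] += h
            fp = F(xp)
            for i in range(n):
                J[i][j] = (fp[i] - fx[i]) / h
        # Solve J * dx = -fx by Gauss-Jordan with partial pivoting.
        A = [row[:] + [-f] for row, f in zip(J, fx)]
        for c in range(n):
            piv = max(range(c, n), key=lambda r: abs(A[r][c]))
            A[c], A[piv] = A[piv], A[c]
            for r in range(n):
                if r != c:
                    m = A[r][c] / A[c][c]
                    A[r] = [a - m * b for a, b in zip(A[r], A[c])]
        x = [x[i] + A[i][n] / A[i][i] for i in range(n)]
    return x

# Toy "mismatch" system with a root at (1, 2), invented for the demo.
F = lambda v: [v[0] ** 2 + v[1] - 3.0, v[0] + v[1] ** 2 - 5.0]
sol = newton(F, [2.0, 1.0])
print(sol)  # converges to approximately [1.0, 2.0]
```

In the thesis's setting, F would be the active/reactive power mismatch at each phase and x the unknown bus voltages.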


2021 ◽  
Vol 11 (17) ◽  
pp. 7940
Author(s):  
Mohammed Al-Sarem ◽  
Abdullah Alsaeedi ◽  
Faisal Saeed ◽  
Wadii Boulila ◽  
Omair AmeerBakhsh

Spreading rumors on social media is considered a cybercrime that affects people, societies, and governments. For instance, some criminals create rumors and post them on the internet, and then other people help to spread them. Spreading rumors can be an example of cyber abuse, where rumors or lies about a victim are posted on the internet to send threatening messages or to share the victim's personal information. During pandemics, a large number of rumors spread on social media very quickly, which can have dramatic effects on people's health. Detecting these rumors manually by the authorities is very difficult on these open platforms. Therefore, several researchers have conducted studies on utilizing intelligent methods for detecting such rumors. The detection methods can be classified mainly into machine learning-based and deep learning-based methods. Deep learning methods have comparative advantages over machine learning ones, as they do not require preprocessing and feature engineering, and their performance has shown superior enhancements in many fields. Therefore, this paper proposes a novel hybrid deep learning model for detecting COVID-19-related rumors on social media (LSTM-PCNN). The proposed model is based on a Long Short-Term Memory (LSTM) network and Concatenated Parallel Convolutional Neural Networks (PCNN). The experiments were conducted on the ArCOV-19 dataset, which included 3157 tweets; 1480 of them were rumors (46.87%) and 1677 were non-rumors (53.12%). The findings showed that the proposed model achieves superior performance compared to other methods in terms of accuracy, recall, precision, and F-score.
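The "concatenated parallel CNN" component can be sketched in plain Python: several 1-D convolution branches with different kernel widths run over the same sequence, each branch is global-max-pooled, and the pooled features are concatenated. The kernels and input sequence below are invented; a real model learns the kernels and combines the result with the LSTM features.

```python
# Parallel 1-D CNN branches with concatenated pooled outputs.
# Illustration of the architecture only, not the paper's trained model.
def conv1d(seq, kernel):
    """Valid-mode 1-D convolution (cross-correlation)."""
    k = len(kernel)
    return [sum(seq[i + j] * kernel[j] for j in range(k))
            for i in range(len(seq) - k + 1)]

def parallel_cnn(seq, kernels):
    """Run each kernel over seq, global-max-pool each branch, and
    concatenate the pooled values into one feature vector."""
    return [max(conv1d(seq, k)) for k in kernels]

seq = [0.1, 0.9, 0.4, 0.7, 0.2, 0.8]       # stand-in embedded tokens
kernels = [[1.0, -1.0],                     # width-2 branch
           [0.5, 0.5, 0.5],                 # width-3 branch
           [1.0, 0.0, 0.0, 1.0]]            # width-4 branch
features = parallel_cnn(seq, kernels)
print(features)  # one pooled feature per parallel branch
```

Different kernel widths let the parallel branches respond to n-grams of different lengths, which is the usual motivation for this layout in text classification.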


Crop diseases reduce the yield of a crop or may even kill it. Over the past two years, as per the ICAR, the production of chilies in the state of Goa has dropped drastically due to the presence of a virus. Most of the plants flower very little or stop flowering completely; in the rare cases when a plant manages to flower, the yield is substantially low. The proposed model detects the presence of disease in crops by examining the symptoms. The model uses an object detection algorithm and supervised image recognition with feature extraction using a convolutional neural network to classify crops as infected or healthy. Google's machine learning libraries, TensorFlow and Keras, are used to build the neural network models. An Android application is developed around the model for ease of use of the disease detection system.


2020 ◽  
Vol 2020 ◽  
pp. 1-14 ◽  
Author(s):  
Yun Jing ◽  
Si-Ye Guo ◽  
Xuan Wang ◽  
Fang-Qiu Chen

In recent years, with the gradual networking of high-speed railways in China, existing railway transportation capacity has been freed up. To improve transportation capacity, railway freight transportation enterprises have gradually shifted the transportation of goods from dedicated freight lines to passenger-cargo lines. In terms of the organizational form of collection and distribution, China has a complete research system for heavy-haul railway collection and distribution, but research on the integration of collection and distribution for ordinary-speed railway freight is not yet complete. This paper combines integration-of-collection-and-distribution theory, coordination theory, and coupling theory, and incorporates machine learning and fuzzy mathematics, to construct an "Entropy-TOPSIS Coupling Development Degree Model" for dynamic, intelligent, quantitative analysis of the synergy of railway freight collection and distribution systems. Finally, we take the Tongchuan Depot of China Railway Xi'an Group Co., Ltd. as the research object to construct a target system and use an intelligent information acquisition system to collect basic data. The analysis results show that through coordinated control of the freight collection and distribution system, the coordination between the subsystems of the integrated system increases by 5.94%, which verifies the feasibility of the model for quantitatively improving the integration of the collection and distribution system. This provides a new method for research on the integrated development of railway freight collection and distribution.
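The Entropy-TOPSIS core of such an evaluation can be sketched in two standard steps: entropy weighting of the indicators, then TOPSIS closeness scores for each alternative. The indicator matrix below is invented (rows could be evaluation periods, columns indicators, all benefit-type), and the paper's coupling-degree layer on top is omitted.

```python
# Entropy weighting + TOPSIS ranking, the two standard building blocks
# behind an "Entropy-TOPSIS" evaluation model. Toy data for illustration.
import math

def entropy_weights(X):
    """Weight each column by its information content (1 - entropy)."""
    m, n = len(X), len(X[0])
    weights = []
    for j in range(n):
        col = [row[j] for row in X]
        s = sum(col)
        p = [v / s for v in col]
        e = -sum(v * math.log(v) for v in p if v > 0) / math.log(m)
        weights.append(1 - e)
    total = sum(weights)
    return [w / total for w in weights]

def topsis(X, w):
    """Closeness of each row to the ideal solution (benefit criteria)."""
    n = len(X[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in X)) for j in range(n)]
    V = [[w[j] * row[j] / norms[j] for j in range(n)] for row in X]
    best = [max(col) for col in zip(*V)]
    worst = [min(col) for col in zip(*V)]
    scores = []
    for v in V:
        dp = math.sqrt(sum((a - b) ** 2 for a, b in zip(v, best)))
        dm = math.sqrt(sum((a - b) ** 2 for a, b in zip(v, worst)))
        scores.append(dm / (dp + dm))
    return scores

X = [[0.60, 0.55, 0.70],   # rows: evaluation periods (invented)
     [0.62, 0.58, 0.72],   # cols: coordination indicators
     [0.70, 0.66, 0.80]]
w = entropy_weights(X)
scores = topsis(X, w)
print(scores)  # the third period scores highest
```

The coupling-development-degree model would then compare such scores across the collection and distribution subsystems to quantify their synergy.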


Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-12
Author(s):  
Fang Yao ◽  
Wei Liu ◽  
Xingyong Zhao ◽  
Li Song

This paper develops an integrated machine learning and enhanced statistical approach for wind power interval forecasting. A time-series wind power forecasting model is formulated as the theoretical basis of our method. The proposed model takes into account two important characteristics of wind speed: its nonlinearity and its time-changing distribution. Based on the proposed model, six machine learning regression algorithms are employed to forecast the prediction interval of the wind power output. The six methods are tested using real wind speed data collected at a wind station in Australia. For wind speed forecasting, the long short-term memory (LSTM) network algorithm outperforms the other five algorithms. In terms of the prediction interval, the five nonlinear algorithms show superior performance. The case studies demonstrate that, combined with an appropriate nonlinear machine learning regression algorithm, the proposed methodology is effective for wind power interval forecasting.
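A prediction interval of the kind evaluated above can be formed empirically: fit any point forecaster, then take quantiles of its residuals as the interval bounds. The naive persistence forecaster and the wind-power values below are invented stand-ins for the paper's regressors and data; only the interval-construction step is illustrated.

```python
# Empirical residual-quantile prediction interval around any point
# forecast. Illustration only; not the paper's statistical method.
def empirical_interval(actual, predicted, alpha=0.2):
    """Return (lower, upper) residual offsets so that roughly
    (1 - alpha) of the observed errors fall inside the interval."""
    residuals = sorted(a - p for a, p in zip(actual, predicted))
    n = len(residuals)
    lo = residuals[int(alpha / 2 * (n - 1))]
    hi = residuals[int((1 - alpha / 2) * (n - 1))]
    return lo, hi

# Persistence forecast: predict the previous observed value.
actual = [5.0, 5.5, 5.2, 6.0, 5.8, 6.1, 5.9, 6.3, 6.0, 6.4]
predicted = [5.0] + actual[:-1]
lo, hi = empirical_interval(actual, predicted)
covered = sum(lo <= a - p <= hi for a, p in zip(actual, predicted))
print(lo, hi, covered / len(actual))  # offsets and empirical coverage
```

At forecast time, the interval for a new point prediction `p` is simply `[p + lo, p + hi]`; evaluating how often the actual value lands inside is the coverage metric interval forecasts are judged by.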


Mathematics ◽  
2020 ◽  
Vol 8 (6) ◽  
pp. 887 ◽  
Author(s):  
Alexandru Predescu ◽  
Ciprian-Octavian Truică ◽  
Elena-Simona Apostol ◽  
Mariana Mocanu ◽  
Ciprian Lupu

Water distribution is fundamental to modern society, and there are many associated challenges in the context of large metropolitan areas. A multi-domain approach is required for designing modern solutions for the existing infrastructure, including control and monitoring systems, data science, and Machine Learning. Considering the large-scale water distribution networks in metropolitan areas, machine and deep learning algorithms can provide improved adaptability for control applications. This paper presents a machine learning-based monitoring and control architecture for a smart water distribution system. Automated test scenarios and learning methods are proposed and designed to predict the network configuration for a modern implementation of a multiple-model control supervisor with increased adaptability to changing operating conditions. The high-level processing and components for smart water distribution systems are supported by smart meters providing real-time data, push-based and decoupled software architectures, and reactive programming.

