Machine Learning-Based Management of Electric Vehicles Charging: Towards Highly-Dispersed Fast Chargers

Energies ◽  
2020 ◽  
Vol 13 (20) ◽  
pp. 5429
Author(s):  
Mostafa Shibl ◽  
Loay Ismail ◽  
Ahmed Massoud

Coordinated charging of electric vehicles (EVs) improves the overall efficiency of the power grid: it avoids distribution-system overloads, increases power quality, decreases voltage fluctuations, and helps flatten the load profile. The substantial power drawn by EV charging has undeniable negative impacts on the power grid, so an effective coordination technique is crucial for protecting the distribution grid and its components. With the increasing adoption of EVs, and particularly the anticipated proliferation of EV fast chargers, an effective solution for coordinating EV charging is urgently required. In this paper, different machine learning (ML) approaches for the coordination of EV charging are compared. ML is used because it can learn from historical data and identify patterns for making future decisions with minimal user intervention; the resulting models predict the power to be used by EV charging stations (EVCS). The ML models compared are (1) Decision Tree (DT), (2) Random Forest (RF), (3) Support Vector Machine (SVM), (4) Naïve Bayes (NB), (5) K-Nearest Neighbors (KNN), (6) Deep Neural Network (DNN), and (7) Long Short-Term Memory (LSTM). These classifiers were chosen because they are known to deliver leading results on multiclass classification problems. The results shed light on the relative merits of the techniques and their high potential for providing a reliable solution for the coordinated charging of EVs, thus improving the performance of the power grid and reducing power losses and voltage fluctuations. Compared with conventional optimization techniques such as quadratic programming, ML offers a less complex way to coordinate EVs and is faster, as it requires less computational power.
LSTM provided the best results, with an accuracy of 95% for predicting the most appropriate power rating (PR) for the EVCS, followed by RF, DT, DNN, SVM, KNN, and NB. LSTM also had the smallest error rate, at ±0.7%, followed by RF, DT, KNN, SVM, DNN, and NB. The results obtained from the LSTM model were similar to those reported in past literature using quadratic programming, with the added speed and simplicity of ML.
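The multiclass comparison described above can be sketched in a few lines with scikit-learn. This is an illustrative sketch only: the features and the three power-rating classes below are synthetic stand-ins, not the paper's dataset, and only the tree-based and nearest-neighbour classifiers from the paper's list are shown.

```python
# Sketch: comparing classifiers for predicting an EVCS power-rating class.
# Synthetic data; feature meanings and class labels are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Stand-in for historical charging data: 4 features, 3 power-rating classes.
X, y = make_classification(n_samples=600, n_features=4, n_informative=3,
                           n_redundant=0, n_classes=3, random_state=0)

models = {
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
# 5-fold cross-validated accuracy for each model.
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

In practice the LSTM variant would be fed the charging history as a sequence rather than a flat feature vector, which is what lets it exploit temporal patterns the other classifiers ignore.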

Energies ◽  
2021 ◽  
Vol 14 (19) ◽  
pp. 6199
Author(s):  
Mostafa Shibl ◽  
Loay Ismail ◽  
Ahmed Massoud

Electric vehicles (EVs) have gained popularity over the years. Charging a high number of EVs strains the distribution system, potentially causing increased transformer overloads, power losses, and voltage fluctuations. Thus, management of EV charging is required to address these challenges. A machine learning (ML)-based EV charging management system routes EVs to charging stations to minimize load variance, power losses, voltage fluctuations, and charging cost, whilst considering conventional charging, fast charging, and vehicle-to-grid (V2G) technologies. Because ML can make accurate future decisions based on historical data, a number of ML algorithms are contrasted in terms of their optimization performance: Decision Tree (DT), Random Forest (RF), Support Vector Machine (SVM), K-Nearest Neighbours (KNN), Long Short-Term Memory (LSTM), and Deep Neural Networks (DNN). The results verify the reliability of LSTM for the management of EVs, ensuring high accuracy. The LSTM model successfully minimizes power losses and voltage fluctuations and achieves peak shaving by flattening the load curve; the charging cost is also minimized. Additionally, the management system proved robust against uncertainty in the load data used as input to the ML system.
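The load-variance-minimizing routing objective can be illustrated with a simple greedy baseline: assign each arriving EV to whichever station yields the flattest resulting load. The station count, loads, and demands below are invented for illustration; the paper's ML router learns this decision from historical data instead of computing it greedily.

```python
# Greedy sketch of variance-minimizing EV-to-station routing.
# All numbers are assumed values for illustration only.
import numpy as np

station_load = np.array([120.0, 80.0, 100.0])   # current kW per station (assumed)
ev_demands = [22.0, 7.4, 50.0, 11.0]            # charging demands in kW (assumed)

assignments = []
for demand in ev_demands:
    n = len(station_load)
    # Variance of the station loads if this EV were added to each station.
    variances = [np.var(station_load + demand * (np.arange(n) == s))
                 for s in range(n)]
    best = int(np.argmin(variances))            # flattest resulting profile
    station_load[best] += demand
    assignments.append(best)

print(assignments, station_load)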


2021 ◽  
Vol 11 (21) ◽  
pp. 10187
Author(s):  
Yonghyeok Ji ◽  
Seongyong Jeong ◽  
Yeongjin Cho ◽  
Howon Seo ◽  
Jaesung Bang ◽  
...  

Transmission-mounted electric drive type hybrid electric vehicles (HEVs) engage/disengage an engine clutch when EV↔HEV mode transitions occur. If this engine clutch is not adequately engaged or disengaged, driving power is not transmitted correctly. Therefore, it is necessary to verify whether engine clutch engagement/disengagement operates normally during the vehicle development process. This paper studied machine learning-based methods for detecting anomalies in the engine clutch engagement/disengagement process. We trained various models based on multi-layer perceptron (MLP), long short-term memory (LSTM), convolutional neural network (CNN), and one-class support vector machine (one-class SVM) with actual vehicle test data and compared their results. The test results showed that the one-class SVM-based models have the highest anomaly detection performance. Additionally, we found that configuring the training architecture to determine normal/anomaly per data instance and conducting one-class classification is appropriate for detecting anomalies in the target data.
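The winning one-class SVM setup trains only on normal instances and flags anything far from that distribution. A minimal sketch, assuming synthetic stand-ins for the clutch-engagement signals:

```python
# One-class SVM anomaly detection sketch: fit on "normal" data only,
# then predict; -1 marks an instance flagged as anomalous.
# Signal values are synthetic stand-ins for real vehicle test data.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal = rng.normal(loc=0.0, scale=1.0, size=(200, 5))     # normal engagements
anomalies = rng.normal(loc=6.0, scale=1.0, size=(10, 5))   # shifted outliers

clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal)
pred_anom = clf.predict(anomalies)   # -1 => flagged as anomaly
print((pred_anom == -1).mean())
```

The `nu` parameter upper-bounds the fraction of training points treated as outliers, which is why one-class training needs no labeled anomalies at all.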


2019 ◽  
Vol 8 (2) ◽  
pp. 3186-3193

Stock price prediction has always been a focal point of analytical activity in the financial domain for both researchers and investors. Accurate prediction is essential for improved investment decisions that carry minimal risk; consequently, most investors depend on intelligent trading systems that generate better forecasting results. As forecasting stock market prices with high accuracy is quite a challenging task for analysts, machine learning has been adopted as one of the popular techniques for predicting future trends. Even though there are many recognized time series analysis techniques, categorized either under soft computing or under conventional statistical methods, such as fuzzy logic, artificial neural networks, and genetic algorithms, researchers have been looking for more appropriate techniques that can deliver improved results. In this paper, we develop different hybrid machine learning-based prediction models and compare their efficiency. Dimension reduction techniques, namely orthogonal forward selection (OFS) and kernel principal component analysis (KPCA), are used separately with support vector regression (SVR) and teaching-learning-based optimization (TLBO) to predict the stock price of Tata Steel. The performance of both proposed approaches is evaluated on 4143 days of daily transactional data of Tata Steel's stock price, collected from the Bombay Stock Exchange (BSE). Comparing the results of the OFS-SVR-TLBO and KPCA-SVR-TLBO hybrid models, we conclude that incorporating KPCA is more practicable and yields better results than OFS.
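The KPCA-SVR half of the pipeline (without the TLBO tuning stage) can be sketched directly with scikit-learn. The data below is a synthetic latent-factor series, not Tata Steel prices, and a linear kernel is assumed for the KPCA step for simplicity:

```python
# Sketch of a KPCA -> SVR hybrid: kernel PCA compresses the feature space
# before support vector regression. Synthetic data; the TLBO optimizer the
# paper uses for hyper-parameter tuning is omitted here.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
Z = rng.normal(size=(300, 3))                  # hidden factors driving the series
A = rng.normal(size=(3, 8))
X = Z @ A + 0.05 * rng.normal(size=(300, 8))   # 8 observed indicators (assumed)
y = Z.sum(axis=1)                              # toy target built from the factors

model = make_pipeline(StandardScaler(),
                      KernelPCA(n_components=3, kernel="linear"),
                      SVR(kernel="rbf", C=10.0))
model.fit(X[:250], y[:250])
r2 = model.score(X[250:], y[250:])
print(f"held-out R^2: {r2:.3f}")
```

In the paper's setup, TLBO would search over the SVR hyper-parameters (`C`, `epsilon`, kernel width) instead of the fixed values used here.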


2020 ◽  
Vol 10 (14) ◽  
pp. 4693
Author(s):  
Seongmun Oh ◽  
Junhyuk Kong ◽  
Minhee Choi ◽  
Jaesung Jung

This study presents a machine learning-based method for predicting the power grid state under heavy-rain hazards. Machine learning models can extract key knowledge from a dataset without any preliminary knowledge about it; hence, machine learning methods have been utilized for solving power grid-related problems. Two sets of historical data were used herein: local weather data and power grid outage data. First, we investigated the heavy-rain-related outage distribution and analyzed the correlated characteristics between weather and outages to characterize heavy-rain events. The analysis results show that multiple weather effects are significant in causing power outages, even under heavy-rain conditions. Furthermore, this study proposes a cost-sensitive prediction method using a support vector machine (SVM) model. The accuracy of the model was improved by applying a cost-sensitive learning algorithm to the SVM, which was subsequently used to predict the state of the grid. The developed model was evaluated using G-mean values. The proposed method was verified using actual data from a heavy-rain event in South Korea.
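Cost-sensitive SVM learning and G-mean evaluation are both directly expressible in scikit-learn. A minimal sketch, assuming a synthetic imbalanced dataset in place of the Korean weather/outage data, with the rare outage class up-weighted via `class_weight`:

```python
# Cost-sensitive SVM sketch for an imbalanced outage dataset: rare outage
# events (label 1) get a higher misclassification cost, and the classifier
# is scored with the G-mean (geometric mean of per-class recalls).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=5, weights=[0.9, 0.1],
                           random_state=0)  # ~10% outage events (assumed ratio)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = SVC(kernel="rbf", class_weight={0: 1, 1: 9}).fit(X_tr, y_tr)
y_hat = clf.predict(X_te)

sensitivity = recall_score(y_te, y_hat, pos_label=1)   # outage recall
specificity = recall_score(y_te, y_hat, pos_label=0)   # normal-state recall
g_mean = float(np.sqrt(sensitivity * specificity))
print(f"G-mean: {g_mean:.3f}")
```

The G-mean is the natural metric here because plain accuracy would reward a model that simply predicts "no outage" for everything.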


2018 ◽  
Vol 1 (1) ◽  
pp. 64-74 ◽  
Author(s):  
Devin Joseph Frey ◽  
Avdesh Mishra ◽  
Md Tamjidul Hoque ◽  
Mahdi Abdelguerfi ◽  
Thomas Soniat

In this work, we address the multi-class classification task of determining oyster vessel behavior, classifying it into four classes: fishing, traveling, poling (exploring), and docked (anchored). The main purpose of this work is to automate the determination of oyster vessel behavior using machine learning and to explore different techniques for improving the accuracy of behavior prediction. Two important descriptors, speed and net speed, are calculated from trajectory data recorded by a satellite communication system (Vessel Management System, VMS) attached to vessels fishing on the public oyster grounds of Louisiana. We construct a support vector machine (SVM)-based method that employs a Radial Basis Function (RBF) kernel to accurately predict vessel behavior. Several validation and parameter optimization techniques were used to improve the accuracy of the SVM classifier. A total of 93% of the trajectory data from a July 2013 to August 2014 dataset of 612,700 samples, for which the ground truth can be obtained using a rule-based classifier, is used for validation and independent testing of our method. The results show that the proposed SVM-based method correctly classifies 99.99% of the 612,700 samples under 10-fold cross-validation. Furthermore, we achieved a precision of 1.00, recall of 1.00, F1-score of 1.00, and a test accuracy of 99.99% in an independent test on a subset of that 93% portion of the dataset, consisting of 31,418 points.
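The core of the method, an RBF-kernel SVM over the two trajectory descriptors with grid-searched hyper-parameters and 10-fold cross-validation, can be sketched as follows. The four clusters below are synthetic stand-ins for the fishing / traveling / poling / docked behaviors; real speed and net-speed values would come from the VMS trajectories.

```python
# RBF-SVM sketch for 4-class vessel-behavior classification over two
# descriptors (speed, net speed), with a small C/gamma grid search and
# 10-fold cross-validation. Cluster centers are assumed, not VMS data.
import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Assumed (speed, net speed) cluster centers for the four behaviors.
centers = np.array([[2.0, 0.5], [8.0, 7.5], [1.0, 0.1], [0.1, 0.0]])
X = np.vstack([c + 0.2 * rng.normal(size=(100, 2)) for c in centers])
y = np.repeat([0, 1, 2, 3], 100)   # fishing, traveling, poling, docked

grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [1, 10, 100], "gamma": [0.1, 1.0]}, cv=5)
grid.fit(X, y)
acc = cross_val_score(grid.best_estimator_, X, y, cv=10).mean()
print(f"10-fold accuracy: {acc:.4f}")
```

With only two well-separated descriptors, near-perfect accuracy like the paper's 99.99% is plausible; the hard part in practice is the rule-based ground-truth labeling, not the classifier.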


2021 ◽  
Vol 12 (1) ◽  
pp. 38
Author(s):  
Venkatesan Chandran ◽  
Chandrashekhar K. Patil ◽  
Alagar Karthick ◽  
Dharmaraj Ganeshaperumal ◽  
Robbi Rahim ◽  
...  

Forecasting the state of charge (SoC) is a tedious task for battery management systems in electric vehicles, affecting their durability and reliability. As the process of battery degradation is usually non-linear, predicting SoC with substantially less degradation is extremely cumbersome. This paper presents SoC estimation of lithium-ion battery systems using six machine learning algorithms for electric vehicle applications: artificial neural network (ANN), support vector machine (SVM), linear regression (LR), Gaussian process regression (GPR), ensemble bagging (EBa), and ensemble boosting (EBo). Error analysis of the models is carried out to optimize the battery's performance parameters. Finally, all six algorithms are compared using performance indices. ANN and GPR are found to be the best methods, with MSE and RMSE of (0.0004, 0.00170) and (0.023, 0.04118), respectively.


Energies ◽  
2021 ◽  
Vol 14 (3) ◽  
pp. 569
Author(s):  
Tianze Lan ◽  
Kittisak Jermsittiparsert ◽  
Sara T. Alrashood ◽  
Mostafa Rezaei ◽  
Loiy Al-Ghussain ◽  
...  

Renewable microgrids are new solutions for enhanced security, improved reliability, and boosted power quality and operation in power systems. By deploying different renewable sources, such as solar panels and wind units, renewable microgrids can help reduce greenhouse gases and improve efficiency. This paper proposes a machine learning-based approach for energy management in renewable microgrids, considering a reconfigurable structure based on remote switching of tie and sectionalizing switches. The suggested method uses an advanced support vector machine to model and estimate the charging demand of hybrid electric vehicles (HEVs). To mitigate the charging effects of HEVs on the system, two different scenarios are deployed: coordinated charging and intelligent charging. Due to the complex structure of the problem formulation, a new modified optimization method based on the dragonfly algorithm is suggested. Moreover, a self-adaptive modification is proposed, which lets each solution pick the modification method that best fits its situation. Simulation results on an IEEE microgrid test system show appropriate and efficient performance in both scenarios. For the prediction of the total HEV charging demand, the mean absolute percentage error is 0.978, which is very low. Moreover, the results show a 2.5% reduction in the total operation cost of the microgrid under intelligent charging compared to the coordinated scheme.
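The SVM demand-estimation step, regressing future charging demand from its recent lags and scoring with MAPE, can be sketched as below. The demand series is a toy daily sinusoid plus noise, not the paper's HEV data, and the modified dragonfly optimizer is omitted entirely.

```python
# SVR sketch for charging-demand estimation from lagged demand values,
# scored with mean absolute percentage error (MAPE). Synthetic series.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.arange(400)
demand = 50 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, 400)  # kW (toy)

lags = 3
X = np.column_stack([demand[i:i - lags] for i in range(lags)])  # 3 lagged inputs
y = demand[lags:]

svr = SVR(kernel="rbf", C=100.0).fit(X[:300], y[:300])
pred = svr.predict(X[300:])
mape = float(np.mean(np.abs((y[300:] - pred) / y[300:]))) * 100
print(f"MAPE: {mape:.2f}%")
```

In the paper, the dragonfly-based optimizer would then schedule switching and charging against this predicted demand rather than the raw history.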


2021 ◽  
Vol 12 (3) ◽  
pp. 94
Author(s):  
Sugam Pokharel ◽  
Pradip Sah ◽  
Deepak Ganta

Electric vehicles (EVs) have emerged as the green energy alternative to conventional vehicles. While various governments promote EVs, people feel "range anxiety" because of their limited driving range or charge capacity. A limited number of charging stations are available, which creates a strong demand for predicting the energy consumed by EVs. In this paper, machine learning (ML) models such as multiple linear regression (MLR), extreme gradient boosting (XGBoost), and support vector regression (SVR) were used to investigate the total energy consumption (TEC) of EVs. The independent variables used for the study reflect changing real-life situations or external parameters, such as trip distance, tire type, driving style, power, odometer reading, EV model, city, motorway, country roads, air conditioning, and park heating. We compared the ML models' performance along with an error analysis. A pairwise correlation study showed that trip distance has a high correlation coefficient (0.87) with TEC. XGBoost had the best prediction accuracy (~92%), with an R2 of 0.92. Trip distance, power, heating, and odometer reading were the most important features influencing the TEC, identified using the Shapley Additive exPlanations (SHAP) method.
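The boosted-regression-plus-correlation workflow can be sketched with scikit-learn's `GradientBoostingRegressor` standing in for XGBoost. All data below is synthetic, with trip distance deliberately made the dominant driver of TEC so its correlation is high, mirroring the paper's finding rather than reproducing it.

```python
# Gradient-boosting sketch (stand-in for XGBoost) of TEC regression plus a
# pairwise correlation check. All feature values and coefficients are assumed.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 500
trip_distance = rng.uniform(5, 100, n)     # km (assumed range)
power = rng.uniform(10, 80, n)             # kW (assumed range)
heating = rng.integers(0, 2, n)            # park heating on/off (assumed)
# Toy TEC in kWh, dominated by the distance term.
tec = 0.18 * trip_distance + 0.02 * power + 0.5 * heating + rng.normal(0, 0.5, n)

X = np.column_stack([trip_distance, power, heating])
model = GradientBoostingRegressor(random_state=0).fit(X[:400], tec[:400])
r2 = model.score(X[400:], tec[400:])
corr = float(np.corrcoef(trip_distance, tec)[0, 1])
print(f"R^2: {r2:.3f}, corr(distance, TEC): {corr:.2f}")
```

On top of this, the paper's SHAP step would attribute each prediction to individual features, which is how trip distance, power, heating, and odometer reading were ranked.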


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Md Abir Hossen ◽  
Prasoon K Diwakar ◽  
Shankarachary Ragi

Measuring soil health indicators (SHIs), particularly soil total nitrogen (TN), is an important and challenging task that affects farmers' decisions on the timing, placement, and quantity of fertilizers applied to their farms. Most existing methods to measure SHIs are in-lab wet chemistry or spectroscopy-based methods, which require significant human input and effort, are time-consuming and costly, and are low-throughput in nature. To address this challenge, we develop an artificial intelligence (AI)-driven, near real-time, unmanned aerial vehicle (UAV)-based multispectral sensing solution (UMS) to estimate soil TN in an agricultural farm. TN is an important macro-nutrient, or SHI, that directly affects crop health. Accurate prediction of soil TN can significantly increase crop yield through informed decision making on the timing of seed planting and on fertilizer quantity and timing. The ground-truth data required to train the AI approaches are generated via laser-induced breakdown spectroscopy (LIBS), which can readily characterize soil samples, providing rapid chemical analysis of the samples and their constituents (e.g., nitrogen, potassium, phosphorus, calcium). Although LIBS has previously been applied to soil nutrient detection, there is no existing study on integrating LIBS with UAV multispectral imaging and AI. We train two machine learning (ML) models, multi-layer perceptron regression and support vector regression, to predict soil nitrogen using a suite of data classes, including multispectral characteristics of the soil and crops in the red (R), near-infrared, and green (G) spectral bands, computed vegetation indices such as NDVI, and environmental variables including air temperature and relative humidity (RH).
To generate the ground-truth (training) data for the machine learning models, we determine the N spectrum of soil samples collected from a farm using LIBS and develop a calibration model using the correlation between the actual TN of the soil samples and the maximum intensity of the N spectrum. In addition, we extract features from multispectral images captured while the UAV follows an autonomous flight plan at different growth stages of the crops. The ML models' performance is tested on a fixed configuration space for the hyper-parameters, using various hyper-parameter optimization techniques at three different wavelengths of the N spectrum.
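The hyper-parameter search over the MLP regression model on a fixed configuration space can be sketched with a grid search. All feature values and the TN relationship below are synthetic assumptions; the real inputs come from the UAV imagery and the targets from the LIBS calibration model.

```python
# Grid-search sketch over MLP-regression hyper-parameters for TN prediction,
# with band reflectances, NDVI, and humidity as inputs. Synthetic data only.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
red, green, nir = rng.uniform(0, 1, (3, n))          # band reflectances (toy)
ndvi = (nir - red) / (nir + red + 1e-9)              # vegetation index
rh = rng.uniform(20, 90, n)                          # relative humidity % (toy)
tn = 2.0 * ndvi + 0.01 * rh + rng.normal(0, 0.05, n)  # assumed TN relation

X = np.column_stack([red, green, nir, ndvi, rh])
pipe = make_pipeline(StandardScaler(),
                     MLPRegressor(max_iter=2000, random_state=0))
search = GridSearchCV(
    pipe,
    {"mlpregressor__hidden_layer_sizes": [(16,), (32, 16)],
     "mlpregressor__alpha": [1e-4, 1e-2]},
    cv=3,
)
search.fit(X[:240], tn[:240])
r2 = search.score(X[240:], tn[240:])
print(f"best params: {search.best_params_}, held-out R^2: {r2:.3f}")
```

The same grid-search scaffold applies unchanged to the paper's SVR model by swapping the estimator and parameter names.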


2021 ◽  
Author(s):  
Husam Rajab ◽  
Mussa Ebrahim ◽  
Tibor Cinkler

Recently, one of the most common needs of people is to be connected to the Internet anytime, anywhere, anyhow. The Internet of Things (IoT) is a materialized paradigm in which everyday objects are equipped with Internet connectivity, enabling them to collect and exchange information. As energy is expected to become more expensive and a power supply is often not available for IoT devices, low power wide area networks attempt to solve this problem. LoRaWAN provides radio coverage over long distances by enhancing the reach of base stations via adaptation of transmission rates, transmission power, modulation, duty cycles, etc. This paper aims to decrease power consumption by applying support vector regression and deep neural network algorithms, which can help extend battery lifetime.

