Deep Learning-Based Long-Term Power Allocation Scheme for NOMA Downlink System in S-IoT

IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 86288-86296 ◽  
Author(s):  
Yunyu Sun ◽  
Ye Wang ◽  
Jian Jiao ◽  
Shaohua Wu ◽  
Qinyu Zhang

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Xin Mao ◽  
Jun Kang Chow ◽  
Pin Siang Tan ◽  
Kuan-fu Liu ◽  
Jimmy Wu ◽  
...  

Automatic bird detection in ornithological analyses is limited by the accuracy of existing models, owing to the lack of training data and the difficulty of extracting the fine-grained features required to distinguish bird species. Here we apply a domain randomization strategy to enhance the accuracy of deep learning models in bird detection. Trained on virtual birds with sufficient variation across different environments, the model tends to focus on the fine-grained features of birds and achieves higher accuracy. Based on 100 terabytes of continuous two-month monitoring data on egrets, our results reproduce findings obtained with conventional manual observation, e.g., the vertical stratification of egrets according to body size, and also open up opportunities for long-term bird surveys requiring intensive monitoring that would be impractical with conventional methods, e.g., the influence of weather on egrets and the relationship between the migration schedules of great egrets and little egrets.
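The abstract describes training a detector on synthesized scenes of "virtual birds" with randomized variation. The following is a minimal sketch of that domain randomization idea, assuming a Pillow-based compositing pipeline; the file paths, parameter ranges, and helper name are illustrative assumptions, not the authors' actual pipeline.

```python
# Hedged sketch of domain randomization for bird detection: composite a
# virtual bird onto varied backgrounds with randomized scale, rotation,
# placement, and lighting, so a detector must rely on fine-grained bird
# features rather than scene context. Paths and ranges are assumptions.
import random
from pathlib import Path
from PIL import Image, ImageEnhance

def randomize_sample(bird_png: Path, background_jpg: Path):
    """Paste one virtual bird onto a background; return (image, bbox label)."""
    bg = Image.open(background_jpg).convert("RGB")
    bird = Image.open(bird_png).convert("RGBA")  # alpha channel used as mask

    # Randomize scale and in-plane rotation of the virtual bird.
    scale = random.uniform(0.3, 1.0)
    bird = bird.resize((max(1, int(bird.width * scale)),
                        max(1, int(bird.height * scale))))
    bird = bird.rotate(random.uniform(-30, 30), expand=True)

    # Randomize global lighting of the composite scene.
    bg = ImageEnhance.Brightness(bg).enhance(random.uniform(0.6, 1.4))

    # Random placement; the bounding box becomes the detection label.
    x = random.randint(0, max(0, bg.width - bird.width))
    y = random.randint(0, max(0, bg.height - bird.height))
    bg.paste(bird, (x, y), bird)  # paste using the bird's alpha mask
    bbox = (x, y, min(bg.width, x + bird.width), min(bg.height, y + bird.height))
    return bg, bbox
```

Because labels come for free from the compositing step, a pipeline like this can generate arbitrarily large, perfectly annotated training sets, which is the point the abstract makes about compensating for scarce real training data.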


Electronics ◽  
2021 ◽  
Vol 10 (10) ◽  
pp. 1151
Author(s):  
Carolina Gijón ◽  
Matías Toril ◽  
Salvador Luna-Ramírez ◽  
María Luisa Marí-Altozano ◽  
José María Ruiz-Avilés

Network dimensioning is a critical task in current mobile networks, as any failure in this process leads to degraded user experience or unnecessary upgrades of network resources. For this purpose, radio planning tools often predict monthly busy-hour data traffic to detect capacity bottlenecks in advance. Supervised Learning (SL) has emerged as a promising solution to improve predictions obtained with legacy approaches. Previous works have shown that deep learning outperforms classical time series analysis when predicting data traffic in cellular networks in the short term (seconds/minutes) and medium term (hours/days) from long historical data series. However, long-term forecasting (a horizon of several months) performed in radio planning tools relies on short and noisy time series, thus requiring a separate analysis. In this work, we present the first study comparing SL and time series analysis approaches to predict monthly busy-hour data traffic on a cell basis in a live LTE network. To this end, an extensive dataset is collected, comprising data traffic per cell for a whole country over 30 months. The considered methods include Random Forest, different Neural Networks, Support Vector Regression, Seasonal Auto-Regressive Integrated Moving Average (SARIMA) and Additive Holt–Winters. Results show that SL models outperform time series approaches, while reducing data storage capacity requirements. More importantly, unlike in short-term and medium-term traffic forecasting, non-deep SL approaches are competitive with deep learning while being more computationally efficient.
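The abstract's key comparison is between non-deep SL regressors and classical time series methods on short monthly series. Below is a hedged sketch of such a comparison, pitting a Random Forest on lagged features against additive Holt–Winters; the synthetic 30-month series, the 12-month lag window, and the 6-month horizon are illustrative assumptions, not the paper's live LTE dataset or exact experimental setup.

```python
# Sketch: non-deep SL (Random Forest on lagged features) vs. additive
# Holt-Winters for long-term forecasting of a short monthly traffic series.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
# Synthetic 30-month busy-hour traffic: trend + yearly seasonality + noise.
t = np.arange(30)
traffic = 100 + 2.0 * t + 15 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, 30)

train, horizon = traffic[:24], 6  # 24 months of history, 6-month horizon

# Holt-Winters with additive trend and seasonality (period = 12 months).
hw = ExponentialSmoothing(train, trend="add", seasonal="add",
                          seasonal_periods=12).fit()
hw_pred = hw.forecast(horizon)

# Random Forest on lagged features: predict a month from the 12 before it.
lags = 12
X = np.array([train[i - lags:i] for i in range(lags, len(train))])
y = train[lags:]
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Recursive multi-step forecast: feed each prediction back as the newest lag.
window = list(train[-lags:])
rf_pred = []
for _ in range(horizon):
    nxt = rf.predict(np.array(window[-lags:]).reshape(1, -1))[0]
    rf_pred.append(nxt)
    window.append(nxt)

for name, pred in (("Holt-Winters", hw_pred), ("Random Forest", np.array(rf_pred))):
    mae = np.abs(pred - traffic[24:]).mean()
    print(f"{name}: MAE = {mae:.2f}")
```

Note the design trade-off the abstract highlights: the SL model needs only a fixed window of lagged values per cell rather than a long stored history, and a tree ensemble is far cheaper to train than a deep network, which is why non-deep SL can be attractive at this horizon.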

