The Spatiotemporal Evolution of Urban Impervious Surface for Chengdu, China

2021 ◽  
Vol 87 (7) ◽  
pp. 491-502
Author(s):  
Mujie Li ◽  
Zezhong Zheng ◽  
Mingcang Zhu ◽  
Yue He ◽  
Jun Xia ◽  
...  

The spatiotemporal evolution of an impervious surface (IS) is significant for urban planning. In this paper, the IS was extracted from Landsat imagery and its spatiotemporal evolution over the Chengdu urban area was analyzed. Our experimental results indicated that convolutional neural networks achieved the best performance, with an overall accuracy of 98.32%, a Kappa coefficient of 0.98, and a Macro F1 of 98.28%. From 2001 to 2017, farmland was replaced by IS, and the IS area (ISA) increased by 51.24 km2, a growth rate of up to 13.8% over sixteen years. According to the landscape metrics, the IS expanded and agglomerated from small, fragmented patches into large ones. In addition, the change in the gross domestic product of the secondary industry tracked the change in ISA between 2001 and 2017. Thus, the spatiotemporal evolution of IS was associated with the economic development of the Chengdu urban area over the past sixteen years.
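The three accuracy figures quoted above can all be derived from a classifier's confusion matrix. As a minimal sketch (the matrix below is illustrative, not the paper's actual data):

```python
# Illustrative sketch: overall accuracy, Cohen's kappa, and macro F1
# computed from a confusion matrix. The toy matrix is made up.
import numpy as np

def classification_metrics(cm):
    """cm[i, j] = number of samples with true class i predicted as class j."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    # Overall accuracy: fraction of samples on the diagonal.
    oa = np.trace(cm) / total
    # Cohen's kappa: agreement corrected for chance agreement.
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
    kappa = (oa - pe) / (1 - pe)
    # Macro F1: unweighted mean of per-class F1 scores.
    tp = np.diag(cm)
    precision = tp / cm.sum(axis=0)
    recall = tp / cm.sum(axis=1)
    f1 = 2 * precision * recall / (precision + recall)
    return oa, kappa, f1.mean()

# Toy two-class example (e.g. impervious vs. pervious pixels).
oa, kappa, macro_f1 = classification_metrics([[95, 5], [3, 97]])
```

Because macro F1 averages per-class scores without weighting by class size, it is a stricter summary than overall accuracy when classes are imbalanced, which is common in land-cover mapping.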

Author(s):  
Ruofan Liao ◽  
Paravee Maneejuk ◽  
Songsak Sriboonchitta

In the past, in many areas, the best prediction models were linear and nonlinear parametric models. In the last decade, in many application areas, deep learning has been shown to lead to more accurate predictions than parametric models. Deep learning-based predictions are reasonably accurate, but not perfect. How can we achieve better accuracy? To achieve this objective, we propose combining neural networks with a parametric model: namely, training the neural network not on the original data, but on the differences between the actual data and the predictions of the parametric model. Using the example of predicting currency exchange rates, we show that this idea indeed leads to more accurate predictions.
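The hybrid idea described above can be sketched in a few lines: fit a parametric (here, linear) model first, then train a small neural network on the residuals, and sum the two predictions. The data, network size, and training settings below are all made up for illustration:

```python
# Sketch of residual hybrid modeling: parametric fit + NN on the residuals.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic series: linear trend plus a nonlinear wiggle.
x = np.linspace(0, 1, 200)[:, None]
y = 2.0 * x[:, 0] + 0.3 * np.sin(8 * x[:, 0])

# Step 1: parametric model (ordinary least squares on [1, x]).
A = np.hstack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
residual = y - A @ coef

# Step 2: tiny one-hidden-layer tanh network trained on the residuals.
W1 = rng.normal(0, 1.0, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(x @ W1 + b1)                 # forward pass
    pred = (h @ W2 + b2)[:, 0]
    err = pred - residual                    # gradient of 0.5*MSE w.r.t. pred
    gW2 = h.T @ err[:, None] / len(x)
    gb2 = np.mean(err, keepdims=True)
    dh = err[:, None] @ W2.T * (1 - h ** 2)  # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# Combined prediction: parametric part + learned residual correction.
hybrid = A @ coef + (np.tanh(x @ W1 + b1) @ W2 + b2)[:, 0]
mse_linear = np.mean((y - A @ coef) ** 2)
mse_hybrid = np.mean((y - hybrid) ** 2)
```

The network only has to learn the (small, structured) residual rather than the full signal, which is exactly the division of labor the abstract proposes.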


Author(s):  
Carlos Lassance ◽  
Vincent Gripon ◽  
Antonio Ortega

For the past few years, deep learning (DL) robustness (i.e., the ability to maintain the same decision when inputs are subject to perturbations) has become a question of paramount importance, in particular in settings where misclassification can have dramatic consequences. To address this question, authors have proposed different approaches, such as adding regularizers or training using noisy examples. In this paper we introduce a regularizer based on the Laplacian of similarity graphs obtained from the representation of training data at each layer of the DL architecture. This regularizer penalizes large changes (across consecutive layers in the architecture) in the distance between examples of different classes, and as such enforces smooth variations of the class boundaries. We provide theoretical justification for this regularizer and demonstrate its effectiveness in improving robustness on classical supervised learning vision datasets for various types of perturbations. We also show that it can be combined with existing methods to increase overall robustness.
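A graph-Laplacian smoothness term of the kind described above can be sketched as follows. This is a hedged illustration only: the paper's exact construction (graph type, normalization, and the penalty on changes across consecutive layers) may differ in detail. Here we build an RBF similarity graph over a batch of layer representations and measure how smooth the class-label signal is on that graph:

```python
# Sketch: Laplacian smoothness of the label signal on a similarity graph
# built from one layer's representations. Small values mean the classes
# are well separated in that representation.
import numpy as np

def laplacian_smoothness(features, labels, sigma=1.0):
    """Return s = y^T L y, where L is the unnormalized graph Laplacian
    of an RBF similarity graph over `features` and y is the label signal.
    Equals 0.5 * sum_ij W_ij * (y_i - y_j)^2."""
    d2 = np.sum((features[:, None, :] - features[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2 * sigma ** 2))     # RBF similarity weights
    np.fill_diagonal(W, 0.0)               # no self-loops
    L = np.diag(W.sum(axis=1)) - W         # unnormalized graph Laplacian
    y = labels.astype(float)
    return y @ L @ y

rng = np.random.default_rng(1)
labels = np.array([0, 0, 0, 1, 1, 1])

# Well-separated representations: each class forms its own tight cluster.
separated = np.vstack([rng.normal(0, 0.1, (3, 2)), rng.normal(5, 0.1, (3, 2))])
# Mixed representations: both classes drawn from the same cloud.
mixed = rng.normal(0, 0.5, (6, 2))

s_sep = laplacian_smoothness(separated, labels)
s_mix = laplacian_smoothness(mixed, labels)
```

A regularizer in the spirit of the abstract would compare such smoothness values between consecutive layers and penalize large changes, rather than using the raw value of a single layer as done here.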


2021 ◽  
Vol 7 (6) ◽  
pp. eabb7118
Author(s):  
E. Harris ◽  
E. Diaz-Pines ◽  
E. Stoll ◽  
M. Schloter ◽  
S. Schulz ◽  
...  

Nitrous oxide is a powerful greenhouse gas whose atmospheric growth rate has accelerated over the past decade. Most anthropogenic N2O emissions result from nitrogen fertilization of soils; the added nitrogen is converted to N2O via oxic nitrification and anoxic denitrification pathways. Drought-affected soils are expected to be well oxygenated; however, using high-resolution isotopic measurements, we found that denitrifying pathways dominated N2O emissions during a severe drought applied to managed grassland. This was due to a reversible, drought-induced enrichment in nitrogen-bearing organic matter on soil microaggregates and suggested a strong role for chemo- or codenitrification. Throughout rewetting, denitrification dominated emissions, despite high variability in fluxes. Total N2O flux and the denitrification contribution were significantly higher during rewetting than for control plots over the same soil moisture range. The observed feedbacks between precipitation changes induced by climate change and N2O emission pathways are sufficient to account for the accelerating N2O growth rate observed over the past decade.


2021 ◽  
Vol 75 (3) ◽  
pp. 76-82
Author(s):  
G.T. Balakayeva ◽  
D.K. Darkenbayev ◽  
M. Turdaliyev ◽  
...  

The growth rate of data at enterprises has increased significantly in the last decade. Research has shown that over the past two decades the amount of data has grown approximately tenfold every two years, exceeding Moore's law, under which processor power doubles over the same period. About thirty thousand gigabytes of data are accumulated every second, and handling them requires more efficient data processing. Videos, photos, and messages uploaded by users of social networks lead to the accumulation of large amounts of data, much of it unstructured. Enterprises therefore have to work with big data in different formats, which must be prepared in a certain way before modeling and calculation results can be obtained. In this context, the research presented in this article on processing and storing enterprise big data, on developing a model and algorithms, and on applying new technologies is relevant. Undoubtedly, the information flows of enterprises will grow every year, so it is important to solve the problems of storing and processing large amounts of data. The relevance of the article is also due to growing digitalization and the increasing shift of professional activity online in many areas of modern society. The article provides a detailed analysis of these new technologies.


2022 ◽  
pp. 1-27
Author(s):  
Clifford Bohm ◽  
Douglas Kirkpatrick ◽  
Arend Hintze

Deep learning (primarily using backpropagation) and neuroevolution are the preeminent methods of optimizing artificial neural networks. However, they often create black boxes that are as hard to understand as the natural brains they seek to mimic. Previous work has identified an information-theoretic tool, referred to as R, which allows us to quantify and identify mental representations in artificial cognitive systems. The use of such measures has allowed us to make previous black boxes more transparent. Here we extend R to not only identify where complex computational systems store memory about their environment but also differentiate between different time points in the past. We show how this extended measure can identify the location of memory related to past experiences in neural networks optimized by deep learning as well as by a genetic algorithm.
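The measure R referenced above is defined in the authors' prior work and is not reproduced here; like most such tools, it builds on standard information-theoretic quantities. As a generic building block, a plain discrete mutual information I(X; Y), say between an environment feature and a hidden-node state, can be computed as follows (all data here is made up):

```python
# Discrete mutual information between two observed sequences.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) = sum over observed (x, y) of p(x,y) * log2(p(x,y)/(p(x)p(y)))."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts of X
    py = Counter(ys)             # marginal counts of Y
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# A hidden state that copies an environment bit carries 1 bit about it;
# an unrelated constant state carries 0 bits.
env = [0, 1, 0, 1, 0, 1, 0, 1]
mi_copy = mutual_information(env, env)
mi_const = mutual_information(env, [0] * 8)
```

Measures like R refine this idea to isolate information about the environment that is held in the network's internal states, rather than merely passing through from its inputs.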

