Cloud Extraction from Chinese High Resolution Satellite Imagery by Probabilistic Latent Semantic Analysis and Object-Based Machine Learning

2016, Vol 8 (11), pp. 963
Author(s): Kai Tan, Yongjun Zhang, Xin Tong
Land, 2021, Vol 10 (6), pp. 648
Author(s): Guie Li, Zhongliang Cai, Yun Qian, Fei Chen

Enriching Asian perspectives on the rapid identification of urban poverty and its implications for housing inequality, this paper contributes empirical evidence on the utility of image features derived from high-resolution satellite imagery, combined with machine learning approaches, for identifying urban poverty in China at the community level. For the case of the Jiangxia and Huangpi Districts of Wuhan, image features including perimeter, line segment detector (LSD), Hough transform, gray-level co-occurrence matrix (GLCM), histogram of oriented gradients (HOG), and local binary patterns (LBP) are calculated, and four machine learning approaches with 25 variables are applied to identify urban poverty and the relatively important variables. The results show that image features and machine learning approaches can be used to identify urban poverty, with the best models achieving coefficients of determination (R2) of 0.5341 for Jiangxia and 0.5324 for Huangpi, although some differences exist among the approaches and study areas. The importance of each variable differs by approach and study area; however, the relatively important variables are similar. In particular, four variables achieved relatively satisfactory prediction results across all models and showed clear differences among communities with different poverty levels. Housing inequality within low-income neighborhoods, a response to gaps in wealth, income, and housing affordability among social groups, is an important manifestation of urban poverty. Policy makers can apply these findings to rapidly identify urban poverty, and the findings have potential applications for addressing housing inequality and supporting more rational urban planning toward a sustainable society.
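Two of the texture descriptors the abstract lists, GLCM and LBP, are standard image features. As an illustration only, here is a minimal numpy sketch of both, run on a synthetic gradient patch rather than real satellite tiles; the quantization level, pixel offset, and patch are assumptions for the demo, not the authors' settings:

```python
import numpy as np

def glcm(img, levels=8, offset=(0, 1)):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    q = np.rint(img / img.max() * (levels - 1)).astype(int)  # quantize grey levels
    dy, dx = offset
    h, w = q.shape
    M = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            M[q[y, x], q[y + dy, x + dx]] += 1  # count co-occurring level pairs
    return M / M.sum()

def glcm_contrast(P):
    """GLCM contrast: expected squared grey-level difference of pixel pairs."""
    i, j = np.indices(P.shape)
    return ((i - j) ** 2 * P).sum()

def lbp_histogram(img):
    """8-neighbour local binary patterns as a normalized 256-bin histogram."""
    c = img[1:-1, 1:-1]                       # interior pixels only
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=int)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(int) << bit  # set bit where neighbour >= centre
    hist = np.bincount(code.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

# Demo on a tiny horizontal-gradient patch (a stand-in for an image tile).
patch = np.tile(np.arange(8.0), (8, 1))
P = glcm(patch)
contrast = glcm_contrast(P)  # horizontally adjacent levels always differ by 1 -> 1.0
hist = lbp_histogram(patch)
```

In a study like this, such per-tile feature vectors (GLCM statistics, LBP histograms, and the other descriptors) would be concatenated and fed to the regression models as the community-level predictors.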


2016, Vol 8 (9), pp. 715
Author(s): Ting Bai, Deren Li, Kaimin Sun, Yepei Chen, Wenzhuo Li

2020
Author(s): Majid Bayati, Mohammad Danesh-Yazdi

The spatiotemporal dynamics of salinity in hypersaline lakes depend strongly on the rate of water flow feeding the lake, the evaporation rate, and the processes of precipitation and dissolution. Although in-situ observations are the most reliable way to quantify water quality variables, their spatiotemporal coverage is typically limited, and such data cannot readily be extrapolated for long-term projections. Alternatively, remotely sensed imagery offers a less expensive and more scalable means of estimating water quality over a wide range of spatiotemporal resolutions. This study introduces a machine learning model that leverages in-situ measurements and high-resolution satellite imagery to estimate the salinity concentration in water bodies. To this end, 123 points were sampled across the Lake Urmia surface in April and July of 2019, covering the wide range of salinity fluctuations. Among the artificial neural network, ANFIS, and linear regression tools examined to determine the relationship between salinity and surface reflectance, artificial neural networks yielded the best accuracy, with R2 = 0.94 and RMSE = 6.8%. The results show that the seasonal change of salinity is linearly correlated with the volume of water feeding the lake, indicating that dilution exerts stronger control on the salinity than bed salt dissolution. The impact of the causeway's disturbance of the lake circulation is also evident from the sharp changes in salinity around the bridge piers near spring, when fresh water from the southern part mixes with hypersaline water from the northern part. The results of this study demonstrate the promising potential of machine learning tools fed with multi-spectral satellite information to map water quality metrics other than salinity as well.
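The salinity-reflectance relationship described above can be illustrated with the simplest of the three model families the study compares, linear regression. The sketch below fits salinity to synthetic four-band reflectance data and reports R2; the band count, weights, and noise level are invented for the demo (only the sample count of 123 mirrors the study), and the actual models and data differ:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: per-sample surface reflectance in four spectral
# bands paired with in-situ salinity. All values here are synthetic.
n_samples, n_bands = 123, 4
X = rng.uniform(0.0, 0.6, (n_samples, n_bands))
true_w = np.array([50.0, -20.0, 80.0, 30.0])         # assumed band weights
y = X @ true_w + 5.0 + rng.normal(0.0, 2.0, n_samples)  # salinity with noise

# Fit salinity = w . reflectance + b by ordinary least squares.
A = np.column_stack([X, np.ones(n_samples)])         # append intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination (R2) on the training data.
pred = A @ coef
r2 = 1.0 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

An ANN or ANFIS model would replace the linear fit with a nonlinear mapping from the same band reflectances, which is how the study obtains its higher accuracy.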


2010, Vol 2 (12), pp. 2748-2772
Author(s): Cerian Gibbes, Sanchayeeta Adhikari, Luke Rostant, Jane Southworth, Youliang Qiu

2015, Vol 3 (1), pp. 77-94
Author(s): Maryam Nikfar, Mohammad Javad Valadan Zoej, Mehdi Mokhtarzade, Mahdi Aliyari Shoorehdeli, ...

Author(s): M. Coslu, N. K. Sonmez, D. Koc-San

Pixel-based classification is widely used for detecting land use and land cover with remote sensing technology. Recently, object-based classification methods have come into use alongside pixel-based methods on high-resolution satellite imagery, and published studies indicate that object-based classification yields more successful results than the other classification methods. While pixel-based classification operates on the grey values of individual pixels, object-based classification works by generating an image segmentation and updatable rule sets. In this study, the aim was to detect and map greenhouses from high-resolution satellite imagery using an object-based classification method. The study was carried out in the Antalya province, where greenhouse cultivation is intensive. The study consists of three main stages: segmentation, classification, and accuracy assessment. In the first stage, segmentation, the most important part of object-based image analysis, the image was segmented using the basic spectral bands of high-resolution WorldView-2 satellite imagery. In the second stage, the nearest-neighbour classifier was applied to the generated segments and a result map of the study area was produced. Finally, accuracy assessment was performed using field studies and digital data of the area. According to the research results, object-based greenhouse classification using high-resolution satellite imagery achieved over 80% accuracy.
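The segment-then-classify workflow described above can be sketched in toy form: a regular grid tiling stands in for a real segmentation algorithm, per-segment mean spectra stand in for the feature set, and a 1-nearest-neighbour rule assigns class labels. Every name and data value below is hypothetical, not the study's implementation:

```python
import numpy as np

def segment_features(img, tile=4):
    """Toy stand-in for segmentation: split the scene into square tiles and
    return one mean-spectrum feature vector per tile ("segment")."""
    h, w, b = img.shape
    feats = [img[y:y + tile, x:x + tile].reshape(-1, b).mean(axis=0)
             for y in range(0, h, tile) for x in range(0, w, tile)]
    return np.array(feats)

def nearest_neighbour(train_feats, train_labels, feats):
    """Label each segment with the class of its closest training segment."""
    d = np.linalg.norm(feats[:, None, :] - train_feats[None, :, :], axis=2)
    return train_labels[d.argmin(axis=1)]

# Demo: an 8x8, 3-band scene whose left half is bright (greenhouse-like
# plastic) and whose right half is darker background.
scene = np.zeros((8, 8, 3))
scene[:, :4] = 0.8
scene[:, 4:] = 0.2
feats = segment_features(scene)                   # four 4x4 segments
train_feats = np.array([[0.8, 0.8, 0.8],          # exemplar "greenhouse" spectrum
                        [0.2, 0.2, 0.2]])         # exemplar "background" spectrum
train_labels = np.array([1, 0])
pred = nearest_neighbour(train_feats, train_labels, feats)  # -> [1, 0, 1, 0]
```

Classifying whole segments rather than pixels is what suppresses the salt-and-pepper noise typical of per-pixel maps, which is the motivation the abstract gives for the object-based approach.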

