Line-Detection Based on the Sum of Gradient Angle Differences

2019 ◽  
Vol 10 (1) ◽  
pp. 254 ◽  
Author(s):  
Suyoung Seo

This paper presents a method to detect line pixels based on the sum of gradient angle differences (SGAD). The gradient angle differences are calculated by comparing the four pairs of gradients arising from the eight neighboring pixels. In addition, a method to classify line pixels into ridges and valleys is proposed. Furthermore, a simple line model is defined for simulation experiments. Experiments are conducted with simulation images generated using this model for three line-detection methods: the second-derivatives (SD)-based method, the extremity-count (EC)-based method, and the proposed method. The simulation results show that the proposed method produces more accurate line-detection results than the other methods in terms of root mean square error when the line width is relatively large. In addition, experiments with natural images show that the SD- and EC-based methods suffer from bifurcation, fragmentation, and missing pixels. By contrast, for both the original and the noise-contaminated versions of the natural images, the proposed SGAD-based line-detection method is affected by these problems to a considerably smaller extent than the other two methods.
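
The comparison of gradients from the eight neighbors can be sketched as follows. This is a minimal illustration assuming central-difference gradients and an opposing-pair scheme (E/W, N/S, NE/SW, NW/SE); it is not the paper's exact formulation.

```python
import numpy as np

def gradient_angles(img):
    # Central-difference gradients (a stand-in for the paper's gradient operator)
    gy, gx = np.gradient(img.astype(float))
    return np.arctan2(gy, gx)

def sgad(img):
    """Sum of gradient angle differences over the four opposing
    neighbor pairs (E/W, N/S, NE/SW, NW/SE) of each pixel."""
    ang = gradient_angles(img)
    pairs = [((0, 1), (0, -1)), ((1, 0), (-1, 0)),
             ((1, 1), (-1, -1)), ((1, -1), (-1, 1))]
    out = np.zeros_like(ang)
    for (dy1, dx1), (dy2, dx2) in pairs:
        a1 = np.roll(ang, (-dy1, -dx1), axis=(0, 1))  # angle at first neighbor
        a2 = np.roll(ang, (-dy2, -dx2), axis=(0, 1))  # angle at opposing neighbor
        d = np.abs(a1 - a2)
        out += np.minimum(d, 2 * np.pi - d)  # wrap the angle difference
    return out
```

On a synthetic vertical ridge, gradients on the two sides point in opposite directions, so the opposing-pair differences peak on the ridge center and vanish on flat background, which is the intuition behind using SGAD as a line response.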

2019 ◽  
Vol 11 (13) ◽  
pp. 1598 ◽  
Author(s):  
Hua Su ◽  
Xin Yang ◽  
Wenfang Lu ◽  
Xiao-Hai Yan

Retrieving multi-temporal and large-scale thermohaline structure information of the interior of the global ocean based on surface satellite observations is important for understanding the complex and multidimensional dynamic processes within the ocean. This study proposes a new ensemble learning algorithm, extreme gradient boosting (XGBoost), for retrieving subsurface thermohaline anomalies, including the subsurface temperature anomaly (STA) and the subsurface salinity anomaly (SSA), in the upper 2000 m of the global ocean. The model combines surface satellite observations and in situ Argo data for estimation, and uses root-mean-square error (RMSE), normalized root-mean-square error (NRMSE), and R2 as accuracy measures. The results show that the proposed XGBoost model can readily retrieve subsurface thermohaline anomalies and outperforms the gradient boosting decision tree (GBDT) model. The XGBoost model performed well, with average R2 values of 0.69 and 0.54, and average NRMSE values of 0.035 and 0.042, for STA and SSA estimations, respectively. The thermohaline anomaly patterns presented obvious seasonal variation signals in the upper layers (the upper 500 m); however, these signals became weaker as the depth increased. The model performance fluctuated, with the best performance in October (autumn) for both STA and SSA, while the lowest accuracy occurred in January (winter) for STA and April (spring) for SSA. The STA estimation error mainly occurred in the El Niño-Southern Oscillation (ENSO) region in the upper ocean and at the boundaries of the ocean basins in the deeper ocean; meanwhile, the SSA estimation error presented a relatively even distribution. The wind speed anomalies, including the u and v components, contributed more to the XGBoost model for both STA and SSA estimations than the other surface parameters; however, their importance decreased at deeper layers as the contributions of the other parameters increased.
This study provides an effective remote sensing technique for subsurface thermohaline estimations and further promotes long-term remote sensing reconstructions of internal ocean parameters.
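
The three accuracy measures named above can be written compactly. A sketch in NumPy follows; note that normalizing NRMSE by the observed range is an assumption here, since the abstract does not state the convention used.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def nrmse(y_true, y_pred):
    """RMSE normalized by the observed range (one common convention;
    the paper's exact normalization is not given in the abstract)."""
    return rmse(y_true, y_pred) / (y_true.max() - y_true.min())

def r2(y_true, y_pred):
    """Coefficient of determination."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```

These would be evaluated per depth layer and per month to reproduce the kind of seasonal and vertical performance profiles the study reports.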


2020 ◽  
Vol 16 (2) ◽  
pp. 155014772090782
Author(s):  
Shi Qinglan ◽  
Shi Yujiao ◽  
Liu Xiaochen ◽  
Mei Shuli ◽  
Feng Lei

The multilayer soil moisture Internet of things sensor is designed to monitor the moisture of multiple soil profiles in real time. Its sensitivity and accuracy are of great concern for improving sensor performance. This article introduces the system composition of the end-cloud integrated multilayer soil moisture Internet of things sensor and then focuses on the design of key technologies, such as the moisture detection circuit, the time division multiplexing detection technology, and the deredundancy circuit in the analog–digital integrated design. The performance of the soil moisture detection circuit is directly related to the measurement accuracy of the sensor. A detection method is proposed using a high-frequency double-resonance circuit, which can detect small changes in moisture through changes in the circuit detuning voltage. The maximum root mean square error of the calibration is less than 1.35% for five typical soils from different places. Compared with an independent detection method, the output consistency of the time division multiplexing detection method is significantly improved, with a root mean square error of only 0.12%. In order to reduce errors caused by inconsistency in each burial, gravimetric analysis is used in the sensitivity monitoring test, which shows that small changes in soil moisture can be detected by the circuit.


Author(s):  
NA LU ◽  
ZUREN FENG

There is no parametric formulation of a corner, so the conventional Hough transform cannot be employed to detect corners directly. A random corner detection method is developed in this paper based on a new concept, the "accumulative intersection space", under a Monte Carlo scheme. This method transforms corner detection in the image space into local-maxima localization in the accumulative intersection space, where intersections are accumulated by random computations. The proposed algorithm has been demonstrated by both theory and experiments. It is isotropic, robust to image rotation, and insensitive to noise and to false corners on diagonal edges. Unlike other existing contour-based corner detection methods, our algorithm can effectively avoid the influence of edge detectors, such as rounded corners or line interceptions. Extensive comparisons between our approach and other detectors, including the Harris operator, the Fei Shen and Han Wang detector, the Han Wang and Brady detector, the Foveated Visual Search method, and SIFT features, have shown the effectiveness of our method.
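
The idea of accumulating random intersections can be illustrated with straight segments standing in for detected contour pieces: repeatedly pick two segments at random, intersect their supporting lines, and vote in an accumulator whose local maxima mark corners. This is a sketch of the concept under a Monte Carlo scheme, not the authors' implementation.

```python
import random
import numpy as np

def accumulate_intersections(segments, shape, n_iter=2000, seed=0):
    """Monte Carlo accumulation of line-pair intersections.
    Each segment is ((x1, y1), (x2, y2)); corners appear as local
    maxima of the returned accumulator array."""
    rng = random.Random(seed)
    acc = np.zeros(shape, dtype=int)
    for _ in range(n_iter):
        (p1, p2), (p3, p4) = rng.sample(segments, 2)
        # Line through p1,p2 in the form a*x + b*y = c
        a1, b1 = p2[1] - p1[1], p1[0] - p2[0]
        c1 = a1 * p1[0] + b1 * p1[1]
        a2, b2 = p4[1] - p3[1], p3[0] - p4[0]
        c2 = a2 * p3[0] + b2 * p3[1]
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue  # parallel lines: no intersection to vote for
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < shape[0] and 0 <= xi < shape[1]:
            acc[yi, xi] += 1  # vote in the accumulative intersection space
    return acc
```

For an L-shaped pair of segments, every sampled pair intersects at the corner, so the accumulator peaks exactly there.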


2018 ◽  
Vol 7 (2.7) ◽  
pp. 44
Author(s):  
S Bhavika ◽  
B Prema Sindhuri ◽  
G Bhavana

Electronic mail has become a part of our daily lives as a means of exchanging different types of information and messages. It provides a great medium for communicating with a large number of people at once. This has led many marketing groups to see email as a great platform for publicizing their goods and products. Beyond these marketers, many other types of users want to make use of email for their own needs. Over time, this has become a problem for other users because of the continuous stream of undesired electronic messages sent by marketing and other unauthorized users. These messages are termed spam. Spam mail has become a serious issue, and there is a need to clear away all this junk mail. To that end, different spam detection methodologies have been developed and employed to provide an effective mailing service to users. In this paper, we present various existing spam detection methods and seek to identify the most accurate, effective, and reliable among them.
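
The abstract does not name the individual methods it surveys; as an illustration of one widely used approach in this space, here is a minimal multinomial Naive Bayes filter with Laplace smoothing. All names and training snippets below are invented for the example.

```python
import math
from collections import Counter

def train_nb(spam_docs, ham_docs):
    """Count word frequencies per class for a multinomial
    Naive Bayes spam filter."""
    spam_counts = Counter(w for d in spam_docs for w in d.split())
    ham_counts = Counter(w for d in ham_docs for w in d.split())
    vocab = set(spam_counts) | set(ham_counts)
    return spam_counts, ham_counts, vocab, len(spam_docs), len(ham_docs)

def is_spam(model, doc):
    """Classify by comparing smoothed log-likelihoods plus log-priors."""
    spam_counts, ham_counts, vocab, n_spam, n_ham = model
    log_spam = math.log(n_spam / (n_spam + n_ham))
    log_ham = math.log(n_ham / (n_spam + n_ham))
    v = len(vocab)
    t_spam = sum(spam_counts.values())
    t_ham = sum(ham_counts.values())
    for w in doc.split():
        # Laplace (add-one) smoothing avoids zero probabilities
        log_spam += math.log((spam_counts[w] + 1) / (t_spam + v))
        log_ham += math.log((ham_counts[w] + 1) / (t_ham + v))
    return log_spam > log_ham
```

Real filters add stemming, stop-word handling, and much larger corpora, but the scoring structure is the same.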


Author(s):  
Mohamed Ibrahim Waly

Abstract Academic accreditation criteria require a powerful method to evaluate program outcomes (POs). The most recent studies recommend the use of both direct and indirect assessments to evaluate the actual achievement of POs. This study aimed to provide an easily implemented method, based on direct assessment and other integrated variables, that reflects the reality of students' achievement of POs. The suggested method, based on a weighted average equation, was presented and compared with two other methods. The comparative study was designed in two steps. First, the results of each method were compared with the result of the general capacity exam using root mean square error (27 male students from level four, with 6 courses). The second step was based on statistical analysis (paired t-test) of the results from the methods for the same batch of students (from level 3 to level 6, with 22 courses). In the first step, the suggested method yielded the lowest root mean square error relative to the general capacity exam (9%). In the second step there was a significant difference between the mean of the suggested method and those of the other methods (69.8040 ± 6.59, P-value < 0.05). The evaluation procedure for POs is an integral component of the education process, and various variables are integrated to reflect the actual achievement of students. The suggested method reflected the reality of PO achievement more accurately than the other methods, which proved sensitive to the number of course learning outcomes (CLOs).
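
The weighted-average aggregation at the heart of the suggested method can be sketched as follows; the choice of weights (e.g., per-CLO assessment weight) is illustrative, since the abstract does not list the integrated variables.

```python
def po_score(clo_scores, weights):
    """Aggregate course learning outcome (CLO) scores into a single
    program outcome (PO) score by a weighted average."""
    assert len(clo_scores) == len(weights) and sum(weights) > 0
    return sum(s * w for s, w in zip(clo_scores, weights)) / sum(weights)
```

The resulting PO scores would then be compared against a reference (here, the general capacity exam) via root mean square error, as in the study's first step.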


2020 ◽  
Vol 501 (1) ◽  
pp. L18-L22
Author(s):  
M A Thompson

ABSTRACT In the light of the recent announcement of the discovery of the potential biosignature phosphine in the atmosphere of Venus, I present an independent reanalysis of the original James Clerk Maxwell Telescope (JCMT) data to assess the statistical reliability of the detection. Two line detection methods are explored: low-order polynomial fits and higher order multiple polynomial fits. A non-parametric bootstrap analysis reveals that neither line detection method is able to recover a statistically significant detection. Similar to the results of other reanalyses of ALMA (Atacama Large Millimetre Array) Venus spectra, the polynomial fitting process results in false positive detections in the JCMT spectrum. There is thus no significant evidence for phosphine absorption in the JCMT Venus spectra.
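
The flavor of such a bootstrap test can be sketched as follows: fit a polynomial baseline, then ask how often off-line residuals are as deep as the candidate absorption feature. This is an illustrative residual bootstrap under assumed parameters (window width, polynomial order), not the paper's exact procedure.

```python
import numpy as np

def bootstrap_line_significance(freq, spec, line_idx, half_width=3,
                                order=4, n_boot=2000, seed=0):
    """Empirical probability that an off-line residual is at least as
    deep as the residual at the candidate line channel, after removing
    a polynomial baseline fit."""
    rng = np.random.default_rng(seed)
    baseline = np.polyval(np.polyfit(freq, spec, order), freq)
    resid = spec - baseline
    observed = resid[line_idx]
    # Exclude a window around the candidate from the resampling pool
    window = np.arange(line_idx - half_width, line_idx + half_width + 1)
    pool = np.delete(resid, window)
    draws = rng.choice(pool, size=n_boot)  # nonparametric resampling
    return float(np.mean(draws <= observed))
```

A deep injected dip yields a probability near zero, while a pure-noise candidate does not; the key point of such reanalyses is that aggressive polynomial fitting can manufacture dips that pass naive significance checks.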


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Roberto Coscarelli ◽  
Giulio Nils Caroletti ◽  
Magnus Joelsson ◽  
Erik Engström ◽  
Tommaso Caloiero

Abstract In order to correctly detect climate signals and discard possible instrumentation errors, establishing coherent data records has become increasingly relevant. However, since real measurements can be inhomogeneous, their use for assessing homogenization techniques is not directly possible, and the study of their performance must be done on homogeneous datasets subjected to controlled, artificial inhomogeneities. In this paper, considering two European temperature networks over the 1950–2005 period, up to 7 artificial breaks and an average of 107 missing data points per station were introduced, in order to determine whether mean square error, absolute bias, and factor of exceedance can be meaningfully used to identify the best-performing homogenization technique. Three techniques were used: ACMANT and two versions of HOMER, the standard automated setup mode and a manual setup. Results showed that the HOMER techniques performed better regarding the factor of exceedance, while ACMANT was best with regard to absolute error and root mean square error. Regardless of the technique used, it was also established that homogenization quality anti-correlated meaningfully with the number of breaks. On the other hand, as missing data are almost always replaced in the two HOMER techniques, only ACMANT's performance is significantly and negatively affected by the amount of missing data.
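
The construction of the benchmark, a homogeneous series plus controlled artificial inhomogeneities, can be sketched as follows. Break positions and step sizes are illustrative; the study's actual break magnitudes are not given in the abstract.

```python
import numpy as np

def add_breaks(series, n_breaks, max_shift=1.0, seed=0):
    """Introduce step-like artificial inhomogeneities (breaks) into a
    homogeneous series, as done when benchmarking homogenization
    methods: everything after each break point is shifted by a
    random offset."""
    rng = np.random.default_rng(seed)
    out = series.astype(float).copy()
    positions = rng.choice(np.arange(1, series.size),
                           size=n_breaks, replace=False)
    for pos in sorted(positions):
        out[pos:] += rng.uniform(-max_shift, max_shift)
    return out
```

A homogenization method is then scored by how well it recovers the original series from the broken one, using metrics such as RMSE, absolute bias, and factor of exceedance.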


2020 ◽  
Vol 1 ◽  
pp. 6
Author(s):  
Alexandra Albu ◽  
Alina Enescu ◽  
Luigi Malagò

The ability to automatically detect anomalies in brain MRI scans is of great importance in computer-aided diagnosis. Unsupervised anomaly detection methods work primarily by learning the distribution of healthy images and identifying abnormal tissues as outliers. We propose a slice-wise detection method which first trains a pair of autoencoders on two different datasets, one with healthy individuals and the other with images of normal and tumoral tissues. Next, it classifies slices based on the distance in the latent space between the encoding of the image and the encoding of the reconstructed image, obtained through the autoencoder trained on healthy images only. We validate our approach with a series of preliminary experiments on the HCP and BRATS-15 datasets.
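
The slice classification rule can be sketched independently of the network details: compare the latent code of a slice with the latent code of its reconstruction. The toy linear encoder/decoder pairs used in the usage note below stand in for the trained autoencoders.

```python
import numpy as np

def latent_distance_score(encode, decode, image):
    """Anomaly score: distance in latent space between the encoding
    of the image and the encoding of its reconstruction."""
    z = encode(image)
    z_rec = encode(decode(z))
    return float(np.linalg.norm(z - z_rec))

def classify_slice(encode, decode, image, threshold):
    """Flag the slice as abnormal when the score exceeds a threshold
    (in practice the threshold would be tuned on validation data)."""
    return latent_distance_score(encode, decode, image) > threshold
```

With an autoencoder trained on healthy slices only, healthy inputs reconstruct well (small latent distance) while tumoral slices reconstruct poorly, pushing the score above threshold.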


MAUSAM ◽  
2022 ◽  
Vol 53 (2) ◽  
pp. 119-126
Author(s):  
R. K. MALL ◽  
B. R. D. GUPTA

Actual evapotranspiration of the wheat crop was measured daily with a lysimeter in Varanasi, Uttar Pradesh, during the years 1978-79 to 1992-93. In this study three evapotranspiration models, namely Doorenbos and Pruitt, Thornthwaite, and the Soil Plant Atmosphere Water (SPAW) model, have been used. Comparisons of these three methods show that the SPAW model is better than the other two for evapotranspiration estimation. In the present study the MBE (Mean Bias Error), RMSE (Root Mean Square Error), and t-statistic have also been computed for a better evaluation of model performance.
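
The three evaluation statistics can be computed together. The t-statistic below follows the form commonly used alongside MBE and RMSE in model-evaluation studies; treat the exact formula as an assumption, since the abstract does not state it.

```python
import numpy as np

def evaluate(measured, modeled):
    """Return MBE, RMSE, and a t-statistic for model-vs-lysimeter
    comparison. Assumes RMSE**2 > MBE**2 (i.e., errors are not a
    pure constant bias)."""
    d = modeled - measured
    n = d.size
    mbe = float(np.mean(d))                    # mean bias error
    rmse = float(np.sqrt(np.mean(d ** 2)))     # root mean square error
    t = float(np.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2)))
    return mbe, rmse, t
```

A model is judged better when both |MBE| and RMSE are small and the t-statistic falls below the critical value for the chosen significance level.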

