Development of an optical joint transform correlation system for fingerprint recognition

1999 ◽  
Vol 38 (7) ◽  
pp. 1205 ◽  
Author(s):  
Yuji Kobayashi

1996 ◽  
Vol 3 (6) ◽  
pp. A403-A405 ◽  
Author(s):  
Yuji Kobayashi ◽  
Haruyoshi Toyoda ◽  
Naohisa Mukohzaka ◽  
Narihiro Yoshida ◽  
Tsutomu Hara

1976 ◽  
Vol 15 (01) ◽  
pp. 21-28 ◽  
Author(s):  
Carmen A. Scudiero ◽  
Ruth L. Wong

A free text data collection system has been developed at the University of Illinois utilizing single-word, syntax-free dictionary lookup to process data for retrieval. The source document for the system is the Surgical Pathology Request and Report form. To date, 12,653 documents have been entered into the system.

The free text data were used to create an IRS (Information Retrieval System) database. A program to interrogate this database has been developed to produce numerically coded operative procedures; a total of 16,519 procedure records were generated. Of these procedures, 1.9% could not be fitted into any procedure category and 6.1% could not be specifically coded, while 92% were coded into specific categories. A system of PL/1 programs has been developed to facilitate manual editing of these records, which can be completed in a reasonable length of time (about one week). This manual check shows that the 92% coded into specific categories were coded with precision = 0.931 and recall = 0.924; correction of the readily correctable errors would improve these figures to precision = 0.977 and recall = 0.987. Syntax errors were relatively unimportant in the overall coding process, but they did introduce significant error in some categories, such as when a right-left-bilateral distinction was attempted.

The coded file that has been constructed will serve as an input file to a gynecological disease/Pap smear correlation system. The outputs of this system will include retrospective information on the natural history of selected diseases and a patient log giving the clinician information on patient follow-up.

Thus, a free text data collection system can be used to produce numerically coded files of reasonable accuracy, and these files can serve as a source of useful information both for the clinician and for the medical researcher.
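Precision and recall, as used in this abstract, are the standard retrieval metrics computed from a manual audit of the coded records. The Python sketch below is purely illustrative; the counts are hypothetical values chosen only to reproduce figures close to those reported, not data from the paper.

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Precision: fraction of assigned procedure codes that are correct.
    Recall: fraction of codes that should have been assigned and were."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Hypothetical audit counts for the specifically coded records (not from the paper).
p, r = precision_recall(true_positives=930, false_positives=69, false_negatives=76)
print(f"precision = {p:.3f}, recall = {r:.3f}")   # -> precision = 0.931, recall = 0.924
```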


1997 ◽  
Vol 51 (6-7) ◽  
pp. 25-27
Author(s):  
V. N. Frankov ◽  
G. Y. Osokin ◽  
O. V. Gavrentiuk ◽  
A. I. Samokhvalov

2020 ◽  
Author(s):  
Ganesh Awasthi ◽  
Dr. Hanumant Fadewar ◽  
Almas Siddiqui ◽  
Bharatratna P. Gaikwad

Author(s):  
Mariya Nazarkevych ◽  
Serhii Dmytruk ◽  
Volodymyr Hrytsyk ◽  
Olha Vozna ◽  
Anzhela Kuza ◽  
...  

Background: Internet of Things systems are actively adopting biometric subsystems. For fast, high-quality recognition in sensor-based biometric control and management systems, skeletonization methods are applied at the fingerprint-recognition stage. The known skeletonization methods of Zhang-Suen and Hilditch have been analyzed alongside Ateb-Gabor filtering combined with the wave skeletonization method, and the comparison shows good processing time and recognition quality.

Methods: The Zhang-Suen and Hilditch methods and a thinning algorithm based on Ateb-Gabor filtration, all of which form skeletons of biometric fingerprint images, are considered. The proposed thinning algorithm based on Ateb-Gabor filtration showed better efficiency because it rests on a filter that combines the classic Gabor function with the harmonic Ateb function; this combination makes it possible to form the neighborhood in which the skeleton is built more accurately.

Results: Alongside the known methods, a new Ateb-Gabor filtering algorithm with the wave skeletonization method has been developed. Its recognition results are of higher quality, raising recognition quality by 3 to 10%.

Conclusion: The Zhang-Suen algorithm is a two-pass algorithm: each iteration performs two sets of checks during which pixels are removed from the image. It operates on black pixels that have eight neighbors, so pixels along the edges of the image are not analyzed. The Hilditch thinning algorithm runs in several passes, in each of which every pixel is checked and changed from black to white when certain conditions are satisfied. Ateb-Gabor filtering provides better performance because it yields more finely hollowed shapes and covers a larger range of curves. Numerous experimental studies confirm the effectiveness of the proposed method.
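As a concrete illustration of the two-sub-pass procedure described in the Conclusion, the sketch below implements the textbook Zhang-Suen thinning iteration in Python/NumPy. It is a minimal reference implementation, not the authors' Ateb-Gabor or wave skeletonization code, and the image convention (1 = ridge pixel, 0 = background) is an assumption.

```python
import numpy as np

def zhang_suen_thinning(image):
    """Thin a binary image (1 = ridge, 0 = background) to a one-pixel-wide
    skeleton. Border pixels are skipped because they lack eight neighbors,
    matching the abstract's remark that edge pixels are not analyzed."""
    img = image.astype(np.uint8).copy()
    rows, cols = img.shape
    changed = True
    while changed:
        changed = False
        # Each iteration runs two sub-passes with different deletion tests
        # (the "two sets of checks" of the two-pass algorithm).
        for step in (0, 1):
            to_delete = []
            for r in range(1, rows - 1):
                for c in range(1, cols - 1):
                    if img[r, c] != 1:
                        continue
                    # Neighbors P2..P9, clockwise starting from the north.
                    p = [img[r-1, c], img[r-1, c+1], img[r, c+1], img[r+1, c+1],
                         img[r+1, c], img[r+1, c-1], img[r, c-1], img[r-1, c-1]]
                    b = int(sum(p))                               # non-zero neighbors
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1     # 0 -> 1 transitions
                            for i in range(8))
                    p2, p4, p6, p8 = p[0], p[2], p[4], p[6]
                    if step == 0:
                        cond = p2 * p4 * p6 == 0 and p4 * p6 * p8 == 0
                    else:
                        cond = p2 * p4 * p8 == 0 and p2 * p6 * p8 == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_delete.append((r, c))
            for r, c in to_delete:
                img[r, c] = 0
            changed = changed or bool(to_delete)
    return img
```

Deletions are collected during each sub-pass and applied afterwards, so every test in a sub-pass sees the same input image; this is what makes the two sets of checks behave as parallel passes rather than a sequential erosion.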

