Improving the Efficiency of Automated Latent Fingerprint Identification Using Stack of Convolutional Auto-encoder


Published in: International Conference on Parallel, Distributed and Grid Computing (PDGC ...), pp. 191–196
Main authors: Chhabra, Megha; Shukla, Manoj Kumar; Ravulakollu, Kiran Kumar
Format: Conference paper
Language: English
Published: IEEE, 06.11.2020
ISSN:2573-3079
Description
Summary: In this paper, a method for improving the efficiency of a latent fingerprint segmentation and detection system is presented. Structural detection and precise segmentation of fingerprints otherwise not visible to the naked eye (called latents) provide the basis for automatic identification of latent fingerprints. The method is based on the assumption that including detection of the relevant structure of interest from a latent fingerprint image in an effective segmentation pipeline improves the effectiveness of the model and the efficiency of the automated segmentation. The approach discards detections of poor quality caused by noise, inadequate data, misplaced structures of interest from multiple instances of fingermarks in the image, etc. A collaborative detector-segmentation approach is proposed which establishes the reproducibility and repeatability of the model, consequently increasing the efficiency of the framework. The results are obtained on the IIIT-DCLF database. Performing saliency-based detection using color-based visual distortion reduces the subsequent information-processing cost through a stack of convolutional autoencoders. The results obtained show a significant improvement over previously published results.
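The stacking of convolutional autoencoder layers mentioned in the abstract can be sketched minimally: each layer learns a convolutional encoding of the previous layer's output, and layers are chained greedily. The NumPy sketch below is a hypothetical illustration only (the `ConvAutoencoderLayer` name, 3x3 kernels, and random initialization are assumptions, not the authors' implementation); training is omitted and only the forward encode/decode shape behavior is shown.

```python
import numpy as np

def conv2d(x, k):
    """'Valid' 2-D cross-correlation of a single-channel image with kernel k."""
    kh, kw = k.shape
    h, w = x.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    return np.maximum(x, 0.0)

class ConvAutoencoderLayer:
    """One encoder/decoder pair; a stack chains the encoders of several layers."""

    def __init__(self, ksize=3, seed=0):
        rng = np.random.default_rng(seed)
        self.ksize = ksize
        self.enc_k = rng.normal(scale=0.1, size=(ksize, ksize))  # encoder kernel
        self.dec_k = rng.normal(scale=0.1, size=(ksize, ksize))  # decoder kernel

    def encode(self, x):
        # Feature map shrinks by (ksize - 1) in each dimension.
        return relu(conv2d(x, self.enc_k))

    def decode(self, code):
        # Pad the code so the reconstruction matches the input size again.
        p = self.ksize - 1
        return conv2d(np.pad(code, p), self.dec_k)

def stack_encode(layers, x):
    """Greedy stack: each layer encodes the previous layer's code."""
    for layer in layers:
        x = layer.encode(x)
    return x
```

In a greedy training scheme each layer would be fit to reconstruct its own input before the next layer is stacked on its codes; here only the shape bookkeeping is demonstrated, e.g. a 16x16 patch encodes to 14x14 per 3x3 layer and decodes back to 16x16.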
DOI:10.1109/PDGC50313.2020.9315746