Friction Ridge Analysis: Towards Lights-out Latent Recognition


  • Friction Ridge Analysis: Towards Lights-out Latent Recognition

    Elham Tabassi
    Image Group, NIST

    August 31, 2015
    SAMSI Forensics Opening Workshop

  • Outline

    We, at NIST
    NIST Biometric Evaluations
    1:N Latent Fingerprint Matching
    1:N Metrics
    Current latent matching process
    Current research at NIST
    Closing

  • Image Group History: Who are we?

    An old group that traces its origins to NBS, based on work started by Jack Wegstein in the 1950s, primarily to solve numerical problems on the IBM 704 and SEAC (Standards Electronic Automatic Computer). Some of these early problems included mesh processing in 2D space, with applications in physics, but also contemporary image processing.

    Then something happened in 1966.

    [Photo: Ethel Marden, Mathematician & Computer Programmer, using the NBS Standards Electronic Automatic Computer (SEAC), ~1950]

  • Image Group History: Who are we?

    NBS accepted the challenge.

  • Image Group History

    This research opened several other doors.

  • Image Group History: Expanded Research in Friction Ridge

    As systems matured in the 1970s, the need for interoperability emerged.

  • Image Group History: Expanded to Standards

    Just how big did this standard get?

    Another key landmark in our work occurred in 1986 with the introduction of the ANSI/NIST standard for the exchange of biometric information between systems.

  • Image Group, Our Research, and the World: International Data Standard

    The ANSI/NIST standard has a direct impact on virtually all the biometric data being operated on in the world, including the capture and interchange of at least 2 million images daily in the United States alone.

    Moving to the present, what are our core functions?

  • Image Group: Snapshot of Active Projects

    Standards: ANSI/NIST, PIV, ISO, Contactless, OSAC SCs, OASIS
    Fingerprint: Latent, Matcher Testing, Compression, Segmentation, Quality, MINEX, FpVTE, USG Matcher, PFT, Compression Study, CODEC Certification, SlapSeg, SMT
    Evaluations: Iris Recognition (IREX); Face Recognition, still images (FRVT) and video images (FIVE)
    Pattern Forensics: Friction Ridge Analysis, Face Black Box
    Emerging: Scientific Underpinnings, Challenge Problems

  • Technical Approach :: provide quantitative support

    Identify gaps / outreach (NWIP, AMD)
    Research + (large-scale) evaluation (e.g. IREX I + IQCE, MINEX 04, Liveness)
    Submit comments + technical contributions
    Active participation; advocate for NIST/USG positions
    Test performance and interoperability of the standard (e.g. IREX)
    Serve as editor; host workshops

    Development of clear, robust, tested, and implementable content through extensive study and experiments, aimed at strengthening the science behind the claims and preventing overly prescriptive requirements.

  • NIST BIOMETRIC EVALUATIONS
    http://www.nist.gov/itl/iad/ig/biometric_evaluations.cfm

  • Role of Technology Test

    Evaluation of the core technical capability of biometric matching technologies.

    Why:
    Advance the science of metrology
    Facilitate innovation through competition
    Help US industry: developers often do not have enough data for testing, particularly operational data
    Close the knowledge gap; ditto the standards gap

    Impact:
    Advance the current state of measurement science and technology
    Improve accuracy through failure analysis
    Improve implementations' adherence to standards and protocols
    Procurement-ready requirements

    Presenter Notes: To quantify the state of the art and also do failure analysis. As Ralph pointed out, biometric recognition fails; we want to measure how often and why it fails. All algorithm and no human.

  • Fingerprint Research and Evaluations

    NFIQ (NIST Finger Image Quality): measures the utility of fingerprint images.
    NFIQ 1.0: NIST IR 7151, published 2004. NFIQ 2.0: summer 2015.

    FpVTE 2012: large-scale one-to-many evaluation of fingerprint recognition algorithms.
    NIST IR 8034, published January 2015.

    ELFT (Evaluation of Latent Fingerprint Technologies): accuracy test of latent fingerprint searches using features marked by examiners plus automated feature extraction and matching technologies.
    NIST IR 7775, published March 2011.

    MINEX (Minutia Exchange): evaluation of performance and interoperability of core minutia template encoding and matching capabilities. Ongoing test.
    NIST IR 7296, published March 2006.

  • Forensics: Friction Ridge Analysis

    Do these two impressions come from the same finger?

    Presenter Notes: This is the first of three talks on this newly established project here at NIST. The other two talks will be given by Soweon and Hari on Thursday. I will provide background and an overview of why we are doing what we are doing, which basically is to quantify the weight of evidence and uncertainty in friction ridge forensic determinations.

  • Fingerprint Recognition

    Exemplar-to-Exemplar: one-finger accuracy of FNIR = 0.0198 @ FPIR = 0.001 (Tabassi et al., Performance evaluation of fingerprint open-set identification algorithms, IJCB 2014).

    Latent-to-Exemplar: 63.4% rank-1 accuracy in lights-out mode; 68.2% rank-1 accuracy with full markup features (ELFT-EFS 2012; M. Indovina et al., Evaluation of Latent Fingerprint Technologies: Extended Feature Sets, NISTIR 7859).

    Presenter Notes: Our purpose is not to demonstrate the individuality of a complete and well-reproduced fingerprint, but to assess the evidential contribution of fingermarks that can be partial, distorted, and with a poor signal-to-noise ratio. Uniqueness does not guarantee that prints from two different people are always sufficiently different that they cannot be confused, or that two impressions made by the same finger will always be sufficiently similar to be discerned as coming from the same source. The impression left by a given finger will differ every time, because of inevitable variations in pressure, which change the degree of contact between each part of the ridge structure and the impression medium. None of these variabilities (of features across a population of fingers, or of repeated impressions left by the same finger) has been characterized, quantified, or compared.

  • Latent Fingerprints

    Smudged latents, complex backgrounds, overlapped latents.

    Presenter Notes: Latents recovered from crime scenes are often limited in size, of poor quality, distorted, and affected by interference from the substrate.

  • 1:N Fingerprint Identification

    Enrollment: N templates in an enrollment database.
    Search: a latent image yields features, which form a search template; searching it against the database returns a candidate list.

    Candidate List (example):
    1. Alice       0.02
    2. Bob         0.34
    3. Christophe  0.38
    4. David       0.39
    5. Ernie       0.45

    Error rates: FNIR (aka miss rate) and FPIR (aka false alarm rate).
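The 1:N search above can be sketched in a few lines. This is a toy illustration, not NIST's matcher: the `similarity` function (set overlap between hypothetical "minutiae" IDs) and all names (`identify`, `database`) are assumptions made for the example; a real system would compare fingerprint templates.

```python
# Minimal sketch of a 1:N identification search. The similarity
# function is a toy stand-in for a real fingerprint matcher.

def similarity(search_template, enrolled_template):
    # Jaccard overlap between two sets of hypothetical minutiae IDs.
    if not search_template or not enrolled_template:
        return 0.0
    inter = len(search_template & enrolled_template)
    union = len(search_template | enrolled_template)
    return inter / union

def identify(search_template, database, L=5):
    # Score the search template against all N enrolled templates,
    # then return the top-L candidates sorted best-first.
    scores = [(name, similarity(search_template, tmpl))
              for name, tmpl in database.items()]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:L]

# Toy enrollment database: identity -> set of "minutiae" IDs.
database = {
    "Alice": {1, 2, 3, 4},
    "Bob":   {3, 4, 5, 6},
    "Carol": {7, 8, 9},
}
candidates = identify({3, 4, 5}, database, L=3)
print(candidates)  # Bob scores highest (0.75)
```

The candidate list is what the analyst, or the rank/threshold rules on the next slide, then operates on.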

  • Candidate Lists, Rank, Thresholds

    Given L candidates, the analyst can inspect:
    all L candidates;
    only candidates down to rank R < L;
    only candidates with score >= T;
    or some combination of R and T.

    Example (L = 8, R = 5, T = 2.0):

    Rank  Score
    1     3.142
    2     2.998
    3     1.626
    4     0.707
    5     0.330
    6     0.198
    7     0.074
    8     0.016
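The inspection rules above can be sketched directly. A minimal sketch, assuming a candidate list already sorted best-first; the function name and the interpretation of the threshold as "score >= T" are my assumptions, with the scores taken from the slide's example.

```python
# Given L candidates sorted by decreasing score, keep only those
# at rank <= R AND with score >= T (either rule may be disabled).

def candidates_to_inspect(scores, R=None, T=None):
    # scores: similarity scores in decreasing order.
    # Returns (rank, score) pairs surviving the rank/threshold rules.
    kept = []
    for rank, score in enumerate(scores, start=1):
        if R is not None and rank > R:
            break
        if T is not None and score < T:
            break  # scores are sorted, so all later ones fail too
        kept.append((rank, score))
    return kept

scores = [3.142, 2.998, 1.626, 0.707, 0.330, 0.198, 0.074, 0.016]  # L = 8
print(candidates_to_inspect(scores, R=5))         # rank rule only: 5 candidates
print(candidates_to_inspect(scores, T=2.0))       # threshold only: 2 candidates
print(candidates_to_inspect(scores, R=5, T=2.0))  # combined: 2 candidates
```

With R = 5 and T = 2.0 the threshold dominates here, since only the top two scores clear 2.0.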

  • 1:N Two Universes

    Closed-set identification:
    The search is known, a priori, to have a mate.
    Operationally infrequent, e.g. 1:N on a cruise ship, or a transport disaster.
    Very common metric in academic tests; unfortunately it has an explicit dependence on N (i.e. the number of students!).
    Performance metric: rank-1 recognition rate, or more generally the Cumulative Match Characteristic.

    Open-set identification:
    Any given search may have a mate (e.g. in criminal justice, a recidivist; in visa issuance, a shopper) or may not have a mate (e.g. in criminal justice, a first-time offender; in visa issuance, honest applicants).
    Applies to almost all applications.
    Is rarely mentioned in the academic algorithm-development literature.
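The closed-set metrics above can be computed from one number per mated search: the rank at which its true mate appeared. A minimal sketch with toy ranks (not from any real evaluation); the rank-1 recognition rate is the first point of the Cumulative Match Characteristic curve.

```python
# Cumulative Match Characteristic: CMC(r) is the fraction of closed-set
# searches whose true mate appears at rank <= r.

def cmc(mate_ranks, max_rank):
    # mate_ranks: for each mated search, the rank of its true mate.
    n = len(mate_ranks)
    return [sum(1 for rank in mate_ranks if rank <= r) / n
            for r in range(1, max_rank + 1)]

# Toy data: ranks of the true mate in 10 closed-set searches.
mate_ranks = [1, 1, 2, 1, 3, 1, 5, 1, 1, 2]
curve = cmc(mate_ranks, max_rank=5)
print(curve[0])  # rank-1 recognition rate: 0.6
print(curve[4])  # rank-5 recognition rate: 1.0
```

Note the curve is non-decreasing in r, and its shape depends on N, which is exactly the dependence the slide warns about.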

  • 1:N ACCURACY METRICS

  • Recognition Error Rates

    False Positive Identification Rate (FPIR), or Type I error rate: the false alarm rate of reporting that an individual is the source of an impression when in fact she is not. Cf. Blackstone's maxim in criminal law that it is better to let ten guilty people go free than to falsely convict one innocent person.

    False Negative Identification Rate (FNIR), or Type II error rate: the miss rate of reporting that an individual is not the source of an impression when in fact she is. In airport screening for terrorists, failing to identify a terrorist who boards an airplane may be of greater concern than false positives.

    Presenter Notes: False alarm = 1 - specificity. Hit rate = sensitivity. Diagnosticity. Uncertainty in ROC.

  • Metrics :: Miss Rates

    False Negative Identification Rate (FNIR), aka miss rate. Its complement is the hit rate, properly known as the true positive identification rate, which is 1 - FNIR.

    Measured by executing mated searches against an enrolled database of N identities:

    FNIR(N, R, T, L) =
      (number of mates outside the top R ranks, or below threshold T, on a candidate list of length L)
      / (number of mated searches conducted)
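The FNIR definition above translates directly into code. A minimal sketch with toy data: each mated search is represented as its known mate's identity plus the candidate list the search returned; the function name and data layout are my assumptions.

```python
# FNIR: a mated search counts as a miss when its true mate is absent
# from the candidate list, outside the top R ranks, or below threshold T.

def fnir(searches, R, T):
    # searches: list of (mate_id, candidate_list) pairs, one per mated
    # search; each candidate_list holds (identity, score) best-first,
    # length at most L.
    misses = 0
    for mate_id, candidates in searches:
        hit = any(identity == mate_id and score >= T
                  for rank, (identity, score) in enumerate(candidates, start=1)
                  if rank <= R)
        if not hit:
            misses += 1
    return misses / len(searches)

searches = [
    ("Alice", [("Alice", 3.1), ("Bob", 0.7)]),   # hit at rank 1
    ("Bob",   [("Carol", 2.2), ("Bob", 1.4)]),   # mate below T = 2.0: miss
    ("Dana",  [("Erin", 2.9), ("Frank", 2.5)]),  # mate not on list: miss
]
print(fnir(searches, R=5, T=2.0))  # 2 misses / 3 searches
```

Tightening either R or T can only increase FNIR, which is the miss-rate side of the trade-off against FPIR.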

  • Miss Rates :: FNIR Definition