A Fingerprint Verification System Based on Triangular Matching and Dynamic Time Warping

Zsolt Miklós Kovács-Vajna, Senior Member, IEEE

Abstract: An effective fingerprint verification system is presented. It assumes that an existing reference fingerprint image must validate the identity of a person by means of a test fingerprint image acquired online and in real time using minutiae matching. The matching system consists of two main blocks: the first allows for the extraction of essential information from the reference image offline, the second performs the matching itself online. The information is obtained from the reference image by filtering and careful minutiae extraction procedures. The fingerprint identification is based on triangular matching to cope with the strong deformation of fingerprint images due to static friction or finger rolling. The matching is finally validated by Dynamic Time Warping. Results reported on the NIST Special Database 4 reference set, featuring 85 percent correct verification (15 percent false negative) and 0.05 percent false positive, demonstrate the effectiveness of the verification technique.

Index Terms: Fingerprint, fingerprint verification, dynamic time warping, triangular matching, NIST sdb 4.

The author is with the Dipartimento di Elettronica per l'Automazione (DEA), University of Brescia, Via Branze 38, 25123 Brescia, Italy. E-mail: zkovacs@ing.unibs.it. Manuscript received 8 July 1999; revised 15 June 2000; accepted 22 Aug. 2000. Recommended for acceptance by I. Dinstein. For information on obtaining reprints of this article, please send e-mail to tpami@computer.org, and reference IEEECS Log Number 110203.

1 INTRODUCTION

Fingerprints are produced by graphical flow-like ridges and valleys on human fingers. Due to their uniqueness and immutability, fingerprints are today the most widely used biometric features. Using current technology, fingerprint identification is in fact much more reliable than other possible personal identification methods based on signature, face, or speech alone [1]. Systems based on mixed biometric features, for example, fingerprint combined with face recognition, naturally improve the reliability at a higher complexity and cost [2]. Although fingerprint verification is usually associated with criminal identification and police work [3], it has become more popular in civilian applications, such as access control, financial security, and verification of firearm purchasers. In other words, fingerprints are a sort of identity card that people carry with them continuously.

Since manual fingerprint verification is extremely tedious and time-consuming, automatic fingerprint identification systems are in great demand. The main points addressed in automatic systems are fingerprint acquisition, verification, identification, and classification. The identification problem is a particular case of "point pattern matching" [4], where no border correlation can be used to improve the matching efficiency, since the detected minutiae belong intrinsically to a bidimensional domain without defined edges.

Fingerprint acquisition is used to obtain the fingerprint image in some useful format. There are papers describing optical [5] or capacitive [6] sensor-based methodologies. The most common technique is the inked impression widely used by the police. Optical sensors are common in automatic systems, but they are easy to fool if used alone, and, due to the prism and optics, their size cannot be reduced enough for portable applications. Capacitive sensors are based on the difference between the dielectric constant of the air and that of the human skin. These sensors require touching a sort of capacitor array and, since they do not need optics, they can be made small enough to be included in portable systems, such as cellular phones or watches.

Fingerprint identity verification requires some knowledge of the person to be identified. The system is asked to accept or reject the hypothesis of a match between a stored fingerprint image and the test image. Consequently, the system must match the person's characteristics directly against those stored as reference information. Identity recognition requires further effort. The system must match the characteristics of the person to be recognized against the whole set of characteristics stored in the database, thus deciding whether one of the items of reference information is sufficiently similar to the one considered. Due to the huge number of images to be searched, particular techniques are used to create geometric keys for hashing tables [7], based on rotation-invariant properties of triangles having minutiae as vertices. The system presented in this paper verifies the identity, rather than recognizing the person, and is based on minutiae matching. Since different countries require a different number of minutiae as court evidence, the choice of minutiae matching instead of a more general structure matching allows for easy customization according to the local law when required.

The paper is organized as follows: Section 2 outlines the overall system architecture and details its components in the corresponding subsections; results follow on the NIST Special Database 4 [8], a standard reference fingerprint database containing severely deformed and difficult images. Some conclusions end the paper.

2 THE SYSTEM

Fig. 1 shows the block diagram of the fingerprint matching system. The system compares two fingerprint images: a reference image and a test image. It is assumed that a reference image is available for the person to be identified and that it may be processed offline. The test image is obtained and used in real time. It has to be compared to the reference image within a short time compatible with the application.

The operation sequence applied to the reference image is shown in the "Information Extraction Block," and the effective fingerprint matching, carried out on each test image, is shown in the "Matching Block." The reference image is carefully filtered to reduce noise and to use the whole gray-scale dynamics. Since this operation is performed only once for each person to be identified, the filtering speed is not a primary requirement. Minutiae detection follows. Thanks to the high-quality image, the false minutiae at this point are less than 10 percent of the extracted minutiae [9]. The minutiae coordinates and the list of neighboring minutiae are extracted for each detected minutia. A coarse and fast filter is applied to the original reference image, and the region of 16 × 16 pixels around each minutia point is saved in several rotated versions to cope with a possible significant global rotation.

The test image is acquired by the same device as the reference image. It is filtered by the same coarse filter applied to the reference image, leading to comparable images. The filtered test image is scanned by a moving window technique [10], searching for possible correspondences between the reference minutiae regions and the test image. If some correspondences are found, a triangular matching is applied and a possible fingerprint matching is defined. Finally, this tentative matching has to be verified by Dynamic Time Warping [11] to overcome the strong local deformations. This latter verification step corresponds to the ridge count required by a legal fingerprint verification [3].

Fig. 1. Flow-chart of the identification system.

2.1 Information Extraction Block

The procedures of this block are applied to the reference image only. The main result of this processing is the real minutiae set of the fingerprint, minimizing the false minutiae and maximizing the true ones.

2.1.1 Fine and Coarse Filtering

Two different filters are applied to the reference image: a fine one and a coarse one. The test image is only coarsely filtered, due to the time constraints.

The fine filter should reduce the noise as much as possible to facilitate the subsequent minutiae extraction operations. The specific filter used in this work is the one employed in the PCASYS package [12] built by researchers at NIST. The filter is based on the bidimensional Fourier transform and some nonlinear operators in the frequency domain in order to reduce low- and high-frequency noise. This specific filter was successfully applied both to paper-scanned images [8] and to those acquired by a capacitive sensor [6]. In Fig. 2, a paper-scanned image is shown on the left and the corresponding fine-filtered image is shown in the center.

The image quality is improved by the coarse filtering procedure, which expands the gray-scale dynamics to the whole pixel depth. There are several reasons why a fingerprint image may not cover the whole gray-scale dynamics: insufficient contrast or illumination, uneven ink distribution on the paper, shallow valleys, dirty fingertips, humidity, etc. When different gray-scale images are compared, some with strong contrast and some without, it is very useful to equalize the signal amplitude, which in this case is the gray value. The coarse filter copes neither with noise in the frequency domain nor with ridge or valley interruptions. Since this filter is applied to both the reference and the test images, it has to be fast and therefore simple. The coarse filtering procedure consists of the following steps:

1. A four-neighbor average filter is applied to each pixel in the image to blur the image, reducing sharp local variations.
2. The image is split into m × n regions, creating approximately 16 × 16 pixel squares.
3. The average, maximum, and minimum pixel values are computed for each region.
4. A four-neighbor average filter is used to equalize the average, maximum, and minimum pixel values over the whole image.
5. The new pixel value is computed for each point in the image according to the formula:

   p_n = hmx + hmx * (p - avg) / (max - avg)   if p > avg
   p_n = hmx - hmx * (avg - p) / (avg - min)   if p <= avg

where p_n is the new pixel value, p is the old pixel value, avg, max, and min are the average, maximum, and minimum pixel values in the region, and hmx is half of the maximum possible pixel value (hmx = 128 for 8-bit pixels).
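For illustration, the per-pixel stretching of step 5 can be written as the following C sketch; it assumes 8-bit pixels and that the smoothed avg, max, and min values produced by steps 1-4 are supplied by the caller. The function name and the clamping of overshoots are not part of the original description.

#include <stdint.h>

/* Step 5 of the coarse filter: remap one 8-bit pixel so that the regional
   average moves to hmx, the regional maximum toward 2*hmx, and the regional
   minimum toward 0.  avg, max, and min are the smoothed regional statistics
   computed in steps 1-4. */
static uint8_t stretch_pixel(uint8_t p, double avg, double max, double min)
{
    const double hmx = 128.0;   /* half of the maximum possible 8-bit value */
    double den = (p > avg) ? (max - avg) : (avg - min);
    double pn;

    if (den < 1.0)
        den = 1.0;              /* smoothed extrema may collapse onto avg */
    pn = hmx + hmx * ((double)p - avg) / den;
    if (pn < 0.0)
        pn = 0.0;               /* clamp interpolation overshoot */
    if (pn > 255.0)
        pn = 255.0;
    return (uint8_t)pn;
}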


Fig. 2. Original paper-scanned image on the left, the corresponding finely-filtered image in the center, and the coarsely-filtered image on the right.

The rightmost image of Fig. 2 shows the coarsely-filtered image. In the filtered image, better contrast is evident, but the regions defined above are also clearly visible. However, these blocks and the background noise have no adverse effect, because the reference image has already been finely filtered.

2.1.2 Minutiae Extraction

The minutiae considered in the identification system are ridge bifurcations and terminations. The aim of this procedure is to extract the real 40-60 minutiae of a fingerprint image from the 500-800 contained in typical well-filtered, skeletonized, and binarized images of NIST sdb 4. Besides classical methodologies for removing close minutiae, new approaches are used for deleting spurious minutiae by ridge repair, short ridge elimination, and island filtering. In particular, a novel algorithm based on ridge positions is employed for bridge filtering, instead of classical methods based on directional maps. Finally, two novel criteria validate the endpoints and bifurcations. The details of this methodology are in [9].

2.1.3 Neighborhood Extraction

In order to obtain correct and efficient identification, a list of neighbors is extracted for each minutia. The minutiae are ordered according to increasing distance from the image center, and each list is ordered according to increasing distance from the corresponding minutia. The image center is assumed to be the mass-center of the detected minutiae. The minutiae list provides information on the minutiae positions and on the ridge/valley behavior between minutiae. For this reason, the line connecting each minutia and one of its neighbors is traced and the gray values along it are extracted, as sketched below. In Fig. 3, a coarsely-filtered fingerprint image is reported on the left, showing the traced lines starting from a minutia in the central area. On the right, the gray-scale behavior of two lines is reported: the upper curve shows the shortest line, while the lower curve reports the gray values along the longest line.
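A possible C sketch of this line tracing is the following; it samples the coarse-filtered image by linear interpolation between the two endpoints, one sample per pixel of line length. The row-major 8-bit image layout and all names are assumptions, since the paper does not detail the tracing routine.

#include <math.h>
#include <stdint.h>

/* Sample the gray values along the straight line joining minutia (x0, y0)
   to one of its neighbors (x1, y1).  Returns the number of samples written
   into profile[] (at most max_len). */
static int trace_profile(const uint8_t *img, int width, int height,
                         int x0, int y0, int x1, int y1,
                         uint8_t *profile, int max_len)
{
    double dx = x1 - x0, dy = y1 - y0;
    int n = (int)ceil(sqrt(dx * dx + dy * dy)) + 1;   /* one sample per pixel */
    int count = 0;

    if (n > max_len)
        n = max_len;
    for (int i = 0; i < n; i++) {
        double t = (n > 1) ? (double)i / (n - 1) : 0.0;
        int x = (int)lround(x0 + t * dx);
        int y = (int)lround(y0 + t * dy);
        if (x >= 0 && y >= 0 && x < width && y < height)
            profile[count++] = img[y * width + x];    /* row-major 8-bit image */
    }
    return count;
}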
The number of lines to be traced for each minutia was defined experimentally to be about 20, and they are extracted so as to make the line distribution as uniform as possible over the 360 degrees, depending on the detected minutiae.

Fig. 3. Minutiae-connecting lines traced on the fingerprint image on the left and gray-value behavior of the shortest (upper) and longest (lower) lines in the graphs on the right.

2.1.4 Minutiae Region Rotation

Identification of minutiae in the test image is based on a comparison between regions surrounding the minutiae in the reference image and each point in the test image. A 16 × 16 square area is extracted around a minutia in the reference image and is matched against an area in the test image having the same shape and size. Three effects have to be taken into account, even if the test image represents the same fingerprint as the reference image: it may have different contrast, it may be distorted, and it may be rotated. The problem of different contrast is solved by the coarse-filtering procedure already defined. Image distortion is taken into account by defining a suitable threshold while computing the similarity of regions using the L1 vector norm. The relative rotation of the images has to be investigated more closely in order to evaluate its effect on the similarity computation.

One hundred minutiae regions were extracted randomly from different images and matched against several rotated versions of the same images, computing the difference between corresponding image pairs according to the method detailed in Section 2.2.1. The graph of Fig. 4


Fig. 5. (a) Reference image with minutiae to search for, (b) test image candidate regions, (c) zoom on the reference minutiae, (d) similarity map drawn on the test image, and (e) zoom on the test image with the corresponding regions.

The darker the region, the better the correspondence (meaning a smaller N_d value). In this case, four candidate regions are present. Fig. 5e shows the four candidate regions superimposed on the test image and Fig. 5b shows the overall test image. The real correspondence is between the reference minutia and region 3 of the test image. Region 2 is also a bifurcation, but by looking at the minutia area only, it is impossible to distinguish it from the reference minutia. Regions 1 and 4 are similar only according to the metrics defining the correspondence, but contain no real feature point. Since no minutiae extraction is performed on the test image and only similarity computations are done with regions of the reference image, the selection of the effective matching is postponed to the subsequent identification phase. There is no point in raising the threshold so as to accept only more similar regions, because on noisy images with shallow ridges or noise of some kind it would be impossible to define the real mapping.
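Although the detailed correspondence search belongs to Section 2.2.1, the quantity N_d used above can be read as the L1 distance (Section 2.1.4) between a 16 × 16 reference region and the test-image window under the moving window. The following C sketch reflects that reading; the threshold, the candidate arrays, and all names are illustrative, and the pre-rotated versions of the reference region are omitted.

#include <stdint.h>
#include <stdlib.h>

#define REG 16   /* side of the square minutia region, in pixels */

/* L1 distance N_d between a reference region and the test-image window
   whose top-left corner is (tx, ty). */
static long region_l1(const uint8_t ref[REG][REG],
                      const uint8_t *test, int width, int tx, int ty)
{
    long d = 0;
    for (int r = 0; r < REG; r++)
        for (int c = 0; c < REG; c++)
            d += labs((long)ref[r][c] - (long)test[(ty + r) * width + tx + c]);
    return d;
}

/* Slide the window over the whole test image and keep every position whose
   distance falls below the acceptance threshold as a candidate region. */
static int find_candidates(const uint8_t ref[REG][REG],
                           const uint8_t *test, int width, int height,
                           long threshold, int *cand_x, int *cand_y, int max_cand)
{
    int n = 0;
    for (int ty = 0; ty + REG <= height; ty++)
        for (int tx = 0; tx + REG <= width && n < max_cand; tx++)
            if (region_l1(ref, test, width, tx, ty) < threshold) {
                cand_x[n] = tx;
                cand_y[n] = ty;
                n++;
            }
    return n;
}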
2.2.2 Triangular Matching

In order to define an effective matching technique, a deformation model must be identified. When looking at different images containing the same fingerprint, a strong global deformation of the images can be observed. This distortion is due to static friction on a sensor surface or on paper, or to rolling of the finger during ink-based registration techniques. However, small portions of the images do not appear to exhibit manifest distortion. The idea of how to take into account small local variations, which may lead to huge global distortion, is discussed as follows.

A pair of triangles forming a square is reported in Fig. 6, in particular in Fig. 6a. By shortening or extending some of the edges by less than 10 percent of their original length, the two images shown in Fig. 6b or Fig. 6c may be obtained. This small distortion leaves the original form easily recognizable and may be comparable with the local distortion of a fingerprint image.

Fig. 6. (a) Two triangles forming a square, (b) and (c) distortion of the triangles, where each edge is shortened or extended by less than 10 percent of its original length.

In particular, the area around a minutia may be slightly distorted due to static friction caused by the fingertip touching a hard surface. By replicating the square of Fig. 6 several times, it is possible to create a big image. A complete fingerprint image may be considered to consist of several small pieces, some of them representing minutiae. Fig. 7 on the left shows a cross formed from the squares of Fig. 6. On the right, the same cross is shown where the edges of the squares were shortened or extended by less than 10 percent of their original length.


Fig. 7. Original image on the left, distorted image on the right with local deformation of less than 10 percent, but with a consequent global deformation reaching 45 percent.

Such a small local distortion produces, for example, a 6 percent shortening of edge AB and a 45 percent extension of edge GH. This simple (and exaggerated) example illustrates the idea of the model, which describes the distortion of fingerprint images as a small local distortion amplified on a global basis.

The following algorithm (written in pseudo C code) is used to extract a possible minutiae set on which the matching is established. The result is a number of matching minutiae pairs on the reference and test images which, according to the required number of pairs, forms the basis of the identification. By generating a map of already considered minutiae, it is possible to significantly reduce the number of tentative matching maps and to keep the computation time of this phase very short.

The following list contains the functions and procedures used in the algorithm:

- GetNextMinutiae() gets the next, not yet considered, reference minutia from the ordered set defined in Section 2.1.3.
- GetNextTestMinutiae(ref) gets the next, not yet considered, test minutia corresponding to the ref minutia from the set ordered by decreasing similarity between test and reference regions according to Section 2.2.1.
- LineLength(ref1, ref2, test1, test2) computes the length difference between the line connecting two minutiae in the reference image and the line connecting two minutiae in the test image. This procedure may include the verification phase shown in the next section to improve performance.
- ComputeAngle(ref1, ref2, test1, test2) computes the angle between the two lines connecting the minutiae pair ref1, ref2 and the pair test1, test2, respectively.
- AddMatchedMinutiae(ref1, test1, ref2, test2) adds the pairs of matched minutiae in the reference image and in the test image to the list of already matched minutiae.
- GetNextMinutiaePair(ref1, test1, ref2, test2) gets the next minutiae pair belonging to an already matched line in both the reference and test images.
- GetNextCloseMinutiae(ref1, ref2) gets the next, not yet considered, reference minutia from the ordered minutiae list minimizing the function max(L2(ref1, ref3), L2(ref2, ref3)), where max() is the larger of two real numbers and L2() is the Euclidean norm on the bidimensional Cartesian plane.
- ComputeTestCoordinates(X, Y, ref3, ref1, test1, ref2, test2) computes the (X, Y) coordinates corresponding to ref3 using the linear roto-translation defined by the mapping ref1 -> test1, ref2 -> test2, according to the formulae (a C transcription is given after this list):

      X = α x3 + β y3 + Δx
      Y = -β x3 + α y3 + Δy

  where

      α  = [(X2 - X1)(x2 - x1) + (Y2 - Y1)(y2 - y1)] / [(x2 - x1)^2 + (y2 - y1)^2]
      β  = [(X2 - X1)(y2 - y1) - (Y2 - Y1)(x2 - x1)] / [(x2 - x1)^2 + (y2 - y1)^2]
      Δx = X1 - α x1 - β y1
      Δy = Y1 - α y1 + β x1

  with the uppercase variables (X1, Y1) and (X2, Y2) referring to test1 and test2 in the test coordinate space and the lowercase variables (x1, y1), (x2, y2), and (x3, y3) referring to ref1, ref2, and ref3 in the reference coordinate space.
- OrderTestMinutiae(ref, x, y) orders the test minutiae corresponding to the reference minutia ref according to increasing distance from the point of coordinates (x, y) in the test image.
- NumberOfMatchedMinutiae() computes the number of already matched minutiae.
- MaxDistance is the maximum allowable deformation between segments (for example, 10 percent or 6 pixels).
- MaxAngle controls the maximum allowed rotation (for example, 20 degrees).
- MinimumMatch is the minimum number of minutiae required to match (for example, 14) for a valid identification.
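A direct C transcription of the ComputeTestCoordinates formulae, with the minutiae passed as plain coordinate pairs (the data types and the calling convention are illustrative; ref1 and ref2 are assumed to be distinct minutiae, so the denominator cannot vanish):

/* Map the reference point (x3, y3) into the test image using the linear
   roto-translation defined by (x1, y1) -> (X1, Y1) and (x2, y2) -> (X2, Y2). */
static void compute_test_coordinates(double x1, double y1, double X1, double Y1,
                                     double x2, double y2, double X2, double Y2,
                                     double x3, double y3,
                                     double *X, double *Y)
{
    double den   = (x2 - x1) * (x2 - x1) + (y2 - y1) * (y2 - y1);
    double alpha = ((X2 - X1) * (x2 - x1) + (Y2 - Y1) * (y2 - y1)) / den;
    double beta  = ((X2 - X1) * (y2 - y1) - (Y2 - Y1) * (x2 - x1)) / den;
    double dx    = X1 - alpha * x1 - beta * y1;   /* Δx */
    double dy    = Y1 - alpha * y1 + beta * x1;   /* Δy */

    *X =  alpha * x3 + beta * y3 + dx;
    *Y = -beta  * x3 + alpha * y3 + dy;
}

By construction, the mapping sends ref1 exactly onto test1 and ref2 exactly onto test2; OrderTestMinutiae then ranks the test candidates for ref3 by their distance from the returned point.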


Fig. 8. Matching triangle-based graphs: reference image on the left, test image on the right.

/* Triangular Matching Algorithm */
do
  /* sets two minutiae on the reference image */
  ref1 = GetNextMinutiae();
  ref2 = GetNextMinutiae();
  /* sets the corresponding two minutiae on the test image */
  do
    test1 = GetNextTestMinutiae(ref1);
    do
      test2 = GetNextTestMinutiae(ref2);
      if (test2 != null)
        dist  = LineLength(ref1, ref2, test1, test2);
        angle = ComputeAngle(ref1, ref2, test1, test2);
      endif
      /* the correspondence is valid if the deformation is
         small and the rotation is allowed */
    until (dist < MaxDistance AND angle < MaxAngle) OR test2 == null
  until test2 != null OR test1 == null
  if (test1 == null OR test2 == null)
    goto end_NO_MATCH;   /* no matching can be defined */
  endif
  /* adds the minutiae pair to the already matched list */
  AddMatchedMinutiae(ref1, test1, ref2, test2);
  /* sets a third minutia for each pair of reference and test matching
     minutiae pairs according to the roto-translation defined by the
     minutiae pairs */
  do
    validpair = GetNextMinutiaePair(&ref1, &test1, &ref2, &test2);
    if (validpair)
      do
        ref3 = GetNextCloseMinutiae(ref1, ref2);
        if (ref3 != null)
          /* defines the area on the test image corresponding to the
             third minutia */
          ComputeTestCoordinates(&x, &y, ref3, ref1, test1, ref2, test2);
          OrderTestMinutiae(ref3, x, y);
        endif
        do
          test3 = GetNextTestMinutiae(ref3);
          if (test3 != null)
            dist1  = LineLength(ref1, ref3, test1, test3);
            dist2  = LineLength(ref2, ref3, test2, test3);
            angle1 = ComputeAngle(ref1, ref3, test1, test3);
            angle2 = ComputeAngle(ref2, ref3, test2, test3);
          endif
          /* the correspondence is valid if the deformation is
             small and the rotation is allowed */
        until (dist1 < MaxDistance AND dist2 < MaxDistance AND
               angle1 < MaxAngle AND angle2 < MaxAngle) OR test3 == null
      until test3 != null OR ref3 == null
      if (ref3 != null AND test3 != null)
        AddMatchedMinutiae(ref1, test1, ref3, test3);
        AddMatchedMinutiae(ref2, test2, ref3, test3);
      endif
    endif
  until validpair == false
until NumberOfMatchedMinutiae() > MinimumMatch
/* continue with the validation procedure of the next section */
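For concreteness, the two geometric predicates used by the algorithm can be realized as below, with minutiae given as pixel coordinates. The exact definitions (absolute length difference, angle folded into [0, 180] degrees) are assumptions; the paper only requires that the returned values stay below MaxDistance and MaxAngle.

#include <math.h>

struct point { double x, y; };

/* LineLength: difference between the length of segment (r1, r2) in the
   reference image and that of segment (t1, t2) in the test image. */
static double line_length_diff(struct point r1, struct point r2,
                               struct point t1, struct point t2)
{
    double lr = hypot(r2.x - r1.x, r2.y - r1.y);
    double lt = hypot(t2.x - t1.x, t2.y - t1.y);
    return fabs(lr - lt);
}

/* ComputeAngle: angle in degrees between the reference segment and the
   corresponding test segment. */
static double segment_angle(struct point r1, struct point r2,
                            struct point t1, struct point t2)
{
    const double rad2deg = 180.0 / 3.14159265358979323846;
    double ar = atan2(r2.y - r1.y, r2.x - r1.x);
    double at = atan2(t2.y - t1.y, t2.x - t1.x);
    double d  = fabs(ar - at) * rad2deg;
    return (d > 180.0) ? 360.0 - d : d;   /* fold into [0, 180] */
}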


The algorithm just defined creates a connected graph on each of the two images, where each graph is composed of triangles with shared edges. Fig. 8 shows the reference image on the left and the test image on the right at an intermediate stage of the triangular matching. The two graphs are shown in white, with the already matched minutiae at their vertices. Three matching minutiae and the connecting lines are also visible at the top of the image as dashed black lines. The upper two black squares indicate the currently extracted minutiae pair on the left image and the corresponding pair on the right one. The third pair of black squares on the two images shows the coordinates of the third minutia; when this new minutia is accepted, it enlarges the set of already matched minutiae. This procedure allows for matching on a local basis, where distortion is small, and can therefore also identify two fingerprints with considerable global distortion.

Distortion can easily be measured using the two graphs, one on the reference image and the other on the test image, thanks to the already computed correspondence between the single vertices and edges of the triangles that make up the graphs. A possible measure of the distortion is the average relative distortion of the corresponding edges of the triangles. For example, the images in Fig. 8 have a relative average distortion of 2.57 percent and a maximum distortion of 5.96 percent. It is also possible to define a relative translation and rotation of the images. When the images are distorted, translation and rotation are not uniquely defined. Translation may be defined as the movement of the mass-center of the two graphs relative to the image borders, and rotation as the relative rotation of the graphs around the mass-center. For example, the images in Fig. 8 are translated by 1.87 mm (37 pixels) and rotated by 5.44 degrees anticlockwise relative to each other.
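This distortion measure reduces to a few lines of C once the matched edges are available as pairs of lengths, one measured on the reference graph and one on the test graph (the array-based interface is an assumption):

#include <math.h>

/* Average and maximum relative distortion, in percent, over the matched
   edges of the two graphs.  ref_len[i] and test_len[i] are the lengths of
   the ith corresponding edge in the reference and test images. */
static void edge_distortion(const double *ref_len, const double *test_len,
                            int n_edges, double *avg_pct, double *max_pct)
{
    double sum = 0.0, worst = 0.0;

    for (int i = 0; i < n_edges; i++) {
        double d = 100.0 * fabs(test_len[i] - ref_len[i]) / ref_len[i];
        sum += d;
        if (d > worst)
            worst = d;
    }
    *avg_pct = (n_edges > 0) ? sum / n_edges : 0.0;
    *max_pct = worst;
}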
2.2.3 Matching Validation

The science of fingerprint identification is based both on the matching of a certain number of minutiae and on the ridge count between minutiae pairs [3]. The previous matching phases are useful for matching minutiae between the reference and test images, but a validation procedure must be performed before the effective matching can be confirmed.

In Fig. 3, some gray-scale profiles are shown for lines connecting minutiae pairs. The validation phase involves matching curves like those of Fig. 3, extracted from corresponding regions of the reference and test images. Since the minutiae matching phase does not consider global information in the images, and is based only on the local similarity of regions, gray-scale line matching yields global information and the final result of the whole identification procedure. When the images are relatively distorted, the coordinates of the minutiae are not defined exactly and the lines may not refer precisely to the same points in the corresponding images. The result is that a physically matching line pair does not overlap, because of different image contrast, deformation, and different starting or ending coordinates. The coarse-filtering technique described earlier solves the contrast problem, but all the remaining problems must still be faced.

Fig. 9. Gray-scale line matching. (a) Corresponding reference and test lines, (b) deformation map, and (c) final overlapping of the two lines.

Fig. 9a shows a pair of matching lines extracted from a pair of images obtained from the same finger. The continuous line comes from the reference image, the dotted one from the test image. The two lines do not overlap for the reasons described above, and an integral norm or some other distance computation technique alone fails to recognize the match. Indeed, the same "distance" may also be observed in mismatching line pairs.

At this point, the main problem is the matching of nonuniformly phase-displaced curves. The same problem is common in speech recognition, where the curves are defined in the time domain and are not uniformly delayed. This problem is overcome by Dynamic Time Warping [11], which can also be applied to fingerprint validation. Considering as time the spatial coordinate along the line connecting the two minutiae, the nonlinear displacement can be identified very easily. Fig. 9b shows the deformation map obtained by applying the Dynamic Time Warping technique to the curve pair of Fig. 9a, with the same constraints on local deformation used during the triangular matching. The common constraints make the triangular matching and the validation phases consistent. The black curve in the allowed gray area of Fig. 9b identifies the nonlinear expansion which maximizes the overlap of the two curves. Fig. 9c shows the two curves, each distorted according to the Dynamic Time Warping result in order to minimize the difference. Application of Dynamic Time Warping to each possibly matching line pair allows for validation of the lines, i.e., validation of the minutiae they connect.
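The warping itself is the classical dynamic-programming recursion, shown below for two 8-bit profiles. What is specific to this system, and omitted here, is the restriction of the warping path to the same local-deformation band used during triangular matching, together with an acceptance threshold on the resulting cost; the length normalization and the names are illustrative.

#include <math.h>
#include <stdint.h>
#include <stdlib.h>

/* Dynamic Time Warping cost between two gray-scale profiles a[0..na-1] and
   b[0..nb-1]; a small value means the two lines can be brought to overlap
   by a nonuniform displacement along their length. */
static double dtw_cost(const uint8_t *a, int na, const uint8_t *b, int nb)
{
    double *prev = malloc((size_t)(nb + 1) * sizeof *prev);
    double *curr = malloc((size_t)(nb + 1) * sizeof *curr);
    double result;

    if (prev == NULL || curr == NULL) {
        free(prev);
        free(curr);
        return HUGE_VAL;
    }
    for (int j = 0; j <= nb; j++)
        prev[j] = HUGE_VAL;                           /* row 0 of the cost matrix */
    prev[0] = 0.0;

    for (int i = 1; i <= na; i++) {
        curr[0] = HUGE_VAL;
        for (int j = 1; j <= nb; j++) {
            double cost = fabs((double)a[i - 1] - (double)b[j - 1]);
            double best = prev[j];                        /* insertion */
            if (prev[j - 1] < best) best = prev[j - 1];   /* match     */
            if (curr[j - 1] < best) best = curr[j - 1];   /* deletion  */
            curr[j] = cost + best;
        }
        double *tmp = prev; prev = curr; curr = tmp;  /* advance one row */
    }
    result = prev[nb] / (double)(na + nb);            /* normalize by path length */
    free(prev);
    free(curr);
    return result;
}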
It is important to note that the minutiae matching and line validation methodology presented here is statistically very robust: since a system based on this technique must identify a certain number of validated minutiae in an image pair, for instance, 12 or 16 minutiae, even if some of the minutiae are not correctly matched, the remaining minutiae are still able to validate the identification. This is extremely important because minutiae-like configurations may appear in dirty areas of low-quality images. These minutiae may be matched and even validated, but they are statistically insignificant.

3 RESULTS

Results reported for any identification system should be based on a standard reference set so that they can be compared with those reported in the literature. Unfortunately, researchers working in this field report their measurements on proprietary data, even though they recognize the importance of results based on standard reference data [1]. There are several databases created by NIST containing fingerprint images, which usually also contain very low quality images, such as NIST sdb 4, 9, etc.

The fingerprint identification system was validated using the reference database "NIST Special Database 4" [8]. This database consists of 2,000 fingerprint image pairs (4,000 images in total). The images were obtained by scanning inked fingerprints from paper at 500 dpi. The images are 8 bit/pixel and of fixed 512 × 512 size. The database contains good quality images and also image pairs which are severely deformed due to rolling, inking problems, dirt, or shallow ridges. About 120 image pairs out of the 2,000 (6 percent of the total) could not be matched by a human expert, a retired policeman, without image enhancement techniques, even though in these cases classification into the five main classes was still possible. The fingerprints are mainly centered in the images, but they generally do not cover the whole image and background is nearly always present. Thanks to the minutiae extraction method employed in the system [9], image segmentation is not needed.

Fig. 10. Example of correctly identified fingerprints.

Fig. 10 shows a correctly matched fingerprint pair. In this case, 15 minutiae were matched. The relative image translation is 0.76 mm and the average rotation is 16.9 degrees. The average image distortion is 6.72 percent, while the maximum is 15.9 percent. These parameters were calculated according to the description of Section 2.2.2.

Fig. 11. Example of correctly identified fingerprints.

Fig. 11 shows a correctly matched fingerprint pair with 14 matched minutiae. The relative image translation is 1.00 mm and the average rotation is 15.8 degrees.
The average image distortion is 4.62 percent, while the maximum distortion is 16.0 percent.

The identification, false negative, and false positive rates were computed using the reference database. The identification rate is the number of correctly identified image pairs divided by the total number of pairs (2,000). The false negative rate is the complement to one of the identification rate, i.e., the number of corresponding image pairs which have not been matched with sufficient accuracy divided by the total number of image pairs. The false positive rate is the number of noncorresponding image pairs which were matched by the system, as a percentage of the total number of checked pairs. For the false positive computation, 500,000 image pairs were selected randomly from the 4,000 fingerprints of the database.

The following results were obtained by accepting a maximum relative translation of 7 mm, a rotation of 25 degrees, and a maximum average deformation of 15 percent. Checking the identified image pairs, the maximum relative translation was 6.75 mm, corresponding to 133 pixels, the maximum absolute rotation was 22 degrees, and the maximum average distortion was 15 percent, with local distortions above 50 percent.

The unmatched image pairs were checked manually. The 80 percent identification rate leads to 20 percent false negatives. On checking the 20 percent unmatched image pairs carefully, 30 percent were not recognized even by the human expert, while 70 percent were correctly identified. Of this 70 percent, 43 percent had a distortion above the permitted 15 percent threshold, 13 percent were rotated more than 25 degrees, and in the remaining 44 percent, the system failed because it did not find a sufficient number of matching minutiae. This portion can be reduced by accepting a nonzero false positive rate: the 85 percent identification rate leads to 15 percent false negatives and to a false positive rate of 0.05 percent. On checking the 15 percent unidentified image pairs, 43 percent were again not recognized by the human expert, while 57 percent were correctly identified. Of this 57 percent, 56 percent had a distortion above the permitted 15 percent, 42 percent were rotated more than 25 degrees, and in the remaining 2 percent, the system failed because it did not find a sufficient number of matching minutiae.

The importance of the final verification phase by Dynamic Time Warping is evident when measuring the false positive and false negative rates with this block disabled. In this case, the false negative rate remains unchanged, since it is mainly due to false minutiae correspondence detection. However, the false positive rate increases from 0.05 percent to 10 percent at 15 percent false negative (see Table 1).

TABLE 1
Fingerprint Verification Results

The system is currently implemented in software and its performance depends on the image size and on the maximum allowed relative rotation.
A closer look to theaverage timing of image pairs of 512 512 pixels <strong>on</strong> aP6@233MHz shows that the n<strong>on</strong>linear filter <str<strong>on</strong>g>based</str<strong>on</strong>g> <strong>on</strong> thebidimensi<strong>on</strong>al Fourier transform takes about 3 sec<strong>on</strong>ds,while the coarse filtering needs just 0.2 sec<strong>on</strong>ds. Theminutiae are detected in about 1.2 sec<strong>on</strong>ds. The most timec<strong>on</strong>suming part of the <str<strong>on</strong>g>system</str<strong>on</strong>g> is the corresp<strong>on</strong>dence mapdetecti<strong>on</strong> defined in Secti<strong>on</strong> 2.2.1, which needs 7 sec<strong>on</strong>dsfor an allowed 20 degree rotati<strong>on</strong>. A further sec<strong>on</strong>d isneeded for the <strong>triangular</strong> <strong>matching</strong> <strong>and</strong> final <str<strong>on</strong>g>verificati<strong>on</strong></str<strong>on</strong>g>.The corresp<strong>on</strong>dence map detecti<strong>on</strong> can be easily implementedin hardware using an already known analog-flashtechnology applied for k-nearest neighbors in opticalcharacter recogniti<strong>on</strong> [13], lowering the overall raw<str<strong>on</strong>g>fingerprint</str<strong>on</strong>g> <strong>matching</strong> time to less than 2 sec<strong>on</strong>ds alsofor 512 512 pixel images [14]. Using a capacitive sensor[6] of 200 200 pixels with a 5 degree rotati<strong>on</strong> limit,the average overall identificati<strong>on</strong> time is below 1 sec<strong>on</strong>d.There are several reas<strong>on</strong>s for this speed up: the imagesize, its background, <strong>and</strong> the fewer minutiae in a smallerfingertip regi<strong>on</strong>.4 CONCLUSIONSA <str<strong>on</strong>g>fingerprint</str<strong>on</strong>g> <str<strong>on</strong>g>verificati<strong>on</strong></str<strong>on</strong>g> <str<strong>on</strong>g>system</str<strong>on</strong>g> has been designed <strong>and</strong>implemented <str<strong>on</strong>g>based</str<strong>on</strong>g> <strong>on</strong> two main blocks: the informati<strong>on</strong>extracti<strong>on</strong> block <strong>and</strong> the <strong>matching</strong> block. The first block isused to extract the informati<strong>on</strong> from reference imagesoffline. The sec<strong>on</strong>d block is used in the <strong>matching</strong> phase<strong>on</strong>line. The minutiae corresp<strong>on</strong>dences are found using a<strong>triangular</strong> <strong>matching</strong> algorithm <strong>and</strong> the final <str<strong>on</strong>g>verificati<strong>on</strong></str<strong>on</strong>g>uses Dynamic Time Warping. Triangular <strong>matching</strong> is fast<strong>and</strong> overcomes the relative n<strong>on</strong>linear deformati<strong>on</strong> presentin the <str<strong>on</strong>g>fingerprint</str<strong>on</strong>g> image pairs. In fact, <strong>triangular</strong> <strong>matching</strong>saves local regularities <strong>and</strong> compensates for global distorti<strong>on</strong>.The final <str<strong>on</strong>g>verificati<strong>on</strong></str<strong>on</strong>g> <str<strong>on</strong>g>based</str<strong>on</strong>g> <strong>on</strong> Dynamic Time Warpingallows a very low false positive rate to be obtained. 
The fingerprint matching rate and accuracy are very high, even on a difficult image database such as NIST sdb 4. Some statistics on this reference database were also reported.

ACKNOWLEDGMENTS

The author wishes to express his appreciation to Professor R. Guerrieri and Dr. A. Ferrari for their help and encouragement on this work. The author would like to thank NIST for providing the PCASYS package. This work was supported by ST-Microelectronics, which also owns the patent [14], and it was done at the Dipartimento di Elettronica, Informatica e Sistemistica (DEIS), University of Bologna, Viale Risorgimento 2, 40136 Bologna, Italy.

REFERENCES

[1] A. Jain, L. Hong, and R. Bolle, "On-Line Fingerprint Verification," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 4, Apr. 1997.
[2] L. Hong and A. Jain, "Integrating Faces and Fingerprints for Personal Identification," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 20, no. 12, Dec. 1998.
[3] The Science of Fingerprints: Classification and Uses, United States Dept. of Justice, Federal Bureau of Investigation, Washington, DC, rev. 12-84, 1988.
[4] S. Umeyama, "Parameterized Point Pattern Matching and Its Application to Recognition of Object Families," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 15, no. 2, Feb. 1993.
[5] L. Coetzee and E.C. Botha, "Fingerprint Recognition in Low Quality Images," Pattern Recognition, vol. 26, no. 10, pp. 1441-1460, Oct. 1993.
[6] M. Tartagni and R. Guerrieri, "A 390dpi Live Fingerprint Imager Based on Feedback Capacitive Sensing Scheme," Proc. 1997 IEEE Int'l Solid-State Circuits Conf., pp. 200-201, 1997.
[7] R.S. Germain, "Large Scale Systems," Biometrics: Personal Identification in Networked Society, A. Jain, R. Bolle, and S. Pankanti, eds., pp. 311-325, 1998.
[8] NIST Special Database 4, 8-Bit Gray Scale Images of Fingerprint Image Groups (FIGS), Image Recognition Group, Advanced Systems Division, Computer Systems Laboratory (CSL), Nat'l Inst. of Standards and Technology, 1992.
