
ATR Theory and ATR Performance Analysis and Prediction

ATR Theory and ATR Performance Analysis and Prediction - ESSRL
Joseph A. O'Sullivan
Electronic Systems and Signals Research Laboratory
Department of Electrical Engineering
jao@ee.wustl.edu

Michael D. DeVore and Natalia A. Schmid
Washington University in St. Louis
School of Engineering and Applied Science

Supported by: ARO Center for Imaging Science DAAH 04-95-1-0494, ONR MURI N00014-98-1-0606, Boeing Foundation


ATR Theory and Performance, O'Sullivan, SPIE SAR 2001

Invitation from Fred Garber: "… give 'invited paper' addressing the subject matter of the day. The subjects of Thursday's session are: ATR Performance Evaluation, Theoretical Approach to ATR, and ATR Performance Prediction."

My vision of ATR theory and ATR performance analysis.


ATR Theory and Performance

[System diagram: a SAR platform images a target (a = T72); the data r feed a classifier and orientation estimator, backed by a model database, which output â = T72 and θ̂ = 45°.]

Outline
• ATR Systems of Interest
• Training and Testing Paradigm
• Some System Design Issues
• Information Theory and ATR
• System Implementation Issues
• Conclusions


ATR Systems of Interest

• Imaging Sensor
• Problem Definition
• Algorithm for
  – Classification
  – Parameter estimation
• System Resource Constraints
  – Database size
  – Processor speed
  – Communication speeds
  – Architecture

Parameters:
• Pose
• Velocity
• "Features"


ATR System Design: Training Paradigm

Training data and scene-and-sensor physics drive parameter extraction; processing produces a score function used for inference.

[Figure: raw HRR data (amplitude vs. azimuth in degrees) and a SAR image feed the score function, yielding â = T72.]


ATR System Design: Training Paradigm

• Likelihood functions parameterized by functions
• Training
  – Function estimation
• Inference
  – Hypothesis testing
  – Parameter estimation

Processing yields the likelihood L(r | a, θ).

[Figure: raw HRR data and a SAR image feed the log-likelihood function, yielding â = T72.]


ATR Theory and Performance (outline, revisited)

• ATR Systems of Interest
• Training and Testing Paradigm
• Some System Design Issues
• Information Theory and ATR
• System Implementation Issues
• Conclusions


Model-Free versus Model-Based Approaches

• Model-Based Approaches
  – Conditional likelihoods for data r derived from understanding physics: p(r | a, θ)
• Model-Free Approaches
  – Processing architecture fixed; no model for data assumed
  – Examples:
    » Neural networks
    » MSE on log-magnitudes
    » MSE on quarter power
    » Most feature-based classifiers
• Intermediate Approaches
  – Use models when known
  – Use constrained architectures for the rest: extract a feature f and model p(f | a, θ)


Performance Analysis and Prediction

• Clear problem statement
  – Hypothesis testing
  – Estimation problem
• Known distributions
  – Information bounds
    » Chernoff bounds, rate functions
    » Fisher information, CRLB
  – Laplace approximations
  – Monte Carlo techniques
• Unknown distributions
  – Minimax bounds
• Partially known distributions

Achievable performance vs. information-theoretic bounds.


Issues in Function Estimation

• Statistical tradeoffs:
  – Approximation error vs. estimation error
  – Bias vs. variance
  – Overtraining
• Learning theory basis
• Current information-theoretic view:
  – Complexity regularization
  – MDL basis (Moulin, Yu, Barron, Rissanen): minimize −LLR(f) + Complexity(f)

[Figure: log integrated squared error vs. complexity (log dimension); ISE = approximation error + estimation error, with the individual errors exponential in dimension.]
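The complexity-regularization criterion on this slide, minimize −LLR(f) + Complexity(f), can be sketched with a toy MDL-style model-order selection. The polynomial family and the (k/2) log n penalty below are illustrative assumptions, not the slide's construction.

```python
import numpy as np

def mdl_select(x, y, max_degree):
    """Pick the polynomial degree minimizing an MDL-style score:
    (n/2) * log(RSS/n) + (k/2) * log(n), i.e. a negative log-likelihood
    term plus a complexity penalty. A sketch of complexity
    regularization, not the paper's exact criterion."""
    n = len(x)
    best_d, best_score = 0, np.inf
    for d in range(max_degree + 1):
        coef = np.polyfit(x, y, d)
        rss = np.sum((np.polyval(coef, x) - y) ** 2)
        # Small floor keeps the log finite for (near-)exact fits.
        score = 0.5 * n * np.log(rss / n + 1e-12) + 0.5 * (d + 1) * np.log(n)
        if score < best_score:
            best_d, best_score = d, score
    return best_d
```

Low-degree models underfit (large approximation error); high-degree models gain almost no likelihood but pay the growing penalty (estimation error), so the minimizer sits at the tradeoff point.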


Regularization for Function Estimation

• Tikhonov regularization
• Grenander's sieves
• Prior likelihoods
• Constraint sets
• Penalty functionals
• Complexity regularization

[Figure: nested function classes F_1 ⊂ F_2 ⊂ … ⊂ F, with estimates f_1, f_2, … converging toward f ∈ F.]


Robust Conditionally Gaussian Model

Model each pixel as complex Gaussian plus uncorrelated noise:

r_i ~ CN(0, K_i(θ, a) + N_0)

p(r | θ, a) = ∏_i [1 / (π (K_i(θ, a) + N_0))] exp( −|r_i|² / (K_i(θ, a) + N_0) )

GLRT classification and MAP estimation:

â(r) = argmax_a max_{θ_k} p̂(r | θ_k, a),   θ̂(r, a) = argmax_{θ_k} p̂(r | θ_k, a)

J. A. O'Sullivan and S. Jacobs, IEEE-AES 2000.
J. A. O'Sullivan, M. D. DeVore, V. Kedia, and M. Miller, IEEE-AES, to appear 2000.


Conditionally Gaussian Model

Model each pixel i as independent, zero-mean, complex conditionally Gaussian:

p(r_i | θ, a, c²) = [1 / (π c² σ_i²(θ, a))] exp( −|r_i|² / (c² σ_i²(θ, a)) )

where σ_i²(θ, a) is the variance function over pose and class, and c² is a constant over all pixels that accounts for power fluctuation.

Recognition by maximizing the log-likelihood ratio*:

(â, θ̂, ĉ²) = argmax_{a, θ, c²} Σ_i I_i(a, θ) ln [ p(r_i | θ, a, c²) / p(r_i | σ²) ]

where σ² is the average clutter variance and I_i is a mask function.

*Schmid & O'Sullivan, "Thresholding Method for Reduction of Dimensionality".
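A minimal numerical sketch of this likelihood rule, assuming a small dictionary of per-class variance templates and fixing the power-normalization constant c² = 1 (the slide maximizes over c² as well); the function names and array shapes are illustrative.

```python
import numpy as np

def log_likelihood(r, var, c2=1.0):
    """Zero-mean complex Gaussian log-likelihood of pixel vector r with
    per-pixel variances c2 * var (the conditionally Gaussian pixel model)."""
    v = c2 * var
    return float(np.sum(-np.log(np.pi * v) - np.abs(r) ** 2 / v))

def glrt_classify(r, variance_db):
    """variance_db: class label -> array (n_poses, n_pixels) of variance
    templates sigma_i^2(theta_k, a). Jointly maximize the likelihood over
    class and pose (GLRT-style rule); return (class, pose index)."""
    best_a, best_k, best_ll = None, None, -np.inf
    for a, templates in variance_db.items():
        for k, var in enumerate(templates):
            ll = log_likelihood(r, var)
            if ll > best_ll:
                best_a, best_k, best_ll = a, k, ll
    return best_a, best_k
```

Because the model is zero-mean, all class and pose information enters through the variance templates; classification reduces to comparing per-pixel energies against the candidate variance profiles.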


Normalized Conditionally Gaussian: Results

[Results figure.]


Performance-Complexity Legend

Forty combinations of number of piecewise-constant intervals and training window width.


ATR Performance and Complexity

Comparison in terms of:
• Performance achievable at a given complexity
• Complexity required to achieve a given performance

Information theory basis: rate-distortion theory → rate-recognition theory.


Image Segmentation / Target Extraction: Information-Theoretic Approach

• Hypothesis test: pixels on target vs. on clutter
• Pixelwise measure of information for discrimination. For the conditionally Gaussian model,

D(p_i || p_0) = σ_i²/σ² − 1 − ln(σ_i²/σ²)

• Segmentation complexity
  – Likelihoods on snakes (contours)
  – Complexity of region

[Figure: pixels ranked by discrimination information; top 5, top 50, top 100, and top 300 shown.]


System Design Issues: Dynamically Reconfigurable Algorithms

• Information theory contributions
  – Successively refinable models
    » Effros; Cover and Equitz; Rimoldi
    » J. Shapiro; Said and Pearlman
    » R. DeVore, A. Cohen, I. Daubechies, D. Donoho, …
  – Successively refinable recognition
    » Rate-distortion → rate-recognition
    » Log-time, log-space


Successively-Refinable Sensor Models

Divide azimuth into N_d non-overlapping intervals of width d:

σ̃²_{d,i}(k, a) = (1/d) ∫_{2πk/N_d − d/2}^{2πk/N_d + d/2} σ_i²(θ, a) dθ

Consider decreasing interval widths d_1 = 2π, d_2 = π, …, d_m = 2π/2^(m−1).

Search over θ_k at level i ordered by the most likely pose at level i−1.
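The averaging over widening azimuth intervals can be sketched as follows, with the integral replaced by a mean over discretely sampled poses; the function names are illustrative, while the power-of-two level structure follows the slide's d_m = 2π/2^(m−1).

```python
import numpy as np

def coarsen(var_fine, n_intervals):
    """Average fine-pose variance templates over n_intervals equal azimuth
    bins: a discrete version of the sigma-tilde_{d,i}(k, a) averaging.
    var_fine: (n_poses, n_pixels); returns (n_intervals, n_pixels)."""
    n_poses = var_fine.shape[0]
    assert n_poses % n_intervals == 0
    return var_fine.reshape(n_intervals, n_poses // n_intervals, -1).mean(axis=1)

def refinable_models(var_fine, n_levels):
    """Level m uses N_d = 2**(m-1) intervals, i.e. interval widths
    2*pi, pi, pi/2, ... as the search is successively refined."""
    return [coarsen(var_fine, 2 ** (m - 1)) for m in range(1, n_levels + 1)]
```

Coarse levels are cheap to search exhaustively; each finer level only needs to be evaluated near the poses ranked highly at the previous level.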


Four-Class Example

Successively-refinable sensor models yield successively-refinable decisions:
• Eventually, the search covers all possibilities
• Breadth-first search quickly finds good combinations of (θ, a)
• Method for modeling target reflectivity statistics from sample images
• Target models used to estimate conditional sensor output statistics

[Figure: classification error as a function of the number of bits passed between the database and the processor.]


System Design Issues

• ATR performance
• Refinable computations
• Parallelizable
• System resource constraints

Result quality vs. complexity.


ATR as a Parallelizable Operation

• Maximizing p(r | θ, a) is equivalent to maximizing the log-likelihood l(r | θ, a) = k + ln p(r | θ, a):

l(r | θ, a) = −Σ_i [ ln σ_i²(θ, a) + |r_i|² / σ_i²(θ, a) ]

• Each measured value r_i undergoes operations of the same form for all pixels, orientations, and target classes.


ATR as a Parallelizable Operation

[Diagram: inputs r_1, r_2, …, r_m are processed by parallel ATR units; each unit computes max_θ l(r | θ, a_t) for its candidate classes over orientations θ = 0°, 5°, …, 355°, and a final maximization over the per-class scores l(r | θ̂_t, a_t) selects â.]


ATR Illustration

For classification/estimation components we relate:
• Quality: probability of erroneous classification
• Throughput: target images processed per second
• Resources: processors, memory and I/O bandwidth, etc.


Example

T_2 = T_1 with prefetch; 16 KB/SAR image (4-byte floats); 1 GHz clock; M = 10 targets; varying target model complexity (L templates/target and N pixels/template).

[Figures: throughput for a 1 Gb/s and a 10 Gb/s interconnection network.]


Conclusions

• ATR performance bounds require a clear problem statement
  – Information rate functions for detection
  – Fisher information for estimation
  – Approximation error vs. estimation error
• Model-based approaches: known distributions
• Successive refinement
• Implementation considerations


Factor Interrelationships

• ATR systems are explicitly or implicitly based on models of targets with some complexity C
• More complex target models require more computation but can yield better results: Pr(error) is a function of C and the SAR data characteristics
• Target model complexity and computational power determine overall system throughput T_CHIP, a function of C and the computational resources
• Given an architecture, both result quality, Pr(error), and throughput, R = 1/T_CHIP, are parameterized by target model complexity


Quality of Results and Complexity

Model complexity: resolution in the approximation of σ²(θ, a).

Coarse model of a T62 tank: 1 template with 16K floats.
Fine model of a T72 tank (1/5 relative scale): 72 templates totaling 1.1M floats.


Motivation

Combine sensor output prediction with training data: ATR from CAD models, template-based ATR, and model extraction.

Outline
1. Conditionally Gaussian Model for SAR Imagery
2. Likelihood-Based Approach to Recognition
3. Target Model Estimation & Segmentation
4. Successively-Refinable Sensor Models
5. Example


Target Model Estimation

Given N registered training images q_j of a target with poses θ_j, estimate variances over N_w windows of width d:

w_k = [ 2πk/N_w − d/2, 2πk/N_w + d/2 )

σ̂²(k, a) = (1/n_k) Σ_{j: θ_j ∈ w_k} |q_j|²

[Figure: registered variance images at θ = 0°, 90°, 180°, 270°, and transformed estimates.]

The variance estimate for an unregistered image r with pose θ is formed by transforming the estimate from the closest w_k.
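A sketch of the windowed variance estimate, assuming pose windows that partition [0, 2π); the slide's windows are centered on 2πk/N_w, so the bin-edge placement here is a simplification, and the names are illustrative.

```python
import numpy as np

def estimate_variances(images, poses, n_windows):
    """images: (N, n_pixels) complex registered training chips q_j.
    poses:  (N,) azimuths theta_j in [0, 2*pi).
    Returns (n_windows, n_pixels): per-window average of |q_j|^2,
    a discrete version of the sigma-hat^2(k, a) estimate."""
    edges = np.linspace(0.0, 2.0 * np.pi, n_windows + 1)
    # Assign each training pose to an azimuth window.
    bins = np.clip(np.digitize(poses, edges) - 1, 0, n_windows - 1)
    var = np.zeros((n_windows, images.shape[1]))
    for k in range(n_windows):
        chips = images[bins == k]
        if len(chips):
            var[k] = np.mean(np.abs(chips) ** 2, axis=0)
    return var
```

Since the pixel model is zero-mean, the sample mean of |q_j|² within a pose window is the natural variance estimate; wider windows pool more images at the cost of blurring pose dependence.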


Target Model Segmentation

• Pixel information relative to the null hypothesis is used for target recognition
• Retain pixels i that are informative relative to the null hypothesis:

S_a = { i : (1/N_w) Σ_k D( p(·; σ̂_i²(k, a)) || p(·; σ²) ) exceeds a threshold }

• Segmentation of target models, not of images
• Ordering of pixels by their empirical information relative to the null hypothesis

[Figure: top 5, top 50, top 100, and top 300 pixels.]

For the null hypothesis σ² = 0.0028 (approximate background variance), pixels on the illuminated side of the target are deemed most informative.
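The per-pixel divergence from the clutter null has a closed form for zero-mean complex Gaussians, so the pixel ordering can be sketched directly; a top-count selection stands in for the slide's threshold, and the names are illustrative.

```python
import numpy as np

def kl_complex_gauss(var1, var0):
    """D( CN(0, var1) || CN(0, var0) ) for zero-mean circular complex
    Gaussians: var1/var0 - 1 - ln(var1/var0)."""
    rho = var1 / var0
    return rho - 1.0 - np.log(rho)

def informative_pixels(var_est, var_clutter, top):
    """var_est: (n_windows, n_pixels) estimated target variances.
    Score each pixel by its mean divergence from the clutter model across
    pose windows, then keep the 'top' highest-scoring pixel indices."""
    score = kl_complex_gauss(var_est, var_clutter).mean(axis=0)
    return np.argsort(score)[::-1][:top]
```

Pixels whose variance matches the clutter level score zero and are dropped first, which is why the illuminated side of the target, where variance departs most from background, ranks highest.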


Computational Models

T_CHIP = (3LMN/P) T_1 + (LMN/P) T_2 + T_3

where T_CHIP is sec/SAR image, T_1 sec/clock cycle, T_2 sec/template memory read, T_3 sec/SAR image load, L templates/target, M targets, N pixels/template, and P processors.

Chip processing rate: R = 1/T_CHIP

Assumptions:
• Each CPU optimizes over a region of the search space
• Multi-issue CPU with 2 instructions/clock cycle
• 6 instructions per pixel
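The timing model reduces to a one-line function. This sketch follows the formula as reconstructed above (6 instructions per pixel on a 2-issue CPU gives 3 cycles per pixel, one template memory read per pixel, LMN template pixels split across P processors); parameter names mirror the slide's legend.

```python
def t_chip(L, M, N, P, T1, T2, T3):
    """Seconds per SAR image: per-pixel compute time (3 clock cycles for
    each of the L*M*N template pixels, divided over P processors) plus
    per-pixel template memory reads plus the per-image load time T3."""
    return (3 * L * M * N / P) * T1 + (L * M * N / P) * T2 + T3

def throughput(L, M, N, P, T1, T2, T3):
    """Chip processing rate R = 1 / T_CHIP, in SAR images per second."""
    return 1.0 / t_chip(L, M, N, P, T1, T2, T3)
```

Doubling P halves the compute and memory-read terms but leaves T_3 untouched, so image-load time eventually bounds throughput, which is the interconnect effect shown in the 1 Gb/s vs. 10 Gb/s example.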
