where $A_i = [\alpha_{ij}]_{j \in S_i} \in \mathbb{R}^{k \times |S_i|}$. We adopt the same strategy as in Section 2.2 to choose $\varepsilon_i$ according to the size of $S_i$: $\varepsilon_i = \sigma^2 F^{-1}_{m|S_i|}(\tau)$. In the $\ell_{1,2}$ case, this optimization problem is convex and can be solved efficiently [12]. In the $\ell_{0,\infty}$ case, on the other hand, it is intractable, and a greedy approach such as simultaneous orthogonal matching pursuit [28] must be used to obtain an approximate solution.
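To make the greedy $\ell_{0,\infty}$ route concrete, here is a minimal sketch of simultaneous orthogonal matching pursuit for a single group of patches, together with the threshold $\varepsilon_i$. It is only an illustration of the principle, not the implementation used in the paper; the chi-square interpretation of $F_{m|S_i|}$ follows the convention of Section 2.2, and all function and variable names are ours.

```python
import numpy as np
from scipy.stats import chi2

def somp(D, Y, eps):
    """Sketch of simultaneous OMP: decompose the group of patches Y (m x |S_i|)
    on the dictionary D (m x k) with a shared support, stopping once the total
    squared residual over the group falls below eps."""
    m, k = D.shape
    support, A_s = [], np.zeros((0, Y.shape[1]))
    residual = Y.copy()
    coeffs = np.zeros((k, Y.shape[1]))
    while np.sum(residual ** 2) > eps and len(support) < m:
        corr = np.sum(np.abs(D.T @ residual), axis=1)   # group-wise correlation
        corr[support] = -np.inf                         # never reselect an atom
        support.append(int(np.argmax(corr)))
        D_s = D[:, support]
        A_s, *_ = np.linalg.lstsq(D_s, Y, rcond=None)   # joint least-squares refit
        residual = Y - D_s @ A_s
    coeffs[support, :] = A_s
    return coeffs                                       # A_i; rows off the support are zero

def eps_i(sigma, m, group_size, tau=0.8):
    """Threshold eps_i = sigma^2 * F^{-1}_{m|S_i|}(tau), assuming F is the CDF of a
    chi-square distribution with m*|S_i| degrees of freedom (cf. Section 2.2)."""
    return sigma ** 2 * chi2.ppf(tau, df=m * group_size)
```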

In the framework of learned sparse coding, adapting $D$ to the image(s) of interest naturally leads to the following optimization problem:
$$\min_{(A_i)_{i=1}^n,\; D \in \mathcal{C}} \;\; \sum_{i=1}^n \frac{\|A_i\|_{p,q}}{|S_i|^p} \quad \text{s.t.} \quad \forall i \;\; \sum_{j \in S_i} \|y_j - D\alpha_{ij}\|_2^2 \le \varepsilon_i, \qquad (8)$$
where $D$ is in $\mathbb{R}^{m \times k}$ with unit $\ell_2$-norm columns. The normalization by $|S_i|^p$ is used to ensure equal weights for all groups (as before, we only consider the cases where $(p,q)$ is $(1,2)$ or $(0,\infty)$). As noted in the previous section, in classical learned sparse coding, we prefer the $\ell_1$ norm for learning the dictionary and the $\ell_0$ pseudo-norm for the final reconstruction. We adopt here a similar choice: We use the convex $\ell_{1,2}$ norm for learning the dictionary, which can be done efficiently using a simple modification of [16], and we use the $\ell_{0,\infty}$ pseudo-norm for the final reconstruction. As in [11], this formulation allows all the image patches to be processed as if they were independent of each other. To reconstruct the final image, we average the estimates of each pixel:
$$x = \operatorname{diag}\Big(\sum_{i=1}^n \sum_{j \in S_i} R_j 1_m\Big)^{-1} \sum_{i=1}^n \sum_{j \in S_i} R_j D \alpha_{ij}, \qquad (9)$$
where $R_j$ is defined as in Eq. (4) and $1_m$ is a vector of size $m$ filled with ones. The term on the left is a scaling diagonal matrix, counting the number of estimates for each pixel. Note that when $S_i = \{i\}$, our formulation is equivalent to regular learned sparse coding.

At first sight, the proposed technique may seem particularly costly, since decomposing a single patch requires solving a large-scale optimization problem (7). Similar concerns hold for the original formulations of non-local means [3] and BM3D [7]. As in these cases, slight changes to our approach are sufficient to make it efficient.

3.3. Practical Formulation and Implementation

The computational cost of the optimization problem (8) is dominated by the computation of the vectors $\alpha_{ij}$. In the worst-case scenario, $n^2$ of these vectors have to be computed. We show in the rest of this section how to modify our original formulation in order to make this number linear in $n$ and allow efficient optimization.

Semi-local grouping. When building $S_i$, one can restrict the search for patches similar to $y_i$ to a window of size $w \times w$. This semi-local approach is also used in [7], and it reduces the worst-case number of vectors $\alpha_{ij}$ to $nw^2$. In practice, we never use $w$ greater than 64 in this paper.

Clustering. It is also possible to cluster pixels into disjoint groups $C_k$ such that all pixels $i$ in $C_k$ share the same set $S_i$. The optimization problems (7) associated with all pixels in the same cluster are identical, further reducing the overall computational cost: In fact, only $n$ vectors $\alpha_{ij}$ are computed in this case, since each pixel belongs to exactly one cluster. This is a key ingredient of the efficiency of our implementation. Other strategies are also possible, allowing a few clusters to overlap, for instance.

Initialization of D. One important asset of sparse representations is that they can benefit from dictionaries learned offline on a database of natural images, which can serve as good initial dictionaries for the denoising procedure [11]. Using the online procedure of [16], our initial dictionaries are learned on $2 \times 10^7$ patches of natural images taken randomly from the 10,000 images of the PASCAL VOC'07 database. As shown in the next section, using this online procedure and such a large training sample has led to a significant performance improvement over methods such as [15], which rely on batch learning algorithms such as K-SVD [11] and are unusable with such large-scale data.
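As a side note, this kind of offline initialization can be reproduced with any mini-batch dictionary learner. The sketch below uses scikit-learn's MiniBatchDictionaryLearning as a stand-in for the online procedure of [16]; the training images, patch counts, and parameter values are illustrative rather than those reported above.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

def learn_initial_dictionary(images, patch_size=(9, 9), k=512,
                             patches_per_image=2000, seed=0):
    """Offline initialization of D on a collection of grayscale training images,
    as a stand-in for the online learning of [16]."""
    rng = np.random.RandomState(seed)
    chunks = []
    for img in images:
        p = extract_patches_2d(img, patch_size, max_patches=patches_per_image,
                               random_state=rng)
        chunks.append(p.reshape(p.shape[0], -1))
    X = np.vstack(chunks).astype(np.float64)
    X -= X.mean(axis=1, keepdims=True)      # subtract the mean intensity of each patch
    learner = MiniBatchDictionaryLearning(n_components=k, alpha=1.0,
                                          batch_size=256, random_state=seed)
    learner.fit(X)
    D = learner.components_.T               # one atom per column
    return D / np.linalg.norm(D, axis=0, keepdims=True)
```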
Improved matching. Following [7], we have noticed that better groups of similar patches can be found by using a first round of denoising on the patches (using, for example, the classical sparse coding approach of Eq. (3) presented in the previous section) before grouping them. In turn, as shown by our experiments, our simultaneous sparse coding approach greatly improves on this initial denoising step.

Patch normalization. To improve the numerical stability of sparse coding, the mean intensity (or RGB color) value of a patch is often subtracted from all its pixel values before decomposing it, then added back to the estimated values [11]. We have adopted this approach in our implementation, and our experiments have shown that it improves the visual quality of the results.

Reducing the memory cost. At first sight, Eq. (8) requires storing a large number of codes $\alpha_{ij}$. Even though these are sparse, and their number can be reduced to the number of pixels using the clustering strategy presented above, this could potentially be a problem for large images. In fact, only a small subset of the vectors $\alpha_{ij}$ is stored at any given time: The online procedure of [16] computes them on the fly and does not require storing them to learn the dictionary. In the case of Eq. (8), the maximum number of vectors $\alpha_{ij}$ that have to be stored at any given time is the size of the largest cluster $C_k$ of similar patches.
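Putting these ingredients together, the sketch below outlines one plausible per-group processing loop: mean-subtraction of the grouped patches, a group-wise decomposition, and accumulation of the per-pixel averages of Eq. (9), with only one group's codes in memory at a time. It is a simplified illustration under our own naming, not the authors' code; in particular, the group coder is a plain least-squares placeholder where the $\ell_{0,\infty}$ or $\ell_{1,2}$ simultaneous coding described above would be used.

```python
import numpy as np

def average_group_estimates(patches, groups, cluster_sizes, D,
                            coords, image_shape, patch_size):
    """patches: (n, m) vectorized noisy patches; groups[g]: shared index set S_i of
    cluster g; cluster_sizes[g] = |C_k|; coords[j]: top-left pixel of patch j."""
    num = np.zeros(image_shape)                 # numerator of Eq. (9)
    den = np.zeros(image_shape)                 # per-pixel estimate counts
    ph, pw = patch_size
    for S_i, w in zip(groups, cluster_sizes):
        Y = patches[S_i].T                      # (m, |S_i|) group of similar patches
        means = Y.mean(axis=0, keepdims=True)   # patch normalization
        # placeholder group coder: replace with SOMP / ell_{1,2} simultaneous coding
        A, *_ = np.linalg.lstsq(D, Y - means, rcond=None)
        Y_hat = D @ A + means                   # codes are discarded right after use
        for col, j in enumerate(S_i):           # scatter estimates back (R_j in Eq. (9))
            r, c = coords[j]
            num[r:r + ph, c:c + pw] += w * Y_hat[:, col].reshape(ph, pw)
            den[r:r + ph, c:c + pw] += w        # all |C_k| pixels share this decomposition
    return num / np.maximum(den, 1.0)           # Eq. (9): per-pixel average of all estimates
```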

3.4. Real Images and Demosaicking

Single-chip digital cameras do not capture a noisy RGB signal at each pixel. Instead, combined with a red (R), green (G), or blue (B) filter, the sensor associated with each pixel integrates the incoming light flux over the corresponding frequency range and a short period of time. The relation between the pixels and the color information they record is given by a specific pattern, the most famous one being the Bayer pattern: G-R-G-R on odd lines and B-G-B-G on even ones. The demosaicking problem consists of reconstructing the whole color image given the sensor measurements. Although most of the approaches found in the literature to solve this problem are based on interpolation [13, 20, 32], the image models investigated in this paper have also been used for demosaicking: Self-similarities have been exploited in [4], and learned sparse coding has been used in [15]. We adapt here [15] to our simultaneous sparse coding framework. First, we learn an initial dictionary $D_0$ using [16] on a database of natural color images. Our demosaicking procedure can then be decomposed into four simple steps:

(1) Cluster similar patches on the mosaicked image $y$.

(2) Reconstruct each patch using $D_0$, solving for all $i$
$$\min_{A_i \in \mathbb{R}^{k \times |S_i|}} \|A_i\|_{0,\infty} \quad \text{s.t.} \quad \forall j \;\; M_j(y_j - D_0 \alpha_{ij}) = 0, \qquad (10)$$
where $M_j$ is a binary mask corresponding to the Bayer pattern of measured values, and average the reconstructions to obtain an estimate $x$ of the demosaicked image.

(3) Learn a dictionary $D_1$ for $x$ with a strong regularization, that is, replace $y$ by $x$ in Eq. (8) and solve this equation with a large value for $\varepsilon_i$.

(4) Reconstruct each patch using $D_2 = [D_0 \; D_1]$ instead of $D_0$ in Eq. (10), and average the estimates using Eq. (9) to obtain the final demosaicked image.

As shown in the next section, this procedure outperforms the state of the art from both quantitative and qualitative points of view. The raw mosaicked signal of digital cameras in low-light, short-exposure settings is noisy. It should therefore be denoised before demosaicking is attempted. Since our denoising procedure is generic and does not necessarily assume the input data to be natural images, it can be performed on the mosaicked image itself.
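To illustrate step (2), here is a minimal masked variant of a greedy group decomposition in the spirit of Eq. (10): atoms are selected and refit using only the pixels that the Bayer mask marks as measured. This is an illustrative adaptation with our own naming, not the authors' implementation, and the exact-interpolation constraint is relaxed to a small tolerance for numerical reasons.

```python
import numpy as np

def masked_group_omp(D0, Y, M, tol=1e-6, max_atoms=None):
    """Greedy sketch in the spirit of Eq. (10): jointly reconstruct the group of patches
    Y (m x |S_i|) from the dictionary D0 (m x k), fitting only the entries where the
    Bayer mask M (same shape as Y, 1 = measured, 0 = missing) is nonzero."""
    m, k = D0.shape
    max_atoms = max_atoms or m
    support = []
    A = np.zeros((k, Y.shape[1]))
    while len(support) < max_atoms:
        R = M * (Y - D0 @ A)                        # residual on measured entries only
        if np.sum(R ** 2) <= tol:
            break
        corr = np.sum(np.abs(D0.T @ R), axis=1)     # group-wise atom selection
        corr[support] = -np.inf
        support.append(int(np.argmax(corr)))
        # refit each patch on its own observed pixels, over the shared support
        for j in range(Y.shape[1]):
            obs = M[:, j] > 0
            a, *_ = np.linalg.lstsq(D0[np.ix_(obs, support)], Y[obs, j], rcond=None)
            A[support, j] = a
    return D0 @ A                                    # reconstructed (demosaicked) patches
```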
4. Experimental Validation

4.1. Denoising – Synthetic Noise

Experiments on denoising with synthetic white Gaussian noise have been carried out on 12 standard benchmark images. The parameters used in this experiment are $k = 512$, $m = 9 \times 9$ for $\sigma \le 25$, $m = 12 \times 12$ for $\sigma = 50$, and $m = 16 \times 16$ for $\sigma = 100$. The value of $\tau$ is chosen a bit more conservatively than in [15] and is set to 0.8, while $\xi$ is chosen according to an empirical rule, $\xi = (32\sigma)^2/m$ for images scaled between 0 and 255, which has proven appropriate in all of our denoising experiments for both real and synthetic noise. Peak signal-to-noise ratio (PSNR) is used as the performance measure in our quantitative evaluation.⁵

Table 1 reports the results obtained on each image for different values of the (known) standard deviation of the noise $\sigma$, and Table 2 compares the average PSNR on these images obtained by several state-of-the-art image denoising methods, namely GSM [22], FoE [24], K-SVD [11], and BM3D [7], with our method in three settings: SC (sparse coding) uses a fixed dictionary learned on a database of natural images without grouping the patches. It is therefore similar to the global approach to denoising of [11]. The only differences are that we have used the online procedure of [16] to learn the dictionary from $2 \times 10^7$ natural image patches instead of the $10^5$ patches used in [11], and that we have used an $\ell_1$ regularizer instead of an $\ell_0$ one to learn the dictionary. In the second setting (LSC, for learned sparse coding), the dictionary is adapted to the test image, again using an $\ell_1$ regularizer, which is similar to the adaptive approach of [11] except for our (better) initial dictionary and their $\ell_0$ regularizer. The last setting (LSSC, for learned simultaneous sparse coding) adds a grouping step and uses the full power of our simultaneous sparse coding framework. These PSNR comparisons show that our model leads to better performance than the state-of-the-art techniques in general, and is always at least as good as BM3D, the top performer among those, especially for high values of $\sigma$. Additional qualitative examples are given in Figure 2.

Note that the parameters have not been optimized for speed but for quality in these experiments. On a recent Intel Q9450 2.66 GHz CPU, it takes for instance 0.5 s to denoise the 256 × 256 image peppers with $\sigma = 25$ and the setting SC, 85 s with LSC, and 220 s with LSSC. With parameters optimized for speed ($k = 256$, fewer iterations in the dictionary learning procedure), the computation times become respectively 0.25 s for SC, 10 s for LSC, and 21 s for LSSC, and the final results' quality only drops by 0.05 dB, which is visually imperceptible. Our framework is therefore flexible in terms of the speed/quality compromise.

4.2. Demosaicking

We have used the standard Kodak PhotoCD benchmark to evaluate the performance of our demosaicking algorithm. This dataset consists of 24 RGB images of size 512 × 768 to which a Bayer mask has been applied. Ground truth is thus available, allowing quantitative comparisons. We have tuned the parameters of our method to optimize its performance on the (arbitrarily chosen) last 5 images, choosing $k = 256$ (dictionary size), $m = 8 \times 8$ (patch size), and $\xi = 3 \times 10^4$ (for images scaled between 0 and 255). These parameters

⁵ Denoting by MSE the mean squared error for images whose intensities are between 0 and 255, the PSNR is defined as $\mathrm{PSNR} = 10 \log_{10}(255^2/\mathrm{MSE})$ and is measured in dB. A gain of 1 dB reduces the MSE by approximately 20%.
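For completeness, the "approximately 20%" figure quoted in the footnote follows directly from the PSNR definition; the short derivation below is ours.
$$\mathrm{PSNR}_2 - \mathrm{PSNR}_1 = 10 \log_{10}\frac{\mathrm{MSE}_1}{\mathrm{MSE}_2} = 1~\mathrm{dB} \;\Longrightarrow\; \frac{\mathrm{MSE}_2}{\mathrm{MSE}_1} = 10^{-1/10} \approx 0.794,$$
so a 1 dB gain reduces the MSE by roughly 21%, consistent with the footnote.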
