An Integrated Probabilistic Model for Scan-Matching, Moving Object ...
The graphical model corresponding to Equation 4 is shown in Figure 1. The model consists of two chains, one for motion clustering on the left and one for data association on the right. The two chains are connected through a variable representing the motion estimates of the system and a variable for the observations. The motivation for using a chain for both association and clustering stems from the way scan data is obtained: the laser scanner acquires data points sequentially and in a single plane, hence a chain to represent this acquisition. Note that no correlation in the data is lost, since there is a path from any one point in a chain to any other point.

B. Inference

The inclusion of the variable x_RT adds complexity. First and foremost, rotation and translation are continuous quantities, whereas the nodes for association and clustering are discrete. This could result in a mixture of sums and integrals in the inference procedure, in turn leading to difficulties in obtaining a solution. However, as explained in Section III-C, we formulate the problem as a MAP inference procedure defined as:

x^{MAP} = \operatorname*{argmax}_{x} \; p(x \mid z),    (5)

where x = \langle x^a, x^c, x_{RT} \rangle. Max-product inference operates on the maxima of the hidden variables; it therefore does not suffer from any mixed sum-integral problem, provided closed-form solutions exist. The closed-form solution for rotation and translation can be computed efficiently by minimising the error function in Equation 6 [13]:

R, T \leftarrow \operatorname*{argmin}_{R,T} \; \sum_{i=1}^{N} \left( z_{1,i} R + T - z_{2,x^a_i} \right)^2,    (6)

where z_{1,i} is the i-th point in the first scan and z_{2,x^a_i} is the point in the second scan corresponding to the i-th association.

The message passing schedule is, in our case, dictated by the structure of the graph. In our model, the x^a and x^c nodes are linked indirectly through the motion estimate node x_RT. This is done for practical reasons, as it allows inference to run as two chains that simultaneously influence each other. It does require a flooding schedule [11]: each node always sends messages to all of its neighbours. This ensures that any change in one chain is always propagated to the other. In addition, the schedule visits x_RT as every second node to facilitate joint reasoning, i.e. a schedule such as: x^c_1, x_RT, x^c_2, x_RT, ..., x^c_N, x_RT, x^a_1, x_RT, ..., x^a_N.

In order to reduce the computational cost of sending messages to and from x_RT, the message passing algorithm is modified according to the right-hand side of Figure 2. Our modified belief propagation algorithm treats the value of x_RT as if it were observed. Outgoing messages m_out contain the same information as long as we ensure that no information (belief) is lost due to this change. To guarantee this, local features that depend on x_RT are recomputed each time a node is visited. Any change to either rotation or translation is then incorporated into the outgoing message m_out. The observations z do not change, so any change in the message m_{Z,RT} is solely due to changes in x_RT.

Fig. 2. Message passing for a single node. Left: standard belief propagation; right: the proposed message propagation.
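To make the modified update at a single discrete node concrete, the following is a minimal numpy sketch of a generic max-product message in which x_RT enters only through the local term, as in the right-hand side of Figure 2. The pairwise potential psi, the state count K, and all function names are illustrative placeholders; the model's actual message construction is the one defined by Equation 2, which is not reproduced here.

import numpy as np

def construct_message(psi, local, incoming, exclude):
    # psi      : (K, K) pairwise potential psi(x_i, x_j) between node i and neighbour j
    # local    : (K,) local feature of node i; it already folds in the observations z
    #            and the current motion estimate x_RT (the message m_{Z,RT}), since
    #            x_RT is treated as observed in the modified scheme
    # incoming : dict mapping neighbour id -> (K,) message previously received by node i
    # exclude  : id of the neighbour j that the outgoing message is addressed to
    belief = local.copy()
    for nb, msg in incoming.items():
        if nb != exclude:
            belief = belief * msg                    # product of local term and other incoming messages
    m_out = np.max(psi * belief[:, None], axis=0)    # max-product: maximise over the states of node i
    return m_out / m_out.sum()                       # normalise for numerical stability

Because x_RT enters only through the local term, recomputing local (i.e. m_{Z,RT}) whenever x_RT changes is exactly what keeps the outgoing message consistent with the current motion estimate, as described above.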
In order for the modified algorithm to behave as if messages were being passed in both directions, the value of x_RT needs to be updated in line with the message passing schedule. This requires that x_RT be recomputed after each node has been visited.

Algorithm 1 Pseudo-code of the modified inference algorithm
 1: x_RT ← InitialiseRT()
 2: for iteration = 1 to MaxIterations do
 3:   for node = 1 to NumNodes do
 4:     m_{Z,RT} ← ComputeLocalFeature(node, z, x_RT)
 5:     for nb = 1 to Neighbours(node) do
 6:       m_{a/c} ← CollectIncomingMessage(node, nb)
 7:       m_out ← ConstructMessage(node, nb, m_{Z,RT}, m_{a/c})
 8:       SendMessage(node, nb, m_out)
 9:     end for
10:     x_RT ← ComputeRT()
11:   end for
12:   if CheckConvergence() = true then
13:     break
14:   end if
15: end for
16: x_RT ← ComputeRT()
17: for node = 1 to NumNodes do
18:   m_{Z,RT} ← ComputeLocalFeature(node, z, x_RT)
19:   m_{a/c} ← CollectIncomingMessage(node)
20:   x_{a/c} ← ComputeState(node, m_{Z,RT}, m_{a/c})
21: end for

The resulting algorithm is shown in Algorithm 1. To start, x_RT is initialised on line 1 to reflect that it is 'observed'. Belief propagation is performed on lines 2-15. Before a node propagates belief to one of its neighbours, it computes its own belief (line 4) from the observations z and the value of x_RT. For each of its neighbours, Equation 2 is evaluated (lines 5-9) and the resulting message is sent to that neighbour. After messages have been sent to all neighbours, the crucial step of updating x_RT is executed on line 10 by minimising Equation 6. Convergence is checked after each iteration; the algorithm may terminate early only once it has converged. Finally, lines 16-21 compute the state of each node, analogous to standard LBP inference (Equation 3).
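As a concrete illustration of the ComputeRT() step (lines 10 and 16), which minimises Equation 6, the sketch below uses the standard SVD-based closed-form rigid alignment for 2-D scans. The 2-D assumption, the use of hard associations, and the name compute_rt are illustrative choices rather than the authors' implementation; the paper defers the closed-form solution to [13].

import numpy as np

def compute_rt(scan1, scan2, assoc):
    # scan1 : (N, 2) points z_{1,i} of the first scan
    # scan2 : (M, 2) points of the second scan
    # assoc : length-N index array, assoc[i] = x^a_i, the match of point i in scan2
    # Minimises sum_i || R p_i + T - q_i ||^2 in column-vector convention
    # (Equation 6 writes the equivalent row-vector form z_{1,i} R + T).
    p = scan1
    q = scan2[assoc]
    # The optimal translation follows from the centroids, so centre both point sets.
    p_mean, q_mean = p.mean(axis=0), q.mean(axis=0)
    P, Q = p - p_mean, q - q_mean
    # Closed-form rotation from the SVD of the cross-covariance (Kabsch/Horn).
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = 1.0 if np.linalg.det(Vt.T @ U.T) > 0 else -1.0   # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    T = q_mean - R @ p_mean
    return R, T

Given the current association estimates x^a, this returns the R and T that are fed back into the local features m_{Z,RT} on the next sweep of the message passing schedule.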
