4.2 MCMC Sampling algorithms for IBLF models

Algorithm 15 Slice sampling for stick-breaking IBP

The state of the Markov chain consists of the infinite feature matrix Z, the feature presence probabilities µ(1:∞) = µ(1), µ(2), . . . corresponding to each feature column, and the set of infinitely many parameters Θ = θ1:∞. Only the K† columns of Z up to and including the last active column, and the corresponding parameters, are represented.

Repeatedly sample:
  for all rows i = 1, . . . , N do  {feature updates}
    Sample a slice s uniformly between 0 and the stick length of the last active component.
    if s < µ(K†) then
      Extend the representation by breaking the stick until s > µ(K†), using eq. (4.51).
      Sample parameters for the newly represented features from the prior.
    end if
    for all columns k = 1, . . . , K† do  {update features above the slice}
      if µ(k) > s, update z_ik using the full conditional, eq. (4.52)
    end for
  end for
  for all columns k = 1, . . . , K† do  {parameter updates}
    Update θk by sampling from its conditional posterior, eq. (4.35)
  end for
  for all columns k = 1, . . . , K† − 1 do  {update feature presence probabilities}
    Update µ(k) by sampling from its conditional posterior, eq. (4.53), using ARS
  end for
  For column K†, the last represented column, update µ(K†) by sampling from its conditional posterior, eq. (4.51)
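
As a rough illustration of the bookkeeping in Algorithm 15, the Python sketch below mirrors one sweep of the sampler. All names (slice_sweep, sample_new_stick, z_full_conditional, theta_posterior, mu_posterior) are hypothetical, and the conditional draws prescribed by eqs. (4.35) and (4.51)–(4.53) are replaced by placeholder functions; in particular, the stick extension uses a prior Beta(α, 1) stick-breaking draw rather than the conditional of eq. (4.51). Only the control flow of the slice sampler is intended to be faithful.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the conditional draws referenced in Algorithm 15.
# They only make the control flow runnable; they are NOT eqs. (4.35), (4.51)-(4.53).

def sample_new_stick(mu_prev, alpha):
    # Prior stick-breaking draw mu_k = nu_k * mu_{k-1}, nu_k ~ Beta(alpha, 1);
    # the algorithm actually prescribes the conditional draw of eq. (4.51).
    return mu_prev * rng.beta(alpha, 1.0)

def sample_theta_prior():
    return rng.normal()              # placeholder prior for a scalar feature parameter

def z_full_conditional(i, k, Z, theta, data):
    return rng.integers(0, 2)        # placeholder for the full conditional, eq. (4.52)

def theta_posterior(k, Z, theta, data):
    return theta[k]                  # placeholder for the conditional posterior, eq. (4.35)

def mu_posterior(k, Z, mu):
    return mu[k]                     # placeholder for eqs. (4.53) / (4.51)

def slice_sweep(Z, mu, theta, alpha, data):
    # One sweep of the slice sampler over the represented columns.
    N = Z.shape[0]

    # Feature updates, one row at a time.
    for i in range(N):
        active = np.flatnonzero(Z.any(axis=0))
        mu_star = mu[active[-1]] if active.size else 1.0   # stick of the last active column (1.0 if none are active)
        s = rng.uniform(0.0, mu_star)                      # slice variable

        # Extend the representation while the last represented stick is above the slice.
        while mu[-1] > s:
            mu = np.append(mu, sample_new_stick(mu[-1], alpha))
            theta = np.append(theta, sample_theta_prior())
            Z = np.hstack([Z, np.zeros((N, 1), dtype=int)])

        # Update z_ik only for columns whose stick length lies above the slice.
        for k in range(Z.shape[1]):
            if mu[k] > s:
                Z[i, k] = z_full_conditional(i, k, Z, theta, data)

    # Parameter updates.
    for k in range(Z.shape[1]):
        theta[k] = theta_posterior(k, Z, theta, data)

    # Feature presence probability updates; the text uses ARS for k < K_dagger
    # (eq. (4.53)) and the conditional of eq. (4.51) for the last represented column.
    for k in range(Z.shape[1]):
        mu[k] = mu_posterior(k, Z, mu)

    return Z, mu, theta

A single sweep would be called repeatedly, e.g. Z, mu, theta = slice_sweep(Z, mu, theta, alpha, data), starting from a single represented column such as Z = np.zeros((N, 1), dtype=int), mu = np.array([0.5]), theta = np.array([0.0]).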

4.2.6 Change of Representations and Slice Sampling for the Semi-Ordered Stick-Breaking

Both the stick-breaking construction and the standard IBP representation are different representations of the same nonparametric object. Therefore, it is possible to make use of both representations in calculations by changing from one representation to the other (Teh, Görür, and Ghahramani, 2007). More precisely, given a posterior sample in the stick-breaking representation it is possible to construct a posterior sample in the IBP representation, and vice versa.

We use the infinite limit formulation of both representations to derive the appropriate procedures. Note that for the IBP, the feature labels in an arbitrarily large finite model are ignored. On the other hand, the stick-breaking construction is obtained by explicitly representing the feature presence probabilities and enforcing an ordering of the feature indices with decreasing stick lengths. Therefore, for changing from the stick-breaking to the IBP representation we simply drop the stick lengths as well as all the inactive columns.
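
As a concrete illustration of this direction of the change of representation, the following sketch (using the same hypothetical Python setup as above) discards the stick lengths and the inactive columns; the reverse change, from the IBP to the stick-breaking representation, requires drawing stick lengths consistent with the sample and is not shown here.

import numpy as np

def stick_breaking_to_ibp(Z, mu, theta):
    # Change of representation, stick-breaking -> IBP (a minimal sketch):
    # keep only the active (non-zero) columns of Z and their parameters,
    # and discard the stick lengths mu; in the IBP representation the
    # feature labels, and hence the column order, are ignored.
    active = np.flatnonzero(Z.any(axis=0))
    return Z[:, active], theta[active]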

