
4.5.4 Conclusions


EBA is a choice model that has correspondences to several models in economics and psychology. The model assumes that the choice probabilities result from the non-shared features of the options. The usefulness of the EBA model has been hampered by the lack of a method that can accurately infer the unknown features of the alternatives. We have suggested using an infinite latent feature matrix to represent the unknown features. We showed empirically that the infinite latent feature model (iEBA) can capture the latent structure in the choice data as well as the handcrafted model (tEBA). For data for which we have less prior information, it might not be possible to handcraft a reasonable feature matrix.
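As a concrete illustration of this assumption, the following minimal sketch (hypothetical code, not the implementation used here; it assumes the standard pairwise EBA form with a binary feature matrix Z and positive feature weights w) computes the probability of choosing option x over option y from the weights of the features the two options do not share:

    import numpy as np

    def eba_choice_prob(Z, w, x, y):
        # Features of x that y lacks, and vice versa; shared features drop out.
        only_x = Z[x] * (1 - Z[y])
        only_y = Z[y] * (1 - Z[x])
        ux, uy = only_x @ w, only_y @ w
        if ux + uy == 0:
            return 0.5  # no distinctive features: the choice is left to chance
        return ux / (ux + uy)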

Different feature matrices can result in the same choice probabilities and therefore are not distinguished by the model. For example, features that are shared by all options do not affect the likelihood. Furthermore, only the ratios of the weights affect the likelihood. What seems to be a non-identifiability problem is not an issue for sampling, since we are interested in inferring the choice probabilities, not the "true" features.
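For instance (reusing the sketch above with made-up numbers), adding a feature that every option possesses, or rescaling all weights by a common factor, leaves the pairwise probabilities unchanged:

    Z = np.array([[1, 0, 1],
                  [0, 1, 1]])                     # two options, three features
    w = np.array([2.0, 1.0, 5.0])
    p = eba_choice_prob(Z, w, 0, 1)               # the shared third feature is ignored

    Z_shared = np.hstack([Z, np.ones((2, 1), dtype=int)])  # feature shared by all options
    w_shared = np.append(w, 3.0)
    assert np.isclose(p, eba_choice_prob(Z_shared, w_shared, 0, 1))
    assert np.isclose(p, eba_choice_prob(Z, 10.0 * w, 0, 1))  # only weight ratios matter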

On a more conceptual level, the non-identifiability of the features makes the samples from the posterior hard to interpret. For instance, for the celebrities data one might have hoped to find feature matrices that correspond to a tree, or at least matrices with some other directly interpretable structure. However, although the experiment by Rumelhart and Greeno (1971) was designed with the tree structure in mind (Figure 4.16), we do not know the true latent features that led to the choices of the subjects. Nevertheless, we can use the posterior to predict future choices from past data, assess the similarity of the options, and cluster or rank them.

The EBA model takes into account neither the time ordering of the choice outcomes nor the identities of the choice makers. For some choice scenarios this information is irrelevant or not available. However, a model that can make use of the additional information when it is available would potentially be more powerful in representing the structure. This is one direction in which the EBA model could be improved.

Another possible improvement of the EBA model would be to also take the shared features into account when calculating the choice probabilities. One possibility is to add to eq. (4.63) the effect of the shared features with a scaling factor s ∈ [0, 1]. The EBA model would be recovered for s = 0, whereas the BTL model would be obtained when the shared features are treated the same as the distinctive ones, that is, with s = 1. Using a small but nonzero scale value may improve recovery of the latent structure of the alternatives, especially when there are pairs of alternatives that have not been compared.
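A minimal sketch of this modification (hypothetical code continuing the earlier example; the exact form of the adjusted eq. (4.63) is not spelled out here) would enter the shared features with the scale s instead of discarding them:

    def eba_btl_choice_prob(Z, w, x, y, s=0.0):
        # s = 0 recovers EBA; s = 1 weights shared and distinctive features
        # alike, which gives the BTL model.
        shared = (Z[x] * Z[y]) @ w
        ux = (Z[x] * (1 - Z[y])) @ w + s * shared
        uy = (Z[y] * (1 - Z[x])) @ w + s * shared
        return ux / (ux + uy)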

4.6 Discussion

The Indian buffet process was recently introduced by Griffiths and Ghahramani (2005) and has rapidly attracted interest due to its conceptual simplicity and its promising flexibility for defining nonparametric latent feature models. Its relation to the DP and related distributions has inspired models and inference algorithms from the DP literature to be adapted to IBP models. The different approaches with which the equivalent

