PPT - Key Laboratory of Data Engineering and Knowledge Engineering, Ministry of Education
Comparison between different models

LDA
- Document-level topic distribution: suffers from document sparsity
- Models the generation of each word: ignores word context

Mixture of Unigrams
- Corpus-level topic distribution: alleviates document sparsity
- Single-topic assumption for each document: too strong an assumption

BTM
- Corpus-level topic distribution: alleviates document sparsity
- Models the generation of word pairs (biterms): leverages word context
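The word-pair idea in the last column can be made concrete: BTM treats every unordered co-occurring word pair in a short document as a "biterm" and models these at the corpus level. A minimal sketch of biterm extraction is below; `extract_biterms` is a hypothetical helper name, not part of any published BTM implementation.

```python
from itertools import combinations

def extract_biterms(doc_tokens):
    """Return all unordered word pairs (biterms) in a short document.

    BTM models the generation of these corpus-wide biterms rather than
    per-document word counts, which is how it alleviates the
    document-level sparsity that hurts LDA on short texts.
    """
    # combinations() yields each pair once; sorting makes pairs unordered.
    return [tuple(sorted(pair)) for pair in combinations(doc_tokens, 2)]

# A three-word short text yields three biterms.
print(extract_biterms(["visit", "apple", "store"]))
```

Because biterms from all documents are pooled before inference, even a three-word tweet contributes several co-occurrence observations, whereas LDA would see only three isolated word draws from one sparse document.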