Perplexity of n-gram and dependency language models
Post-n-gram LM

In general: P(s) = P(w_1, w_2, ..., w_m) ≈ Π_{i=1..m} P(w_i | h_i)
  h_i … context (history) of word w_i

Left-to-right factorization order:
  - Bigram LM:  h_i = w_{i-1}  (one previous word)
  - Trigram LM: h_i = w_{i-2}, w_{i-1}  (two previous words)

Right-to-left factorization order:
  - Post-bigram LM:  h_i = w_{i+1}  (one following word)
  - Post-trigram LM: h_i = w_{i+1}, w_{i+2}  (two following words)
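The two factorization orders above can be sketched in a few lines of Python. The snippet below is a minimal illustration, not a production LM: it builds an unsmoothed MLE bigram model from a toy corpus and computes its perplexity, and reversing each sentence turns the same code into a post-bigram model (each word conditioned on the following word). The function name, the toy corpus, and the `<s>`/`</s>` boundary markers are illustrative assumptions.

```python
from collections import Counter
import math

def perplexity(sentences, reverse=False):
    """Perplexity of an MLE bigram LM trained on `sentences`.
    reverse=True reverses each sentence, giving a post-bigram LM
    (right-to-left factorization: h_i = w_{i+1})."""
    bigrams, hist = Counter(), Counter()
    for s in sentences:
        toks = ["<s>"] + s + ["</s>"]
        if reverse:
            toks = toks[::-1]  # right-to-left factorization order
        for h, w in zip(toks, toks[1:]):
            bigrams[(h, w)] += 1
            hist[h] += 1
    # Evaluate on the same toy data (no smoothing, so every
    # bigram seen at test time was also seen in training).
    log_prob, n = 0.0, 0
    for s in sentences:
        toks = ["<s>"] + s + ["</s>"]
        if reverse:
            toks = toks[::-1]
        for h, w in zip(toks, toks[1:]):
            log_prob += math.log2(bigrams[(h, w)] / hist[h])
            n += 1
    return 2 ** (-log_prob / n)

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
print(perplexity(corpus))                 # bigram LM,      ≈ 1.189
print(perplexity(corpus, reverse=True))   # post-bigram LM, ≈ 1.189
```

On this symmetric toy corpus both orders give the same perplexity (the only uncertainty is the cat/dog choice after "the", respectively before "sat"); on real corpora the two factorizations generally differ.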