Perplexity of n-gram and dependency language models
Language Models – basics

P(s) = P(w_1, w_2, ..., w_m) ≈ Π_{i=1..m} P(w_i | w_{i-2}, w_{i-1})   (trigram approximation)

In general: P(s) ≈ Π_{i=1..m} P(w_i | h_i)

h_i … context (history) of word w_i
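The trigram factorization above can be sketched in code. The following is a minimal illustration (not the deck's own implementation): it estimates trigram probabilities with add-alpha smoothing from counts, then computes perplexity as 2 raised to the negative average log2-probability per predicted token. The padding symbols `<s>`/`</s>`, the helper names, and the smoothing constant are all assumptions made for this sketch.

```python
import math
from collections import Counter

def train_trigram(sentences):
    # Count trigrams and their bigram contexts; each sentence is
    # padded with two <s> start symbols and one </s> end symbol
    # (a common convention, assumed here).
    tri, bi = Counter(), Counter()
    for s in sentences:
        toks = ["<s>", "<s>"] + s + ["</s>"]
        for i in range(2, len(toks)):
            tri[(toks[i - 2], toks[i - 1], toks[i])] += 1
            bi[(toks[i - 2], toks[i - 1])] += 1
    return tri, bi

def sentence_logprob(s, tri, bi, vocab_size, alpha=1.0):
    # log2 P(s) under the trigram model, with add-alpha smoothing
    # so unseen trigrams get nonzero probability.
    toks = ["<s>", "<s>"] + s + ["</s>"]
    lp = 0.0
    for i in range(2, len(toks)):
        h = (toks[i - 2], toks[i - 1])
        p = (tri[(h[0], h[1], toks[i])] + alpha) / (bi[h] + alpha * vocab_size)
        lp += math.log2(p)
    return lp

def perplexity(sentences, tri, bi, vocab_size):
    # Perplexity = 2 ** (-average log2-probability per predicted token);
    # the </s> prediction counts as a token, hence len(s) + 1.
    lp, n = 0.0, 0
    for s in sentences:
        lp += sentence_logprob(s, tri, bi, vocab_size)
        n += len(s) + 1
    return 2 ** (-lp / n)

# Toy corpus, evaluated on its own training data for illustration only.
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
tri, bi = train_trigram(corpus)
vocab = {w for s in corpus for w in s} | {"</s>"}
pp = perplexity(corpus, tri, bi, len(vocab))
print(round(pp, 3))
```

Evaluating on training data keeps the example self-contained; in practice perplexity is reported on held-out text, and smoothing (here add-alpha) is what keeps it finite when unseen n-grams occur.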