
Statistical Language Models based on Neural Networks - Faculty of ...

[36] R. Kneser, H. Ney. Improved backing-off for m-gram language modeling. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, 1995.

[37] S. Kombrink, M. Hannemann, L. Burget. Out-of-vocabulary word detection and beyond. In ECML PKDD Proceedings and Journal Content, 2010.

[38] S. Kombrink, T. Mikolov, M. Karafiát, L. Burget. Recurrent Neural Network based Language Modeling in Meeting Recognition. In Proceedings of Interspeech, 2011.

[39] R. Lau, R. Rosenfeld, S. Roukos. Trigger-based language models: A maximum entropy approach. In Proceedings of ICASSP, 1993.

[40] H.-S. Le, I. Oparin, A. Allauzen, J.-L. Gauvain, F. Yvon. Structured Output Layer Neural Network Language Model. In Proceedings of ICASSP, 2011.

[41] S. Hochreiter, J. Schmidhuber. Long Short-Term Memory. Neural Computation, 9(8):1735–1780, 1997.

[42] S. Legg. Machine Super Intelligence. PhD thesis, University of Lugano, 2008.

[43] M. Looks, B. Goertzel. Program representation for general intelligence. In Proceedings of AGI, 2009.

[44] M. Mahoney. Text Compression as a Test for Artificial Intelligence. In AAAI/IAAI, 486–502, 1999.

[45] M. Mahoney. Fast Text Compression with Neural Networks. In Proceedings of FLAIRS, 2000.

[46] M. Mahoney et al. PAQ8o10t. Available at http://cs.fit.edu/~mmahoney/compression/text.html

[47] J. Martens, I. Sutskever. Learning Recurrent Neural Networks with Hessian-Free Optimization. In Proceedings of ICML, 2011.

[48] T. Mikolov, J. Kopecký, L. Burget, O. Glembek, J. Černocký. Neural network based language models for highly inflective languages. In Proceedings of ICASSP, 2009.

[49] T. Mikolov, M. Karafiát, L. Burget, J. Černocký, S. Khudanpur. Recurrent neural network based language model. In Proceedings of Interspeech, 2010.

