Effective Computer Science - The Summit Rests on Its Foundations -

Most new technologies are built on top of fundamental ones. This site aims to be a "bridge between fundamentals and practice," helping readers acquire systematic knowledge that can withstand turbulent technological change.

The three basic algorithms for hidden Markov models (HMMs)

Estimating the probability of an observed word sequence

Estimate the probability of an observed sequence. This corresponds to summing the probabilities of all state paths that could have produced the observation, and it is solved with the forward procedure.
In the specific case of POS tagging, it gives the probability of the word sequence. Although the forward procedure is not of primary importance for tagging itself, it is fundamental and has many other applications.
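
As a concrete illustration, here is a minimal Python sketch of the forward procedure for a discrete HMM. The representation is an assumption made for this example (it does not appear in the text above): states and observation symbols are integer indices, A is the transition matrix, B the emission matrix, and pi the initial state distribution.

```python
def forward(obs, A, B, pi):
    """Return P(obs | model), summing over all state paths.

    obs: list of observation-symbol indices
    A:   A[i][j] = P(state j at t+1 | state i at t)
    B:   B[i][k] = P(symbol k | state i)
    pi:  pi[i]   = P(state i at t=0)
    """
    n_states = len(pi)
    # alpha[i] = P(observations seen so far, current state = i)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n_states)]
    for t in range(1, len(obs)):
        alpha = [
            B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n_states))
            for j in range(n_states)
        ]
    return sum(alpha)
```

In a POS-tagging HMM the states would be tags and the symbols words, so forward(obs, A, B, pi) returns the total probability of the word sequence; in practice, log probabilities or scaling are used to avoid underflow on long sequences.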

Determining the most likely path for an observed word sequence

Determine the most likely state path for an observed sequence. This is a decoding problem, and it is solved with the Viterbi algorithm. For POS tagging, the states are the tags, so this path is the most likely tag sequence for the word sequence.
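
A matching sketch of the Viterbi algorithm, under the same assumed representation as the forward example above. It keeps a back-pointer table so the best path can be recovered at the end.

```python
def viterbi(obs, A, B, pi):
    """Return (best_path, best_path_probability) for an observation sequence."""
    n_states = len(pi)
    # delta[i] = probability of the best path so far that ends in state i
    delta = [pi[i] * B[i][obs[0]] for i in range(n_states)]
    backpointers = []  # backpointers[t-1][j] = best predecessor of state j at time t
    for t in range(1, len(obs)):
        psi, new_delta = [], []
        for j in range(n_states):
            best_i = max(range(n_states), key=lambda i: delta[i] * A[i][j])
            psi.append(best_i)
            new_delta.append(delta[best_i] * A[best_i][j] * B[j][obs[t]])
        backpointers.append(psi)
        delta = new_delta
    # Trace the back-pointers from the best final state
    last = max(range(n_states), key=lambda i: delta[i])
    best_prob = delta[last]
    path = [last]
    for psi in reversed(backpointers):
        path.append(psi[path[-1]])
    path.reverse()
    return path, best_prob
```

For POS tagging, path would be the most likely tag sequence; as with the forward procedure, a real implementation would work in log space.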

Estimating the parameters from observed training data

Determine (learn) the parameters given a set of observations. This is how a model is built when the parameters are not known in advance. It is solved using the forward–backward (Baum–Welch) algorithm, an instance of expectation–maximization.
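
Finally, a sketch of a single forward–backward (Baum–Welch) re-estimation pass, again under the same assumed discrete-HMM representation. It computes the forward and backward trellises, the state and transition posteriors, and the updated parameters; a real implementation would iterate this until the likelihood converges and would use scaling or log-space arithmetic.

```python
def forward_trellis(obs, A, B, pi):
    """alpha[t][i] = P(obs[0..t], state i at time t)."""
    N, T = len(pi), len(obs)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(N)]]
    for t in range(1, T):
        alpha.append([
            B[j][obs[t]] * sum(alpha[t - 1][i] * A[i][j] for i in range(N))
            for j in range(N)
        ])
    return alpha


def backward_trellis(obs, A, B):
    """beta[t][i] = P(obs[t+1..T-1] | state i at time t)."""
    N, T = len(A), len(obs)
    beta = [[1.0] * N for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in range(N):
            beta[t][i] = sum(
                A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] for j in range(N)
            )
    return beta


def baum_welch_step(obs, A, B, pi):
    """One EM re-estimation pass; returns updated (A, B, pi)."""
    N, M, T = len(pi), len(B[0]), len(obs)
    alpha = forward_trellis(obs, A, B, pi)
    beta = backward_trellis(obs, A, B)
    likelihood = sum(alpha[T - 1])
    # gamma[t][i]: posterior probability of being in state i at time t
    gamma = [[alpha[t][i] * beta[t][i] / likelihood for i in range(N)]
             for t in range(T)]
    # xi[t][i][j]: posterior probability of the transition i -> j at time t
    xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] / likelihood
            for j in range(N)] for i in range(N)] for t in range(T - 1)]
    new_pi = gamma[0][:]
    new_A = [[sum(xi[t][i][j] for t in range(T - 1)) /
              sum(gamma[t][i] for t in range(T - 1))
              for j in range(N)] for i in range(N)]
    new_B = [[sum(gamma[t][i] for t in range(T) if obs[t] == k) /
              sum(gamma[t][i] for t in range(T))
              for k in range(M)] for i in range(N)]
    return new_A, new_B, new_pi
```

Note that this sketch assumes a single observation sequence; training on a corpus sums the expected counts over all sequences before normalizing.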

References

L. R. Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," Proceedings of the IEEE, 1989.

Pierre M. Nugues, Language Processing With Perl and Prolog: Theories, Implementation, and Application (Cognitive Technologies), Springer.