
MMI training

To describe the MMI training procedure, we need the appropriate forms of eqns. 1.39 and 1.40.

First consider the clamped case. Since we are working at the sentence level, we form an HMM, λ_l^c (by interconnecting the HMMs of the speech units), which represents the class l of sentences to which the current observation sequence O belongs. Then, starting from eqn. 1.39,

    P^c = P(O | λ_l^c)
        = Σ_X P(O, X | λ_l^c)

where the second line follows from eqn. 1.3, the sum running over all state sequences X of the clamped HMM λ_l^c.
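This sum over state sequences is exactly what the forward algorithm evaluates efficiently. A minimal sketch of computing P(O | λ) this way; the model parameters and the observation encoding are invented for illustration, not taken from the text:

```python
# Forward algorithm: computes P(O | lambda) = sum over all state
# sequences X of P(O, X | lambda), without enumerating the sequences.
# pi: initial state probabilities, A: transition matrix,
# B: emission matrix (B[i][o] = prob. of symbol o in state i).

def forward_prob(pi, A, B, obs):
    """Likelihood P(O | lambda) of observation sequence obs."""
    n = len(pi)
    # initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # induction: alpha_t(j) = (sum_i alpha_{t-1}(i) a_ij) * b_j(o_t)
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # termination: P(O | lambda) = sum_i alpha_T(i)
    return sum(alpha)
```

For the clamped case, (pi, A, B) would be the parameters of the composite HMM λ_l^c built from the speech-unit models.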

For the free case, we have only one HMM, λ^f, which represents the whole language. Therefore, summing over all the alternative classes is equivalent to summing over the whole set of states. Thus eqn. 1.40 takes the following form,

    P^f = P(O | λ^f)
        = Σ_X P(O, X | λ^f)
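Given per-class sentence likelihoods, the mutual-information score for one observation sequence contrasts the clamped term against the free term. A hedged sketch, where the class labels and likelihood values are invented for illustration:

```python
import math

# Mutual-information score for one sentence: the clamped term is the
# likelihood under the HMM of the correct class, and the free term is
# the summed likelihood over all alternative classes (equivalently,
# the likelihood under one big language HMM).

def mmi_score(class_likelihoods, correct):
    """log P(O | lambda_correct^c) - log P(O | lambda^f)."""
    p_clamped = class_likelihoods[correct]
    p_free = sum(class_likelihoods.values())
    return math.log(p_clamped) - math.log(p_free)
```

The score is always non-positive and reaches 0 only when the correct class carries all of the likelihood, which is why MMI training pushes it upward.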

The MMI training procedure is then as follows.

(1)
Initialize each HMM λ_l with values generated randomly or using an initialization algorithm.

(2)
Take an observation sequence of a sentence and update the model parameters in the direction that increases the mutual information, using the clamped and free quantities above.
(3)
Go to step (2), until all the observation sequences have been considered.
(4)
Repeat steps (2) to (3) until a convergence criterion is satisfied.
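The four steps above can be sketched as a loop. The toy models below (one Bernoulli emission parameter per class, finite-difference gradients) stand in for real HMMs and for the analytic MMI gradients; every model, parameter, and data value here is invented for illustration:

```python
import math

def likelihood(p, obs):
    # P(O | model) for a toy one-parameter Bernoulli "model"
    out = 1.0
    for o in obs:
        out *= p if o == 1 else (1.0 - p)
    return out

def mmi(params, obs, correct):
    # clamped term minus free term for one sentence
    probs = [likelihood(p, obs) for p in params]
    return math.log(probs[correct]) - math.log(sum(probs))

def train(params, data, lr=0.05, sweeps=200, eps=1e-6):
    params = list(params)                      # step (1): initial values
    for _ in range(sweeps):                    # step (4): repeat sweeps
        for obs, correct in data:              # steps (2)-(3): each sentence
            for k in range(len(params)):
                # finite-difference stand-in for the analytic gradient
                bumped = [p + (eps if i == k else 0.0)
                          for i, p in enumerate(params)]
                g = (mmi(bumped, obs, correct)
                     - mmi(params, obs, correct)) / eps
                # gradient ascent, keeping the parameter in (0, 1)
                params[k] = min(0.99, max(0.01, params[k] + lr * g))
    return params
```

Because each update raises the correct class's likelihood while lowering the competitors' share of the free term, the class parameters separate as training proceeds.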



Narada Warakagoda
Fri May 10 20:35:10 MET DST 1996
