Downstream Processing Using the Forward Algorithm for an HMM

I am trying to implement the forward algorithm for a hidden Markov model (HMM), and I ran into an underflow problem when populating the alpha table. I normalized the alpha values using the method described in section 6 here, but now the total sum of the final alpha values (the probability of the observation sequence) is always 1. How can I "cancel" the normalization to get the real probability? My implementation is very similar to section 7.2 here.
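For context, here is a minimal sketch of a scaled forward pass in Java, assuming the common scaling scheme from Rabiner's HMM tutorial, where each alpha column is divided by its sum and the scaling coefficient is c[t] = 1 / sum_i alphaTilde[t][i]. All names here (pi, a, b, obs, scaledForwardLogProb) are hypothetical and not taken from the linked material. Under this convention P(O | lambda) = 1 / prod_t c[t], so the "real" probability is recovered from the scaling coefficients as log P(O | lambda) = -sum_t log(c[t]):

    // Sketch of a scaled forward pass (hypothetical names:
    // pi[i] = initial probabilities, a[i][j] = transition probabilities,
    // b[i][o] = emission probabilities, obs[t] = observation indices).
    static double scaledForwardLogProb(double[] pi, double[][] a, double[][] b, int[] obs) {
        int n = pi.length;            // number of hidden states
        int T = obs.length;           // length of the observation sequence
        double[][] alpha = new double[T][n];
        double[] c = new double[T];   // scaling coefficients, c[t] = 1 / sum_i alphaTilde[t][i]

        // t = 0: initialize, then scale so the column sums to 1
        double norm = 0.0;
        for (int i = 0; i < n; i++) {
            alpha[0][i] = pi[i] * b[i][obs[0]];
            norm += alpha[0][i];
        }
        c[0] = 1.0 / norm;
        for (int i = 0; i < n; i++) alpha[0][i] *= c[0];

        // t = 1..T-1: recurse on the already-scaled previous column, then scale again
        for (int t = 1; t < T; t++) {
            norm = 0.0;
            for (int i = 0; i < n; i++) {
                double s = 0.0;
                for (int j = 0; j < n; j++) s += alpha[t - 1][j] * a[j][i];
                alpha[t][i] = s * b[i][obs[t]];
                norm += alpha[t][i];
            }
            c[t] = 1.0 / norm;
            for (int i = 0; i < n; i++) alpha[t][i] *= c[t];
        }

        // The scaled alphas always sum to 1, but P(O | lambda) = 1 / prod_t c[t],
        // so the "real" (log) probability is recovered from the scaling coefficients:
        double logProb = 0.0;
        for (int t = 0; t < T; t++) logProb -= Math.log(c[t]);
        return logProb;
    }

Keeping the result in log space is deliberate: for long observation sequences the raw probability would underflow again, which is exactly what the scaling was introduced to avoid.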

An answer was recently posted to the same question, but I could not follow the last few steps, and I am hoping for a more detailed explanation. Thanks!

Update: I think I finally understood that answer, but I would appreciate confirmation that my understanding is correct. Here is what I did (c[k] are the scaling coefficients):

    double sum = 0.0;
    for (int i = 0; i < 4; i++) {      // only 4 hidden states
        sum += alpha[l - 1][i];        // sum the last column of the (normalized) alpha table
    }

    double sumLogC = 0.0;
    for (int k = 0; k < l; k++) {
        sumLogC += Math.log(c[k]);     // accumulate the logs of the scaling coefficients
    }

    double probability = Math.log(sum) - sumLogC;
    return probability;
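Assuming c[k] are scaling coefficients of the form c[k] = 1 / sum_i alphaTilde[k][i] (as in the sketch above), this looks correct: the last column of the scaled alpha table sums to 1 by construction, so Math.log(sum) is essentially 0 and the result reduces to -sum_k log(c[k]), which is log P(O | lambda), i.e. the log of the un-normalized probability you were after.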