Hidden Markov Model (HMM)
I. Notations
observation sequence: $U = \{u_1, u_2, \ldots, u_T\}$
hidden state sequence: $S = \{s_1, s_2, \ldots, s_T\}$, with $s_t \in \{1, \ldots, M\}$ for $M$ hidden states
number of observations: $T$
parameter set: $\lambda = \{\pi, A, B\}$, where $\pi_k$ is the initial state distribution, $a_{kl}$ the transition probability from state $k$ to state $l$, and $b_k(u_t)$ the emission probability (or density) of $u_t$ in state $k$
II. Joint and Marginal Distributions
$$P(U, S \mid \lambda) = \pi_{s_1} b_{s_1}(u_1) \prod_{t=2}^{T} a_{s_{t-1} s_t}\, b_{s_t}(u_t)$$
$$P(U \mid \lambda) = \sum_{S} P(U, S \mid \lambda)$$
III. Important pre-defined terms:
$\alpha_k(t) \equiv P(u_{1:t}, s_t = k \mid \lambda)$, with initialization $\alpha_k(1) = \pi_k b_k(u_1)$ and recursion
$$\alpha_k(t) = \left( \sum_{l=1}^{M} \alpha_l(t-1)\, a_{lk} \right) b_k(u_t)$$
$\beta_k(t) \equiv P(u_{t+1:T} \mid s_t = k, \lambda)$, with initialization $\beta_k(T) = 1$ and recursion
$$\beta_k(t) = \sum_{l=1}^{M} a_{kl}\, b_l(u_{t+1})\, \beta_l(t+1)$$
$L_k(t) \equiv P(s_t = k \mid U, \lambda)$, the conditional probability of being in state $k$ at time $t$ given the observations $U$:
$$L_k(t) = \frac{P(U, s_t = k \mid \lambda)}{P(U \mid \lambda)} = \frac{P(u_{1:t}, s_t = k \mid \lambda)\, P(u_{t+1:T} \mid s_t = k, \lambda)}{P(U \mid \lambda)} = \frac{\alpha_k(t)\, \beta_k(t)}{P(U \mid \lambda)}$$
$H_{k,l}(t) \equiv P(s_t = k, s_{t+1} = l \mid U, \lambda)$, the conditional probability of being in state $k$ at time $t$ and in state $l$ at time $t+1$ given the observations $U$:
$$H_{k,l}(t) = \frac{P(U, s_t = k, s_{t+1} = l \mid \lambda)}{P(U \mid \lambda)} = \frac{\alpha_k(t)\, a_{kl}\, b_l(u_{t+1})\, \beta_l(t+1)}{P(U \mid \lambda)}$$
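The forward and backward recursions above can be sketched in a few lines of NumPy. All parameter values below ($\pi$, $A$, $B$, and the observation sequence) are made-up toy numbers for illustration, not values from these notes:

```python
import numpy as np

# Hypothetical toy discrete HMM: M = 2 hidden states, 2 observation symbols.
pi = np.array([0.6, 0.4])            # initial distribution pi_k
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])           # transition matrix a_kl
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])           # emission matrix b_k(v)
U = [0, 0, 1, 0]                     # observation sequence u_{1:T}
T, M = len(U), len(pi)

# alpha_k(t) = P(u_{1:t}, s_t = k | lambda)
alpha = np.zeros((T, M))
alpha[0] = pi * B[:, U[0]]
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ A) * B[:, U[t]]   # sum_l alpha_l(t-1) a_lk, times b_k(u_t)

# beta_k(t) = P(u_{t+1:T} | s_t = k, lambda)
beta = np.zeros((T, M))
beta[T - 1] = 1.0
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, U[t + 1]] * beta[t + 1])  # sum_l a_kl b_l(u_{t+1}) beta_l(t+1)

likelihood = alpha[T - 1].sum()      # P(U | lambda)
L = alpha * beta / likelihood        # L_k(t); each row sums to 1
```

A useful sanity check is that $\sum_k \alpha_k(t)\beta_k(t) = P(U \mid \lambda)$ for every $t$, not just $t = T$.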
IV. HMM Problem I:
Compute the likelihood of the observations, $P(U \mid \lambda) = \sum_{k=1}^{M} \alpha_k(T)$, via the forward recursion of Section III.
V. HMM Problem II:
Compute the most likely hidden state path, $S^* = \arg\max_S P(S \mid U, \lambda) = \arg\max_S P(U, S \mid \lambda)$.
Define $W(S) \equiv -\log P(U, S \mid \lambda)$; then problem II can be written as $S^* = \arg\min_S W(S)$,
with the further definition of $V_t(k)$ as the minimum of $-\log P(u_{1:t}, s_{1:t-1}, s_t = k \mid \lambda)$ over all partial state paths $s_{1:t-1}$. The Viterbi algorithm then proceeds as follows:
- Initialize $V_1(k) = -\log \pi_k b_k(u_1)$ for all $1 \le k \le M$.
- Compute $V_t(k)$ with the recursion $V_t(k) = \min_l \left( V_{t-1}(l) - \log a_{lk} b_k(u_t) \right)$ for all $1 \le k \le M$, $2 \le t \le T$ (a minimization, since $W$ is a negative log-probability).
- Get the minimum value $W(S^*) = \min_{1 \le k \le M} V_T(k)$.
- Trace back through the argmin at each step to find the optimal state path $\{s^*_T, s^*_{T-1}, \ldots, s^*_1\}$.
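The four steps above can be sketched directly in NumPy. The parameters $\pi$, $A$, $B$ and the observation sequence are hypothetical toy values, not from these notes:

```python
import numpy as np

# Hypothetical toy discrete HMM, same shape as in the forward-backward sketch.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
U = [0, 0, 1, 0]
T, M = len(U), len(pi)

V = np.zeros((T, M))
back = np.zeros((T, M), dtype=int)       # argmin pointers for the trace-back
V[0] = -np.log(pi * B[:, U[0]])          # V_1(k) = -log pi_k b_k(u_1)
for t in range(1, T):
    # scores[l, k] = V_{t-1}(l) - log( a_lk b_k(u_t) )
    scores = V[t - 1][:, None] - np.log(A * B[:, U[t]])
    back[t] = scores.argmin(axis=0)
    V[t] = scores.min(axis=0)

# W(S*) = min_k V_T(k), then trace back the optimal path
path = [int(V[T - 1].argmin())]
for t in range(T - 1, 0, -1):
    path.append(int(back[t][path[-1]]))
path.reverse()                           # path[t] is the optimal state at time t+1
```

Working in negative log space avoids the numerical underflow that multiplying many small probabilities would cause for long sequences.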
VI. HMM Problem III:
Compute the maximum-likelihood parameter estimate $\lambda^* = \arg\max_{\lambda} P(U \mid \lambda)$.
The Baum-Welch algorithm is the EM estimation of the HMM parameters:
E-step: compute $L_k(t)$ and $H_{k,l}(t)$ under the current set of parameters $\lambda$, using the forward-backward recursions of Section III.
M-step: update the parameters:
Both models share the initial-state and transition updates
$$\hat\pi_k = \frac{L_k(1)}{\sum_{k=1}^{M} L_k(1)}, \qquad \hat a_{kl} = \frac{\sum_{t=1}^{T-1} H_{k,l}(t)}{\sum_{t=1}^{T-1} L_k(t)}$$
- For discrete HMM, the emission update is the expected count of symbol $v$ in state $k$:
$$\hat b_k(v) = \frac{\sum_{t=1}^{T} L_k(t)\, \mathbb{1}[u_t = v]}{\sum_{t=1}^{T} L_k(t)}$$
- For Gaussian HMM:
$$\hat\mu_k = \frac{\sum_{t=1}^{T} L_k(t)\, u_t}{\sum_{t=1}^{T} L_k(t)}, \qquad \hat\Sigma_k = \frac{\sum_{t=1}^{T} L_k(t)\,(u_t - \hat\mu_k)(u_t - \hat\mu_k)^{\top}}{\sum_{t=1}^{T} L_k(t)}$$
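One full EM iteration (E-step forward-backward, then the M-step updates) can be sketched for a 1-D Gaussian HMM. All starting values below are hypothetical toy numbers:

```python
import numpy as np

# Hypothetical toy 1-D Gaussian HMM with M = 2 states.
pi = np.array([0.5, 0.5])
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])
mu = np.array([0.0, 3.0])
var = np.array([1.0, 1.0])
U = np.array([0.1, -0.2, 3.1, 2.8, 0.0])   # scalar observations u_{1:T}
T, M = len(U), len(pi)

def emission(t):
    # b_k(u_t): Gaussian density of u_t under each state's (mu_k, var_k)
    return np.exp(-(U[t] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# E-step: forward-backward under the current parameters
alpha = np.zeros((T, M)); beta = np.zeros((T, M))
alpha[0] = pi * emission(0)
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ A) * emission(t)
beta[T - 1] = 1.0
for t in range(T - 2, -1, -1):
    beta[t] = A @ (emission(t + 1) * beta[t + 1])
PU = alpha[T - 1].sum()                      # P(U | lambda)
L = alpha * beta / PU                        # L_k(t)
H = np.array([alpha[t][:, None] * A * (emission(t + 1) * beta[t + 1])[None, :] / PU
              for t in range(T - 1)])        # H_{k,l}(t), shape (T-1, M, M)

# M-step: re-estimate the parameters
pi_new = L[0] / L[0].sum()
A_new = H.sum(axis=0) / L[:-1].sum(axis=0)[:, None]
mu_new = (L * U[:, None]).sum(axis=0) / L.sum(axis=0)
var_new = (L * (U[:, None] - mu_new) ** 2).sum(axis=0) / L.sum(axis=0)
```

In practice the two loops of the E-step are iterated with the M-step until $P(U \mid \lambda)$ stops improving; scaling or log-space arithmetic is needed for long sequences.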