Hidden Markov Model (HMM)

I. Notations

observation sequence: $U=\{u_t\}_{t=1}^{T}=\{u_1,u_2,\dots,u_T\}$

hidden state sequence: $S=\{s_t\}_{t=1}^{T}=\{s_1,s_2,\dots,s_T\}$

number of observations: $T$; number of states: $M$

parameter set: $\lambda=(\pi,A,B)$

$\pi=(\pi_k),\quad \pi_k=P(s_1=k),\quad 1\le k\le M$

$A=(a_{kl}),\quad a_{kl}=P(s_{t+1}=l \mid s_t=k),\quad 1\le k,l\le M$

$B=(b_k(u)),\quad b_k(u)=P(u_t=u \mid s_t=k),\quad 1\le k\le M$
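As a concrete illustration of the notation, the parameter set $\lambda=(\pi,A,B)$ for a small discrete HMM can be written down directly. A minimal sketch with $M=2$ states and 2 observation symbols; all numbers are illustrative assumptions, not values from the text:

```python
import numpy as np

# Toy discrete HMM with M = 2 hidden states and 2 observation symbols.
pi = np.array([0.6, 0.4])        # pi_k = P(s_1 = k)
A = np.array([[0.7, 0.3],        # a_kl = P(s_{t+1} = l | s_t = k)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],        # b_k(u) = P(u_t = u | s_t = k)
              [0.2, 0.8]])

# pi and each row of A and B are probability distributions.
assert np.isclose(pi.sum(), 1.0)
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(B.sum(axis=1), 1.0)
```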

II. Joint and Marginal Distributions

$P(U,S \mid \lambda)=P(S \mid \lambda)\,P(U \mid S,\lambda)=\pi_{s_1}b_{s_1}(u_1)\,a_{s_1 s_2}b_{s_2}(u_2)\cdots a_{s_{T-1} s_T}b_{s_T}(u_T)$

$P(U \mid \lambda)=\sum_S P(U,S \mid \lambda)=\sum_S \pi_{s_1}b_{s_1}(u_1)\,a_{s_1 s_2}b_{s_2}(u_2)\cdots a_{s_{T-1} s_T}b_{s_T}(u_T)$
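The marginalization above can be checked directly by enumerating all $M^T$ hidden paths. A brute-force sketch of this sum (the toy parameters and observation sequence are assumptions, not from the text):

```python
import itertools
import numpy as np

# Toy parameters (illustrative assumptions).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
U = [0, 1, 0]                    # observed symbols u_1..u_T
M, T = 2, len(U)

total = 0.0
for S in itertools.product(range(M), repeat=T):
    # P(U, S | lambda) = pi_{s1} b_{s1}(u1) * prod_t a_{s_{t-1} s_t} b_{s_t}(u_t)
    p = pi[S[0]] * B[S[0], U[0]]
    for t in range(1, T):
        p *= A[S[t-1], S[t]] * B[S[t], U[t]]
    total += p

print(total)                     # P(U | lambda)
```

This exponential enumeration is only feasible for tiny $T$; the forward recursion below computes the same quantity in $O(M^2 T)$.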

III. Important pre-defined terms:

  • $\alpha_k(t)\equiv P(u_{1:t},\,s_t=k \mid \lambda)$ with initialization $\alpha_k(1)=\pi_k b_k(u_1)$ and recursion $\alpha_k(t)=\left(\sum_{l=1}^{M}\alpha_l(t-1)\,a_{lk}\right)b_k(u_t)$

  • $\beta_k(t)\equiv P(u_{t+1:T} \mid s_t=k,\lambda)$ with initialization $\beta_k(T)=1$ and recursion $\beta_k(t)=\sum_{l=1}^{M}a_{kl}\,b_l(u_{t+1})\,\beta_l(t+1)$

  • $L_k(t)\equiv P(s_t=k \mid U,\lambda)$, the conditional probability of being in state $k$ at time $t$ given the observations $U$:

    $L_k(t)=\dfrac{P(U,s_t=k \mid \lambda)}{P(U \mid \lambda)}=\dfrac{P(u_{1:t},s_t=k \mid \lambda)\,P(u_{t+1:T} \mid s_t=k,\lambda)}{P(U \mid \lambda)}=\dfrac{\alpha_k(t)\,\beta_k(t)}{P(U \mid \lambda)}$
  • $H_{k,l}(t)\equiv P(s_t=k,\,s_{t+1}=l \mid U,\lambda)$, the conditional probability of being in state $k$ at time $t$ and in state $l$ at time $t+1$ given the observations $U$:

    $H_{k,l}(t)=\dfrac{P(U,s_t=k,s_{t+1}=l \mid \lambda)}{P(U \mid \lambda)}=\dfrac{\alpha_k(t)\,a_{kl}\,b_l(u_{t+1})\,\beta_l(t+1)}{P(U \mid \lambda)}$
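The four quantities above can be sketched in a few lines of NumPy. The parameters and observations are toy assumptions, and no scaling is applied (fine for short sequences, but raw probabilities underflow for long ones):

```python
import numpy as np

# Toy parameters (illustrative assumptions).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
U = [0, 1, 0]
M, T = 2, len(U)

# Forward pass: alpha_k(1) = pi_k b_k(u_1); alpha_k(t) = (sum_l alpha_l(t-1) a_lk) b_k(u_t)
alpha = np.zeros((T, M))
alpha[0] = pi * B[:, U[0]]
for t in range(1, T):
    alpha[t] = (alpha[t-1] @ A) * B[:, U[t]]

# Backward pass: beta_k(T) = 1; beta_k(t) = sum_l a_kl b_l(u_{t+1}) beta_l(t+1)
beta = np.ones((T, M))
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, U[t+1]] * beta[t+1])

PU = alpha[-1].sum()                           # P(U | lambda)
L = alpha * beta / PU                          # L_k(t)
H = np.array([np.outer(alpha[t], B[:, U[t+1]] * beta[t+1]) * A / PU
              for t in range(T - 1)])          # H_{k,l}(t)

# Sanity checks: both posteriors are distributions at each t.
assert np.allclose(L.sum(axis=1), 1.0)
assert np.allclose(H.sum(axis=(1, 2)), 1.0)
```

Marginalizing $H_{k,l}(t)$ over $l$ recovers $L_k(t)$, which is a useful consistency check in practice.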

IV. HMM Problem I:

Compute $P(U \mid \lambda)$ using the Forward-Backward algorithm

$P(U \mid \lambda)=P(u_{1:t},u_{t+1:T} \mid \lambda)=\sum_{k=1}^{M}P(u_{1:t},s_t=k,u_{t+1:T} \mid \lambda)$

$=\sum_{k=1}^{M}P(u_{1:t},s_t=k \mid \lambda)\,P(u_{t+1:T} \mid u_{1:t},s_t=k,\lambda)=\sum_{k=1}^{M}\alpha_k(t)\,\beta_k(t)=\sum_{k=1}^{M}\alpha_k(T)$
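A sketch of Problem I with the same toy parameters as above (illustrative assumptions): $P(U \mid \lambda)$ comes from $\sum_k \alpha_k(T)$ alone, and the derivation implies $\sum_k \alpha_k(t)\beta_k(t)$ gives the same value at every $t$, which the code checks:

```python
import numpy as np

# Toy parameters (illustrative assumptions).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
U = [0, 1, 0]
M, T = 2, len(U)

alpha = np.zeros((T, M))
alpha[0] = pi * B[:, U[0]]
for t in range(1, T):
    alpha[t] = (alpha[t-1] @ A) * B[:, U[t]]

beta = np.ones((T, M))
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, U[t+1]] * beta[t+1])

PU = alpha[-1].sum()             # P(U | lambda) = sum_k alpha_k(T)
# sum_k alpha_k(t) beta_k(t) equals P(U | lambda) at every t.
assert np.allclose((alpha * beta).sum(axis=1), PU)
print(PU)
```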

V. HMM Problem II:

Compute $\hat S=\arg\max_S P(S \mid U,\lambda)$ using the Viterbi algorithm

$W(S)\equiv -\log P(S,U \mid \lambda)=-\left[\log \pi_{s_1}b_{s_1}(u_1)+\sum_{t=2}^{T}\log a_{s_{t-1},s_t}b_{s_t}(u_t)\right]$

Then Problem II can be written as

$\hat S=\arg\max_S P(S \mid U,\lambda)=\arg\max_S \dfrac{P(S,U \mid \lambda)}{P(U \mid \lambda)}=\arg\max_S P(S,U \mid \lambda)=\arg\min_S W(S)$

Further define

$V_t(k)=\min_{s_1,s_2,\dots,s_{t-1}}\left[-\log P(s_{1:t-1},s_t=k,u_{1:t} \mid \lambda)\right]$

$=\min_{s_1,s_2,\dots,s_{t-1}}\left[-\log \pi_{s_1}b_{s_1}(u_1)-\sum_{t'=2}^{t-1}\log a_{s_{t'-1},s_{t'}}b_{s_{t'}}(u_{t'})-\log a_{s_{t-1},k}\,b_k(u_t)\right]$
  • Initialize $V_1(k)=-\log \pi_k b_k(u_1)$, $1\le k\le M$
  • Compute $V_t(k)$ with the recursion
    $V_t(k)=\min_l\left(V_{t-1}(l)-\log a_{lk}\,b_k(u_t)\right)$, $1\le k\le M$, $2\le t\le T$
  • Get the minimum value $W(\hat S)=\min_{1\le k\le M}V_T(k)$
  • Trace back to find the optimal state path $\{\hat s_T,\hat s_{T-1},\dots,\hat s_1\}$
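The four steps above can be sketched directly in the negative-log form, with back-pointers stored for the trace-back. The model parameters and observation sequence are toy assumptions, not from the text:

```python
import numpy as np

# Toy parameters (illustrative assumptions).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
U = [0, 1, 0]
M, T = 2, len(U)

V = np.zeros((T, M))
back = np.zeros((T, M), dtype=int)
V[0] = -np.log(pi * B[:, U[0]])                     # V_1(k) = -log pi_k b_k(u_1)
for t in range(1, T):
    # cand[l, k] = V_{t-1}(l) - log(a_lk b_k(u_t)); minimize over previous state l
    cand = V[t-1][:, None] - np.log(A * B[:, U[t]][None, :])
    back[t] = cand.argmin(axis=0)
    V[t] = cand.min(axis=0)

# W(S_hat) = min_k V_T(k); trace back from the minimizing final state.
path = [int(V[-1].argmin())]
for t in range(T - 1, 0, -1):
    path.append(int(back[t, path[-1]]))
path.reverse()
print(path, V[-1].min())
```

Working in negative log space avoids underflow for long sequences, which is why the minimization form is preferred over multiplying raw probabilities.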

VI. HMM Problem III:

Compute $\hat\lambda=\arg\max_\lambda P(U \mid \lambda)$ using the Baum-Welch algorithm

The Baum-Welch algorithm is the EM algorithm applied to HMM parameter estimation.

E-Step: under the current parameter set $\lambda^{\text{old}}$, compute $\alpha_k(t)$, $\beta_k(t)$, $L_k(t)$, $H_{k,l}(t)$

M-Step: update the parameters

  • For a discrete HMM:

    $\hat\pi_k=\dfrac{L_k(1)}{\sum_{k=1}^{M}L_k(1)},\quad \hat a_{kl}=\dfrac{\sum_{t=1}^{T-1}H_{k,l}(t)}{\sum_{t=1}^{T-1}L_k(t)},\quad \hat b_k(j)=\dfrac{\sum_{t=1}^{T}L_k(t)\,\mathbf{1}_{u_t=j}}{\sum_{t=1}^{T}L_k(t)}$
  • For a Gaussian HMM:

    $\hat\pi_k=\dfrac{L_k(1)}{\sum_{k=1}^{M}L_k(1)},\quad \hat a_{kl}=\dfrac{\sum_{t=1}^{T-1}H_{k,l}(t)}{\sum_{t=1}^{T-1}L_k(t)},\quad \hat\mu_k=\dfrac{\sum_{t=1}^{T}L_k(t)\,u_t}{\sum_{t=1}^{T}L_k(t)},\quad \hat\Sigma_k=\dfrac{\sum_{t=1}^{T}L_k(t)\,(u_t-\hat\mu_k)(u_t-\hat\mu_k)^{\mathsf T}}{\sum_{t=1}^{T}L_k(t)}$
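One EM iteration for the discrete case can be sketched as follows: the E-step computes $\alpha$, $\beta$, $L$, $H$ under $\lambda^{\text{old}}$, and the M-step applies the discrete update formulas above. Toy parameters and data (assumptions, not from the text); a single observation sequence, no scaling:

```python
import numpy as np

# Toy parameters and data (illustrative assumptions).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
U = [0, 1, 0, 0]
M, T = 2, len(U)

def forward_backward(pi, A, B, U):
    T = len(U)
    alpha = np.zeros((T, M)); beta = np.ones((T, M))
    alpha[0] = pi * B[:, U[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t-1] @ A) * B[:, U[t]]
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, U[t+1]] * beta[t+1])
    return alpha, beta

# E-step: alpha, beta, then L_k(t) and H_kl(t) under lambda_old.
alpha, beta = forward_backward(pi, A, B, U)
PU = alpha[-1].sum()
L = alpha * beta / PU
H = np.array([np.outer(alpha[t], B[:, U[t+1]] * beta[t+1]) * A / PU
              for t in range(T - 1)])

# M-step: re-estimate pi, A, B from the expected counts.
pi_new = L[0] / L[0].sum()
A_new = H.sum(axis=0) / L[:-1].sum(axis=0)[:, None]
B_new = np.zeros_like(B)
for j in range(B.shape[1]):
    B_new[:, j] = L[np.array(U) == j].sum(axis=0) / L.sum(axis=0)

# EM guarantees the likelihood P(U | lambda) never decreases.
alpha_new, _ = forward_backward(pi_new, A_new, B_new, U)
assert alpha_new[-1].sum() >= PU - 1e-12
```

In practice the E- and M-steps are iterated to convergence, the recursions are scaled (or run in log space) to avoid underflow, and the expected counts are accumulated over many observation sequences rather than one.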