Copyright: © 2015 | Pages: 13

DOI: 10.4018/978-1-4666-8315-0.ch008

Chapter Preview

In order to accurately analyze HMMs (Eddy, 2004), we propose to apply the formalized DTMC to formally define HMMs in higher-order logic as extended DTMC models and to verify their properties.

An HMM is a pair of stochastic processes {X_k, Y_k}_{k≥0}, where {X_k}_{k≥0} is a Markov chain and {Y_k}_{k≥0} is conditionally independent given {X_k}, i.e., Y_k depends only on X_k and not on any X_t with t ≠ k. HMMs model situations where an experimenter sees some observables at every instant (mathematically represented by Y_k) and suspects these observables to be the outcome of a process that can be modeled by a Markov chain ({X_k}_{k≥0}). The name “Hidden Markov Model” arises from the fact that the state in which the model is at a particular instant is not available to the observer. An HMM is defined as a parameterized triple (A, B, π(0)) satisfying the following conditions:

**1.** A hidden Markov chain {X_k}_{k≥0} with a finite state space S, the initial distribution π(0) = {π_i(0)}_{i∈S} and the transition probabilities A = {a_ij}_{i∈S, j∈S}.

**2.** A random process {Y_k}_{k≥0} with finite state space O. The hidden Markov chain and the random process are associated by the emission probabilities B = {b_j(O_k)}_{j∈S, k∈O} = {Pr{Y_n = O_k | X_n = j}}_{j∈S, k∈O}, which implies that:

**a.** ∀ j k. b_j(O_k) ≥ 0,

**b.** ∑_{k∈O} b_j(O_k) = 1.

**3.** The random process {Y_k}_{k≥0} and the hidden Markov chain {X_k}_{k≥0} satisfy the conditional independence property described above.

This yields the formalization (Liu, 2013):

**Definition 8.1**

⊢ ∀ X Y p s_X s_Y p_0 p_ij p_XY.
hmm X Y p s_X s_Y p_0 p_ij p_XY =
dtmc X p s_X p_0 p_ij ∧
(∀ t. random_variable (Y t) p s_Y) ∧
(∀ i. i ∈ space s_Y ⇒ {i} ∈ subsets s_Y) ∧
(∀ t a i. ℙ {x | X t x = i} ≠ 0 ⇒
ℙ ({x | Y t x = a} | {x | X t x = i}) = p_XY t a i) ∧
(∀ t a i tx0 ty0 sts_X sts_Y ts_X ts_Y.
t ∉ {tx0 + m | m ∈ ts_X} ∧ t ∉ {ty0 + m | m ∈ ts_Y} ∧
ℙ ({x | X t x = i} ∩ ⋂_{k∈ts_X} {x | X (tx0 + k) x = EL k sts_X} ∩ ⋂_{k∈ts_Y} {x | Y (ty0 + k) x = EL k sts_Y}) ≠ 0 ⇒
ℙ ({x | Y t x = a} | {x | X t x = i} ∩ ⋂_{k∈ts_X} {x | X (tx0 + k) x = EL k sts_X} ∩ ⋂_{k∈ts_Y} {x | Y (ty0 + k) x = EL k sts_Y}) =
ℙ ({x | Y t x = a} | {x | X t x = i}))

In this definition, the variable X denotes the random variable of the underlying DTMC (as the first conjunct constrains), Y indicates the random observations (so Y t is a random process, as the second conjunct describes), and p_XY indicates the emission probabilities, i.e., the probability of obtaining a particular value of Y depending on the state X. Like the second condition in Definition 6.3, the conjunct (∀ i. i ∈ space s_Y ⇒ {i} ∈ subsets s_Y) ensures that the event space is a discrete space. The conjunct (∀ t a i. ℙ {x | X t x = i} ≠ 0 ⇒ ℙ ({x | Y t x = a} | {x | X t x = i}) = p_XY t a i) assigns the function p_XY to the emission probabilities under the condition ℙ {x | X t x = i} ≠ 0, which ensures that the corresponding conditional probabilities are well-defined. The non-trivial conjunct in the above definition is the last one, which formalizes the notion of conditional independence mentioned above. In our work, we mainly consider discrete-time, finite-state-space HMMs, which is the most frequently used case.
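The emission-probability conjunct can be illustrated by simulation. The sketch below (with hypothetical parameters, not from the chapter) samples paths of a small HMM in which each Y_t is drawn using only the current X_t, then estimates ℙ(Y_t = a | X_t = i) empirically and compares it with the emission matrix B, mirroring the requirement that this conditional probability equals p_XY t a i.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-state, 3-symbol HMM parameters (hypothetical values).
pi0 = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])

def sample_path(n):
    """Sample (X_0..X_{n-1}, Y_0..Y_{n-1}) from the HMM."""
    xs, ys = [], []
    x = rng.choice(2, p=pi0)
    for _ in range(n):
        xs.append(x)
        ys.append(rng.choice(3, p=B[x]))  # Y_t depends only on X_t
        x = rng.choice(2, p=A[x])         # X advances as a Markov chain
    return np.array(xs), np.array(ys)

xs, ys = sample_path(200_000)

# Estimate P(Y_t = a | X_t = i) from the samples; by the emission
# conjunct this should match b_i(O_a), up to sampling noise.
for i in range(2):
    mask = xs == i
    for a in range(3):
        est = (ys[mask] == a).mean()
        assert abs(est - B[i, a]) < 0.01
```

This is only a numerical sanity check; the HOL formalization establishes the same property deductively, for all t, a, and i, rather than by sampling.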

Time-homogeneous HMMs can also be formalized in a way similar to time-homogeneous DTMCs (Liu, 2013). Note that, in practice, time-homogeneous HMMs always have a finite state space (Liu, Aravantinos, Hasan & Tahar, 2014).

**Definition 8.2**
