Bayes nets find the maximum expected utility given (partial) probability models
VPI (value of perfect information): the usefulness of revealing each piece of information before deciding
- VPI ≥ 0 (the expected value of information is never negative)
- VPI(A, B | c) ≠ VPI(A | c) + VPI(B | c)
- VPI(A, B | c) = VPI(A | c) + VPI(B | c, A) (reveal A first, then B given A; worked sketch below)
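A minimal Python sketch of the VPI computation for a hypothetical umbrella-vs-rain decision (the states, forecast accuracies, and utilities below are made-up assumptions, not from the notes): VPI(E) is the expected MEU after observing E minus the MEU from acting on the prior alone.

```python
# Hypothetical example: how much is a weather forecast worth before deciding
# whether to carry an umbrella? All numbers are assumptions for illustration.

P_s = {"rain": 0.3, "sun": 0.7}                       # prior over the hidden state
P_e_given_s = {                                       # forecast accuracy P(e | s)
    "rain": {"say_rain": 0.8, "say_sun": 0.2},
    "sun":  {"say_rain": 0.1, "say_sun": 0.9},
}
U = {                                                 # utility U(action, state)
    ("umbrella", "rain"): 70, ("umbrella", "sun"): 20,
    ("no_umbrella", "rain"): 0, ("no_umbrella", "sun"): 100,
}
actions = ["umbrella", "no_umbrella"]

def meu(belief):
    """Max expected utility under a distribution over states."""
    return max(sum(belief[s] * U[(a, s)] for s in belief) for a in actions)

meu_prior = meu(P_s)                                  # decide using the prior alone

# Expected MEU after seeing the forecast, weighted by how likely each forecast is.
P_e = {e: sum(P_s[s] * P_e_given_s[s][e] for s in P_s) for e in ("say_rain", "say_sun")}
meu_with_forecast = 0.0
for e, pe in P_e.items():
    posterior = {s: P_s[s] * P_e_given_s[s][e] / pe for s in P_s}   # Bayes' rule
    meu_with_forecast += pe * meu(posterior)

print("VPI(forecast) =", meu_with_forecast - meu_prior)   # nonnegative by construction
```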
Markov Models
Transition Model: The future is independent of the past given the present. $S_0 \rightarrow S_1 \rightarrow ...$
Stationary Process: the transition dynamics do not change over time
- $P(S_t|S_{t-1}) = P(S_{t-1} | S_{t-2})$ is predefined
- Finite number of parameters to define an infinite network
- The state distribution converges in the limit (the stationary distribution): $P_\infty(X)=P_{\infty+1}(X)$
- Solving for the converging $P_\infty$: with $p = P_\infty(s)$, plug in and solve $P(s_1 | s_0)\,p + P(s_1 | \neg s_0)\,(1-p) = p$ (sketch below)
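A short sketch of solving that fixed-point equation for a two-state chain (the transition probabilities 0.9 and 0.3 are assumed for illustration), both in closed form and by iterating the one-step update until it converges:

```python
# Two-state chain: p = P_inf(s) satisfies p = P(s|s)*p + P(s|not s)*(1-p).
# Transition probabilities below are assumptions for illustration.
P_s_given_s = 0.9        # P(S_t = s | S_{t-1} = s)
P_s_given_not_s = 0.3    # P(S_t = s | S_{t-1} = not s)

# Closed form: p * (1 - P(s|s) + P(s|not s)) = P(s|not s)
p_inf = P_s_given_not_s / (1 - P_s_given_s + P_s_given_not_s)
print("closed form P_inf(s) =", p_inf)                # 0.3 / 0.4 = 0.75

# Sanity check: iterate the one-step update; it converges to the same value.
p = 0.5
for _ in range(100):
    p = P_s_given_s * p + P_s_given_not_s * (1 - p)
print("iterated    P_inf(s) =", p)
```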
Hidden Markov Models
HMM: a Markov model that emits an observation at each time step (see the sketch after the bullets below)
Observation Model: Each state can generate its own observation $E_t\leftarrow S_t \rightarrow S_{t+1}$
- $P(E_t|S_t)$ is predefined
- Current evidence is independent of everything else given the current state
(Evidence variables are not unconditionally independent of each other)
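A tiny sketch of a fully specified HMM (hypothetical rain/sun states with umbrella observations; every number is an assumption): the transition model $P(S_t|S_{t-1})$, the observation model $P(E_t|S_t)$, and an initial distribution are all that is needed, e.g. to sample a state/evidence sequence.

```python
import random

# Hypothetical rain/sun HMM with umbrella observations; all numbers are assumptions.
P_init = {"rain": 0.5, "sun": 0.5}           # initial state distribution
P_trans = {                                  # transition model P(S_t | S_{t-1})
    "rain": {"rain": 0.7, "sun": 0.3},
    "sun":  {"rain": 0.3, "sun": 0.7},
}
P_obs = {                                    # observation model P(E_t | S_t)
    "rain": {"umbrella": 0.9, "no_umbrella": 0.1},
    "sun":  {"umbrella": 0.2, "no_umbrella": 0.8},
}

def sample(dist):
    """Draw one value from a {value: probability} dict."""
    r, acc = random.random(), 0.0
    for value, p in dist.items():
        acc += p
        if r < acc:
            return value
    return value                             # guard against rounding

# E_t depends only on S_t; S_{t+1} depends only on S_t (Markov property).
state = sample(P_init)
for t in range(5):
    print(t, state, sample(P_obs[state]))
    state = sample(P_trans[state])
```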
Inference Tasks
Filtering