April 6th, 2013, 2:12 pm
A Markov chain is a stochastic process {X(t)} that takes place in some "state space", with Markov transition rules. An HMM is a bivariate Markov process {Y(t), X(t)}, where the {X(t)} sequence is Markov in itself. Traditionally, in the HMM, one thinks of the Y(t) as observables and the X(t) as hidden, but the probabilistic structure is the same regardless, and ditto for the simple Markov chain {X(t)} of the first process.

For price prediction (or price description), certainly the world contains lots of hidden states. For example: current volatility in stochastic volatility models. Or maybe {X(t)} is hidden to some people, but not to others, like the current probability of a takeover announcement within the next month.

These structures are very general. If you think of the X(t) in the HMM as simply (unknown) parameters, to be estimated from data, then every model in the social sciences and all the physical sciences is an HMM of one sort or another. In the (degenerate) case of basic physical law, the X(t) are constant states (parameters), but their value is not known and must be estimated. So every model is an HMM: even in QED, the most precise physical law/model of all time (?), the fine structure constant and masses have to be estimated. Certainly in the so-called Standard Model, there are lots of parameters that need to be fitted.

The point is, in HMMs there is a blurry distinction between unknown parameters and hidden states. In GBM, the volatility is a parameter; in SV models it is a hidden state, or if you like, a parameter promoted to a stochastic parameter process.
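For concreteness, here is a minimal sketch of that "volatility promoted to a hidden state" picture: a two-regime volatility HMM, i.e. a bivariate process {Y(t), X(t)} with the hidden chain X(t) switching between a low-vol and a high-vol regime and the observed returns Y(t) driven by whichever regime is active. The two-state setup, the transition matrix P, and the sigma values are purely illustrative choices, not anything canonical.

```python
import numpy as np

# Illustrative two-state hidden-volatility HMM (assumed parameters, not fitted).
rng = np.random.default_rng(0)

P = np.array([[0.98, 0.02],      # transition probabilities of the hidden chain X(t)
              [0.05, 0.95]])
sigma = np.array([0.01, 0.03])   # per-step volatility in each hidden regime
mu = 0.0                         # drift, kept at zero for simplicity

T = 1000
X = np.zeros(T, dtype=int)       # hidden states
Y = np.zeros(T)                  # observed returns

for t in range(1, T):
    X[t] = rng.choice(2, p=P[X[t - 1]])              # X(t) is Markov in itself
    Y[t] = mu + sigma[X[t]] * rng.standard_normal()  # Y(t) depends only on the current X(t)

# If sigma were a single fixed (but unknown) number, this collapses to the
# "degenerate" case above: a constant parameter to be estimated rather than
# a genuinely hidden Markov state.
print("realized vol per regime:", [Y[X == k].std() for k in range(2)])
```

Estimating sigma and P from the observed Y(t) alone (e.g. with the usual Baum-Welch/EM recursions) is then exactly the "hidden state vs. unknown parameter" estimation problem discussed above.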