Hopefully the following summarizes in a few simple steps how to fit a GARCH(1,1) process to a time series; gurus (reza, nastradamus, or anyone else who's fit GARCH before) please confirm or correct this:

1. Let u(i) = (X(i) - X(i-1))/X(i-1) be the series of discrete returns of X.

2. Initialize v(1,w,a,b) = w, i.e. set the first variance to the overall (long-run) variance.

3. Define v(i,w,a,b) = (1 - a - b)*w + a*[u(i-1)]^2 + b*v(i-1,w,a,b).

4. Define z(i,w,a,b) = u(i)/sqrt(v(i,w,a,b)) as the series of "z-scores" of the returns.

5. Minimize [Mean(z)^2 + (Stdev(z) - 1)^2 + Skew(z)^2] w.r.t. <w,a,b> with constraints a + b <= 1, w > 0, a >= 0, b >= 0. In other words, fit the parameters <w,a,b> so that the z-scores are as close as possible to standard normal, while disallowing negative values that could cause instability.

Not so much trying to set the record for the shortest GARCH modeling how-to, but hopefully this is both simple and correct!
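For what it's worth, here is a minimal sketch of the recipe in Python, assuming numpy and scipy are available. The function name `fit_garch_by_moments`, the starting guesses, and the synthetic price series are my own illustrative choices, not part of the recipe itself:

```python
# Sketch of the five-step GARCH(1,1) moment-matching fit described above.
# Assumes numpy and scipy; names and starting values are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import skew

def fit_garch_by_moments(prices):
    # Step 1: discrete returns u(i) = (X(i) - X(i-1)) / X(i-1)
    u = np.diff(prices) / prices[:-1]
    n = len(u)

    def zscores(params):
        w, a, b = params
        v = np.empty(n)
        v[0] = w                          # Step 2: initialize v(1) = w
        for i in range(1, n):             # Step 3: GARCH(1,1) recursion
            v[i] = max((1 - a - b) * w + a * u[i - 1] ** 2 + b * v[i - 1],
                       1e-12)             # tiny floor for numerical safety
        return u / np.sqrt(v)             # Step 4: "z-scores" of returns

    def objective(params):                # Step 5: moment-matching loss
        z = zscores(params)
        return z.mean() ** 2 + (z.std() - 1) ** 2 + skew(z) ** 2

    res = minimize(
        objective,
        x0=[np.var(u), 0.05, 0.90],       # rough starting guesses
        bounds=[(1e-12, None), (0.0, 1.0), (0.0, 1.0)],
        constraints=[{"type": "ineq", "fun": lambda p: 1 - p[1] - p[2]}],
        method="SLSQP",
    )
    return res.x                          # fitted <w, a, b>

# Usage: fit to a synthetic random-walk price series (for illustration only)
rng = np.random.default_rng(0)
prices = 100 * np.cumprod(1 + 0.01 * rng.standard_normal(500))
w, a, b = fit_garch_by_moments(prices)
```

Note this is a moment-matching fit as described in the steps, not the usual maximum-likelihood approach, so take the optimizer settings as one workable choice rather than gospel.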