Serving the Quantitative Finance Community

 
User avatar
Yair1978
Topic Author
Posts: 0
Joined: June 8th, 2011, 7:30 am

VAR analysis

August 5th, 2011, 3:58 pm

Hi,

I'm failing to see what was so innovative about vector autoregressive analysis. Say I want to predict a certain variable Y at some future date T, and to do so I regress it on lags of itself and on another variable X. Since what interests me is a future prediction of Y, I would in any case need a future value of X, so I would need a way of predicting X as well. That means I can't simply say X is exogenous: when estimating the regression of Y on lags of itself and on X I would treat it as exogenous, but I would have to treat it as endogenous to produce a forecast.

VAR handles this by regressing Y on lags of itself and on lags of X, and similarly for X; but I would have had to do something similar anyway, so what is the innovation of VAR analysis?

Sincerely,
Yair
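To make the question concrete, here is a minimal numpy-only sketch of the setup being asked about: a bivariate VAR(1) in which Y and X are modelled jointly, so the system produces its own forecast of X rather than requiring one from outside. All names and numbers below are illustrative, not from any post in this thread; intercepts are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a true bivariate VAR(1): z_t = A z_{t-1} + e_t, with z_t = (Y_t, X_t)
A_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
T = 500
z = np.zeros((T, 2))
for t in range(1, T):
    z[t] = A_true @ z[t - 1] + rng.standard_normal(2) * 0.1

# Estimate by OLS, equation by equation: regress z_t on z_{t-1}
Z_lag = z[:-1]          # regressors: lagged Y and lagged X
Z_now = z[1:]           # targets: current Y and X
B, *_ = np.linalg.lstsq(Z_lag, Z_now, rcond=None)
A_hat = B.T             # row i holds the lag coefficients of equation i

# One-step-ahead forecast of BOTH Y and X from the same fitted system
forecast = A_hat @ z[-1]
print("estimated A:\n", A_hat)
print("forecast (Y, X):", forecast)
```

The point of the example is the last line: because X has its own equation inside the system, forecasting Y never requires an externally supplied future value of X.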
 
User avatar
ronm
Posts: 0
Joined: June 8th, 2007, 9:00 am

VAR analysis

August 8th, 2011, 6:10 am

When X is exogenous, it is exogenous with respect to the system underlying the Y variables. You may therefore have some control, as a policy maker, over the future value of X (say you want to fix a certain interest rate to control a price index). If you have no control over X, then your forecast of Y will be conditional on some assumed value of X.

The intuition behind VAR (I think) is to provide a framework for analysing a group of time-series variables simultaneously. Obviously you can analyse them through each individual equation, and the parameter estimates will not change whether you estimate the full model or each equation separately, at least asymptotically. However, many other types of analysis can only be done on the full model, such as orthogonalized impulse response analysis (O-IRF). That requires factoring the estimated variance-covariance matrix of the innovations, which can only be done on the full model after explicitly defining the inter-relationships among the variables. VAR just estimates the parameters legitimately.

Regards,
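The factoring step mentioned above can be sketched in a few lines. This is a hypothetical illustration, not code from the thread: given an already-estimated VAR(1) coefficient matrix and innovation variance-covariance matrix (the numbers here are made up), the orthogonalized impulse responses come from a Cholesky factor of that matrix, which is exactly the object that only exists once the full system has been estimated.

```python
import numpy as np

# Suppose a bivariate VAR(1) has been estimated: z_t = A z_{t-1} + e_t,
# with innovation variance-covariance matrix Sigma (illustrative values).
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])

# Orthogonalize the innovations via a Cholesky factor: Sigma = P P'.
# This is the step that needs the full system's VCV matrix.
P = np.linalg.cholesky(Sigma)

# Orthogonalized impulse response at horizon h: Theta_h = A^h P.
# Column j gives the response of each variable to a one-s.d. shock in variable j.
for h in range(4):
    Theta_h = np.linalg.matrix_power(A, h) @ P
    print(f"h={h}:\n{Theta_h}")
```

Note that the Cholesky ordering (which variable comes first) encodes the assumed contemporaneous inter-relationships, which is why the analysis cannot be done equation by equation.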
 
User avatar
acastaldo
Posts: 14
Joined: October 11th, 2002, 11:24 pm

VAR analysis

October 10th, 2011, 12:05 pm

Quote
what's the innovation of VAR analysis?

According to Karl Whelan:

Quote
VAR was introduced by Christopher Sims (1980) in a path-breaking article titled "Macroeconomics and Reality." Sims was indeed telling the macro profession to "get real." He criticized the widespread use of highly specified macro-models that made very strong identifying restrictions (in the sense that each equation in the model usually excluded most of the model's other variables from the right-hand side) as well as very strong assumptions about the dynamic nature of these relationships. VARs were an alternative that allowed one to model macroeconomic data accurately, without having to impose lots of incredible restrictions: "macro modelling without pretending to have too much a priori theory."

And for this, Christopher Sims received 1/2 of the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel 2011.
 
User avatar
SierpinskyJanitor
Posts: 1
Joined: March 29th, 2005, 12:55 pm

VAR analysis

October 10th, 2011, 12:13 pm

The innovation is simply that it allows technically disinclined, MBA-holding risk managers to utter a few quanty-sounding sound bites about risk. VAR is pretty useless in fact, but like any other benchmark it is better than nothing.
 
User avatar
bearish
Posts: 5906
Joined: February 3rd, 2011, 2:19 pm

VAR analysis

October 10th, 2011, 1:45 pm

Uhmmm. This thread is on VAR (vector autoregressive analysis) not VaR (value at risk).
 
User avatar
Aaron
Posts: 4
Joined: July 23rd, 2001, 3:46 pm

VAR analysis

October 12th, 2011, 8:43 pm

Value-at-Risk is extremely useful. Vector Autoregression is not. It was no worse than the large structural models it replaced, but no better, and it has never been shown to give useful predictions.

The problem is too many parameters to fit. With k variables and l lags, you need to fit k^2*l coefficients. With 12 economic variables and 4 lags, which is a pretty minimal system, that's 576 values. It generally takes about 30 observations per parameter to get useful fits, so that's 17,280 months of data (monthly being the highest frequency at which most economic data is available). That's 1,440 years over which you have to assume the relations are constant and linear. If you cut the number of variables and lags, the model becomes too unrealistic; moreover, slightly different choices of variables and lag length give totally different predictions. If you fit on adjusted data, your model cannot predict next month's value until next month is over; if you use unadjusted data, noise overwhelms signal. And there are intractable numerical problems in selecting parameters.

It doesn't even work well in the application for which it was invented, system identification. There you have plenty of data, since you generally measure with high frequency and precision, and you have some reason to believe the system is approximately linear over some range and stable. Nevertheless, it doesn't work.
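The back-of-the-envelope count above can be checked in a few lines; the 30-observations-per-parameter figure is the poster's rule of thumb, not a theorem.

```python
# Parameter count for a VAR with k variables and l lags
k, l = 12, 4
coefficients = k**2 * l            # each of k equations has k*l lag coefficients
months_needed = 30 * coefficients  # rule of thumb: ~30 observations per parameter
years_needed = months_needed / 12
print(coefficients, months_needed, years_needed)  # 576, 17280, 1440.0
```

Intercepts and the k(k+1)/2 free entries of the innovation covariance matrix would push the count higher still.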