August 15th, 2003, 5:41 am
Hi,

Imagine you have a time series, say x_1, ..., x_n (the temperature at 12pm on each of n days, a timely example right now, isn't it). You take your favorite statistics software (mine is S+) and run the ACF plot. You see nice bars that indicate the level of correlation at each lag.

My question is the following. In addition to the bars, many ACF plots draw a confidence interval. It is based on the iid-Gaussian assumption: a horizontal line at (+/-)1.96/sqrt(n), where n is the length of the time series. But why is this confidence interval the same (horizontal) at every lag? For example, computing the ACF at lag 1 loses 1 element of the original time series, so the number of usable pairs is n-1. At lag k, k elements are lost, so only n-k pairs remain. If you compute the CI using the previous formula with n-k in place of n, the confidence bands widen as the lag increases.

Can someone help me?!
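For concreteness, here is a minimal sketch of the two bands I mean (in Python with NumPy/Matplotlib rather than S+, and with a hypothetical AR(1) series standing in for the temperatures): the flat line at (+/-)1.96/sqrt(n) that the software draws, versus the lag-dependent (+/-)1.96/sqrt(n-k) I would have expected.

# Minimal sketch: sample ACF with two candidate confidence bands.
# The AR(1) data below is purely illustrative, not real temperatures.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n, max_lag = 200, 30

# Simulate an AR(1) series as a stand-in for the daily-temperature example.
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()

# Sample ACF at lag k: sum over the n-k pairs (x_t, x_{t+k}) of the
# demeaned series, divided by the full sum of squares.
xc = x - x.mean()
denom = np.sum(xc ** 2)
lags = np.arange(1, max_lag + 1)
acf = np.array([np.sum(xc[k:] * xc[:-k]) / denom for k in lags])

plt.bar(lags, acf, width=0.4, label="sample ACF")
# Flat iid-Gaussian band, as drawn by most software.
plt.axhline(1.96 / np.sqrt(n), color="red", label="+/-1.96/sqrt(n)")
plt.axhline(-1.96 / np.sqrt(n), color="red")
# Lag-dependent band reflecting the n-k pairs actually used at lag k.
plt.plot(lags, 1.96 / np.sqrt(n - lags), "g--", label="+/-1.96/sqrt(n-k)")
plt.plot(lags, -1.96 / np.sqrt(n - lags), "g--")
plt.legend()
plt.xlabel("lag k")
plt.ylabel("autocorrelation")
plt.show()

Running this with n = 200, the dashed (n-k) band sits only slightly outside the flat one at small lags and slowly widens as k grows, which is exactly the behavior I am asking about.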