January 4th, 2011, 5:21 pm
Hello,

I'm working with the following C# code to calculate a linear regression for a time series, and I can't figure out how to do it without an intercept. Any suggestions would be appreciated a ton.

Thanks

public Regression(double[] dblList1, double[] dblList2)
{
    // Compute the means of x and y.
    double xAvg = 0;
    double yAvg = 0;
    int x = 0;
    foreach (double dblValue1 in dblList1)
    {
        xAvg += dblValue1;
        yAvg += dblList2[x];
        x += 1;
    }
    xAvg = xAvg / dblList1.Length;
    yAvg = yAvg / dblList1.Length;

    // Accumulate the covariance of x and y (v1) and the variance of x (v2).
    double v1 = 0;
    double v2 = 0;
    x = 0;
    foreach (double dblValue1 in dblList1)
    {
        v1 += (dblValue1 - xAvg) * (dblList2[x] - yAvg);
        v2 += Math.Pow(dblValue1 - xAvg, 2);
        x += 1;
    }

    // Ordinary least squares: slope a and intercept b.
    a = v1 / v2;
    b = yAvg - a * xAvg;
    //b = 0;

    // Fitted values of the trend line.
    double[] cTemp = new double[dblList1.Length];
    x = 0;
    foreach (double dblValue1 in dblList1)
    {
        cTemp[x] = a * dblValue1 + b;
        x += 1;
    }
    c = cTemp;

    printOut = "y = ax + b" + crLf;
    printOut += "a = " + Math.Round(a, 2) + ", the slope of the trend line." + crLf;
    printOut += "b = " + Math.Round(b, 2) + ", the intercept of the trend line." + crLf;
}
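For reference, a regression through the origin fixes b = 0, and minimizing the squared residuals then gives the slope as a = sum(x*y) / sum(x*x) over the raw values, with no mean-centering. Note that simply setting b = 0 after the centered fit above (the commented-out //b = 0;) does not give this least-squares slope. Below is a minimal sketch of that calculation; the method name SlopeThroughOrigin and the xs/ys parameter names are just for illustration:

using System;

public static class NoInterceptRegression
{
    // Least-squares slope with the intercept constrained to 0:
    // minimizing sum((y - a*x)^2) gives a = sum(x*y) / sum(x*x).
    public static double SlopeThroughOrigin(double[] xs, double[] ys)
    {
        double sumXY = 0;
        double sumXX = 0;
        for (int i = 0; i < xs.Length; i++)
        {
            sumXY += xs[i] * ys[i];
            sumXX += xs[i] * xs[i];
        }
        return sumXY / sumXX;
    }
}

The fitted values are then just a * x for each x, so in the constructor above you would replace a = v1 / v2 and b = yAvg - a * xAvg with the raw-sum slope and b = 0.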