September 24th, 2022, 5:20 pm

The function in the post above optimizes the weights/parameters given an objective function to be optimized. In my case, the objective function is the output of a small independent function that takes some time-series parameters.

In a neural network, the objective function represents the goodness of fit to the target output. When optimizing the weights of a neural network, you can also use the mean squared error between the network output and the target value, but that mean squared error would have to be minimized, whereas I am maximizing my objective function. I want to explain the program in the context of neural networks.

Here is the function call I use to calculate the objective function:

[ObjOld] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w3,w4,w5,w6);


Here the network evaluation function returns the result ObjOld, which you can replace with the sum of mean squared errors between the neural network's calculated values of the target function and the actual values of the target function. Xt1, Xt2, Price0, VarLogR, and Tl are input arrays required to calculate the objective function given the network weights w1, w2, w3, w4, w5, w6.

We want to calculate the weights w1, w2, w3, w4, w5, w6 of the neural network so that the objective function ObjOld is maximized, or alternatively so that a mean squared error/loss function is minimized.
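Since the routine maximizes its objective, a loss such as mean squared error can be plugged in simply by negating it. Here is a minimal Python sketch of that conversion (the function names here are illustrative, not from the original program):

```python
def mse(predictions, targets):
    # mean squared error: lower is better
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

def objective(predictions, targets):
    # negate the loss so that "larger is better", matching a maximization routine
    return -mse(predictions, targets)

# maximizing the negated MSE is exactly the same as minimizing the MSE
print(objective([1.0, 2.0], [1.0, 3.0]))  # -0.5
```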

If you have a multi-layered neural network, simply add the weights of the different layers sequentially to the main optimization loop (possibly using a loop to update all the weights one after the other), and the algorithm will optimize them as well.

Here is the main optimization loop of the program, which repeats the optimization over the six weights a thousand times.

In my case all parameters were supposed to lie between -1 and 1, so I used a single value of Delta (the projected change in a weight in a single optimization step) for all the weights. You will need some idea of the domain over which the weights take values. If the domains of the various weights differ, you can use a different value of Delta for each weight, proportional to its domain, and decrease the values for the larger domains slightly faster.
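One simple way to scale the step size per weight, as suggested above, is to make each Delta proportional to the half-width of that weight's domain. A hypothetical Python helper (not part of the original program):

```python
def initial_deltas(bounds, fraction=0.15):
    # bounds: list of (low, high) domains, one per weight
    # step size = fraction of the half-width of each weight's domain
    return [fraction * (hi - lo) / 2.0 for lo, hi in bounds]

# a weight on [-1, 1] gets a step of about 0.15; one on [-10, 10] gets 1.5
print(initial_deltas([(-1, 1), (-10, 10)]))
```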

As I mentioned, Delta is the projected change in the weights, and it starts with a large value, as if we were evaluating on a coarse grid. I took an initial value of Delta = .15 when the domain of my weights is between -1 and 1, but I think an initial value of .2 or .25 might work even better in some difficult cases.

We do 1000 iterations of the loop that optimizes over the weights. At the start of each iteration, we set the variable Update to zero. Whenever even a single weight is updated, Update is set to one. When no weight is updated, we know it is time to decrease the step size Delta by a suitable factor. I have chosen this factor as .85 (as in the code below), but please feel free to experiment. After decreasing Delta, the optimization starts again with a smaller step size, as if on a slightly finer grid. In this way the optimization step Delta keeps decreasing, and the search for the best weights is done on a successively finer grid.

Delta=.15;

for nn=1:1000

    Update=0;

    % Weight w1: try a step of +Delta; if that does not improve the
    % objective, try a step of -Delta instead.
    w10=w1+Delta;
    [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w10,w2,w3,w4,w5,w6);
    if(ObjNew>ObjOld)
        w1=w10;
        Update=1;
        ObjOld=ObjNew;
    else
        w10=w1-Delta;
        [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w10,w2,w3,w4,w5,w6);
        if(ObjNew>ObjOld)
            w1=w10;
            Update=1;
            ObjOld=ObjNew;
        end
    end

    % Weight w2
    w20=w2+Delta;
    [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w20,w3,w4,w5,w6);
    if(ObjNew>ObjOld)
        w2=w20;
        Update=1;
        ObjOld=ObjNew;
    else
        w20=w2-Delta;
        [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w20,w3,w4,w5,w6);
        if(ObjNew>ObjOld)
            w2=w20;
            Update=1;
            ObjOld=ObjNew;
        end
    end

    % Weight w3
    w30=w3+Delta;
    [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w30,w4,w5,w6);
    if(ObjNew>ObjOld)
        w3=w30;
        Update=1;
        ObjOld=ObjNew;
    else
        w30=w3-Delta;
        [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w30,w4,w5,w6);
        if(ObjNew>ObjOld)
            w3=w30;
            Update=1;
            ObjOld=ObjNew;
        end
    end

    % Weight w4
    w40=w4+Delta;
    [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w3,w40,w5,w6);
    if(ObjNew>ObjOld)
        w4=w40;
        Update=1;
        ObjOld=ObjNew;
    else
        w40=w4-Delta;
        [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w3,w40,w5,w6);
        if(ObjNew>ObjOld)
            w4=w40;
            Update=1;
            ObjOld=ObjNew;
        end
    end

    % Weight w5
    w50=w5+Delta;
    [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w3,w4,w50,w6);
    if(ObjNew>ObjOld)
        w5=w50;
        Update=1;
        ObjOld=ObjNew;
    else
        w50=w5-Delta;
        [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w3,w4,w50,w6);
        if(ObjNew>ObjOld)
            w5=w50;
            Update=1;
            ObjOld=ObjNew;
        end
    end

    % Weight w6
    w60=w6+Delta;
    [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w3,w4,w5,w60);
    if(ObjNew>ObjOld)
        w6=w60;
        Update=1;
        ObjOld=ObjNew;
    else
        w60=w6-Delta;
        [ObjNew] = NetworkEvaluationFunction(Xt1,Xt2,Price0,VarLogR,Tl,w1,w2,w3,w4,w5,w60);
        if(ObjNew>ObjOld)
            w6=w60;
            Update=1;
            ObjOld=ObjNew;
        end
    end

    % If no weight improved the objective, refine the search grid
    % by shrinking the step size.
    if(Update==0)
        Delta=Delta*.85;
    end

end
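The six hand-unrolled blocks above can be collapsed into a single loop over a weight vector, which also handles the multi-layer case mentioned earlier. Here is a Python sketch of the same pattern-search idea (try +Delta then -Delta for each coordinate, keep improving moves, shrink Delta when nothing improves); `objective` stands in for NetworkEvaluationFunction with the data arrays bound in:

```python
def coordinate_search(objective, w, delta=0.15, shrink=0.85, iters=1000):
    """Derivative-free coordinate search that MAXIMIZES objective(w).

    w      : list of initial weights
    delta  : initial step size (coarse grid)
    shrink : factor applied to delta when no coordinate improves
    """
    w = list(w)
    best = objective(w)          # plays the role of ObjOld
    for _ in range(iters):
        updated = False
        for i in range(len(w)):
            for step in (delta, -delta):   # try +delta first, then -delta
                old = w[i]
                w[i] = old + step
                val = objective(w)
                if val > best:             # keep the improving move
                    best = val
                    updated = True
                    break
                w[i] = old                 # revert the trial move
        if not updated:
            delta *= shrink                # search on a finer grid
    return w, best

# example: maximize -(x-0.3)^2 - (y+0.4)^2, whose optimum is at (0.3, -0.4)
w_opt, val = coordinate_search(lambda w: -(w[0] - 0.3)**2 - (w[1] + 0.4)**2,
                               [0.0, 0.0])
```

Because the step size only shrinks when no coordinate move helps, the search naturally spends its budget refining around the best point found so far, just as in the MATLAB loop above.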

At the start of the program, I assign initial starting values to the weights being optimized using the following lines:

w1=IValue(1,1);
w2=IValue(1,2);
w3=IValue(1,3);
w4=IValue(1,4);
w5=IValue(1,5);
w6=IValue(1,6);

I would urge friends to try this program for training their neural networks.

If you have a simple enough objective function that can be coded in Matlab and you would like me to optimize it for you, please feel free to email me; I would love to see how my optimization routine works on it.

Also, for friends who have been following my thread, I want to mention that I will be doing more and more work with neural networks in the future, and I hope to continue posting interesting work and code about machine learning and neural networks.

You think life is a secret, Life is only love of flying, It has seen many ups and downs, But it likes travel more than the destination. Allama Iqbal