In chapters 1-5 the error is always error = (pred - true) ** 2 and the delta is always delta = pred - true, i.e. the goal is always subtracted from the prediction.

In chapter 6 the error is error = (goal_prediction - prediction) ** 2 and the delta is delta = prediction - goal_prediction. So now the error is calculated the opposite way, i.e. the prediction is subtracted from the goal.
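
For the error term itself the order looks irrelevant, since (a - b) ** 2 == (b - a) ** 2. A quick sanity check (the variable names here are my own, not from the book's code):

pred, goal = 1.0, 0.8
error_ch1_5 = (pred - goal) ** 2   # chapters 1-5 convention
error_ch6 = (goal - pred) ** 2     # chapter 6 convention
print(error_ch1_5 == error_ch6)    # True: squaring removes the sign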

The question is: does it matter whether you subtract the prediction from the goal, or the goal from the prediction?

I have noticed that if the node delta is prediction - true, then the weight delta has to be SUBTRACTED from the weights. This seems to correspond to the derivative: the weight delta has the same sign as the slope, so you subtract it so that the weights move in the opposite direction of the slope.

However, if the node delta is true - prediction, then the weight delta has to be ADDED to the weights. This seems to give the step in the downhill direction directly, so you just add it to the weights.
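
To check whether the two conventions actually behave differently, here is a minimal single-weight sketch (names like x, goal and alpha are mine, not the book's):

alpha = 0.1
x, goal = 2.0, 0.8

# Convention A: delta = pred - goal, weight_delta is SUBTRACTED
w_a = 0.5
pred = x * w_a
delta = pred - goal
weight_delta = delta * x
w_a = w_a - alpha * weight_delta

# Convention B: delta = goal - pred, weight_delta is ADDED
w_b = 0.5
pred = x * w_b
delta = goal - pred
weight_delta = delta * x
w_b = w_b + alpha * weight_delta

print(w_a, w_b)   # both print 0.46

At least in this toy case the sign of the delta and the sign of the update cancel out, so both conventions move the weight by the same amount in the same direction.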

Is this fact of any consequence?



Regards
Anthony Perkins