BBAAQQDDD t1_ixcvhcy wrote

Maybe a stupid question, but I've always wondered how backpropagation works. I don't understand how we actually know how z changes with respect to x (where y would be the network's output), with x a node in some layer and z a node in the next. My intuition would be that since you know the weight (w) from x to z, you could just say that z = activationfunc(w*x) (of course along with a load of other inputs and weights). So how do you know the amount by which z changes if x changes?
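For what it's worth, the chain rule gives the answer for the z = activationfunc(w*x) case described above: dz/dx = w * activationfunc'(w*x). A minimal sketch checking this numerically (variable names and the choice of tanh as the activation are mine, just for illustration):

```python
import math

def activation(a):
    return math.tanh(a)

def activation_deriv(a):
    # derivative of tanh: 1 - tanh(a)^2
    return 1.0 - math.tanh(a) ** 2

w, x = 0.7, 1.3

# z = activation(w * x), so by the chain rule:
# dz/dx = w * activation'(w * x)
analytic = w * activation_deriv(w * x)

# finite-difference check that the chain-rule value is right
eps = 1e-6
numeric = (activation(w * (x + eps)) - activation(w * (x - eps))) / (2 * eps)

print(analytic, numeric)  # the two values agree to ~1e-9
```

With more inputs the same logic applies term by term, since z = activation(sum of w_i * x_i) and the partial derivative with respect to one x_i is w_i * activation'(that sum).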
