make3333 t1_ivqfpxu wrote

17

Difficult_Ferret2838 t1_ivrnegq wrote

That doesn't mean anything.

−14

make3333 t1_ivroe1x wrote

Gradient descent steps toward the minimum of the degree-n Taylor approximation of the loss at the current point, scaled by the step size. In neural nets we use the first-degree approximation, treating the surface locally as a plane. In many other optimization settings, a second-order approximation is used to find the optimal direction.
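A minimal sketch of the distinction being described (the quadratic objective and all names here are illustrative, not from the thread): a first-order step just moves a fixed distance against the gradient, while a second-order (Newton-type) step minimizes the degree-2 Taylor model exactly, which for a quadratic lands on the minimizer in one step.

```python
import numpy as np

# Convex quadratic f(x) = 0.5 * x^T A x - b^T x (illustrative choice)
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])   # symmetric positive definite Hessian
b = np.array([1.0, -2.0])

def grad(x):
    return A @ x - b          # gradient of f

x = np.zeros(2)

# First-order (gradient descent): move against the gradient by a
# fixed step size, i.e. trust only the degree-1 Taylor model locally.
gd_step = x - 0.1 * grad(x)

# Second-order (Newton): minimize the degree-2 Taylor model exactly;
# for this quadratic that jumps straight to the minimizer A^{-1} b.
newton_step = x - np.linalg.solve(A, grad(x))

print(gd_step)
print(newton_step)
```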

10

Difficult_Ferret2838 t1_ivrom17 wrote

>Gradient descent steps toward the minimum of the degree-n Taylor approximation of the loss at the current point, scaled by the step size.

No. Gradient descent is first order by definition.

>In many other optimization settings, a second-order approximation is used to find the optimal direction.

It still isn't an "optimal" direction.
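A quick sketch of why the second-order direction isn't automatically "optimal" (the 1-D function here is an illustrative choice, not from the thread): near a local maximum the Hessian is negative, so the raw Newton direction points toward the stationary point, while gradient descent still points downhill.

```python
# f(x) = x^4 - x^2 has a local maximum at x = 0.
def f_prime(x):
    return 4 * x**3 - 2 * x

def f_second(x):
    return 12 * x**2 - 2

x = 0.1                                   # just right of the local max
gd_dir = -f_prime(x)                      # positive: moves away from the max
newton_dir = -f_prime(x) / f_second(x)    # negative: heads toward the max

print(gd_dir, newton_dir)
```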

−3