In the simplest case:

$$x_{k+1} = x_k - \frac{f(x_k)}{f'(x_k)}$$

Where:
- $x_k$ is your current guess
- $x_{k+1}$ is your improved guess
- $f'(x_k)$ is the derivative at $x_k$

Newton's method will get us to a root (where $f(x) = 0$), or it could shoot off if it reaches a point where $f'(x) \approx 0$ and we end up dividing by 0.
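A minimal Python sketch of this scalar update (the names `f`, `dfdx`, and `x0` are illustrative placeholders, not from the notes):

```python
# Scalar Newton's method for root finding: x <- x - f(x)/f'(x)
def newton_root(f, dfdx, x0, tol=1e-10, max_iters=50):
    x = x0
    for _ in range(max_iters):
        fx = f(x)
        if abs(fx) < tol:          # close enough to a root
            break
        dfx = dfdx(x)
        if dfx == 0.0:             # derivative is 0: the update would divide by zero
            raise ZeroDivisionError(f"f'(x) = 0 at x = {x}")
        x = x - fx / dfx
    return x

# Example: the positive root of f(x) = x^2 - 2 is sqrt(2) ~= 1.41421356
root = newton_root(lambda x: x**2 - 2, lambda x: 2*x, x0=1.0)
```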
For Optimization
For optimization we are often trying to find a minimum, not a zero of $f$ itself. This is done by applying Newton's method to the derivative:

$$x_{k+1} = x_k - \frac{f'(x_k)}{f''(x_k)}$$
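A quick sketch of that 1D minimization update in Python (the function names and the example cost are assumptions for illustration):

```python
# 1D Newton's method for minimization: same update, applied to f'(x),
# i.e. x <- x - f'(x)/f''(x), which drives the derivative to zero.
def newton_minimize(dfdx, d2fdx2, x0, tol=1e-10, max_iters=50):
    x = x0
    for _ in range(max_iters):
        g = dfdx(x)        # first derivative (gradient in 1D)
        if abs(g) < tol:   # stationary point reached
            break
        h = d2fdx2(x)      # second derivative (curvature)
        x = x - g / h
    return x

# Example: f(x) = (x - 3)^2 has its minimum at x = 3
x_min = newton_minimize(lambda x: 2*(x - 3), lambda x: 2.0, x0=0.0)
```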
EXAMPLE: Newton's method for the NLNG (nonlinear, non-Gaussian) problem statement.
Here we derive Newton's method for optimization in the multivariate case.
Given we have the optimization problem from Maximum A Posteriori, with some cost function $J(\mathbf{x})$ (the negative log posterior), we end up with the final goal of

$$\hat{\mathbf{x}} = \arg\min_{\mathbf{x}} J(\mathbf{x})$$
We can first do a Taylor series expansion of the cost about an operating point $\mathbf{x}_{\text{op}}$ with a tiny arbitrary movement $\delta\mathbf{x}$:

$$J(\mathbf{x}_{\text{op}} + \delta\mathbf{x}) \approx J(\mathbf{x}_{\text{op}}) + \left(\left.\frac{\partial J}{\partial \mathbf{x}}\right|_{\mathbf{x}_{\text{op}}}\right)^T \delta\mathbf{x} + \frac{1}{2}\,\delta\mathbf{x}^T \left(\left.\frac{\partial^2 J}{\partial \mathbf{x}\,\partial \mathbf{x}^T}\right|_{\mathbf{x}_{\text{op}}}\right) \delta\mathbf{x}$$
Because we want to optimize, we want to move in such a way that we end up at a local minimum of the cost function. Hence we take the derivative of this approximate cost with respect to $\delta\mathbf{x}$ and set it to zero:

$$\frac{\partial J(\mathbf{x}_{\text{op}} + \delta\mathbf{x})}{\partial\, \delta\mathbf{x}} = 0$$
This gives us

$$\left(\left.\frac{\partial^2 J}{\partial \mathbf{x}\,\partial \mathbf{x}^T}\right|_{\mathbf{x}_{\text{op}}}\right) \delta\mathbf{x}^* = -\left.\frac{\partial J}{\partial \mathbf{x}}\right|_{\mathbf{x}_{\text{op}}}$$
which lets us define a rough "movement" $\delta\mathbf{x}^*$ (solve the linear system above: Hessian times step equals negative gradient) to move our operating point toward a minimum.
We use $\delta\mathbf{x}^*$ to update our operating point:

$$\mathbf{x}_{\text{op}} \leftarrow \mathbf{x}_{\text{op}} + \delta\mathbf{x}^*$$
We repeat this until we feel that we've reached a good enough location ($\|\delta\mathbf{x}^*\|$ sufficiently small).
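A sketch of the multivariate loop above, assuming we can evaluate the gradient and Hessian of $J$ at the operating point; the quadratic cost in the example is made up for illustration:

```python
import numpy as np

# Multivariate Newton's method: repeatedly solve H * dx = -g and update x_op.
def newton_optimize(grad, hess, x_op, tol=1e-8, max_iters=50):
    for _ in range(max_iters):
        g = grad(x_op)                   # gradient of J at x_op
        H = hess(x_op)                   # Hessian of J at x_op
        dx = np.linalg.solve(H, -g)      # optimal "movement" dx*
        x_op = x_op + dx                 # move the operating point
        if np.linalg.norm(dx) < tol:     # good enough: stop
            break
    return x_op

# Example: J(x) = 0.5 * x^T A x - b^T x has its minimum at x = A^{-1} b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_hat = newton_optimize(grad=lambda x: A @ x - b,
                        hess=lambda x: A,
                        x_op=np.zeros(2))
```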
Things to note:
- It'll converge to a minimum, but that could be the global minimum or (more likely) just a local minimum
- The rate of convergence is quadratic near the solution
- The Hessian needs to be computed, which is hard to do in practice
