This repository performs parameter estimation & optimization of dynamical systems by solving least-squares problems with numerical optimization methods:
$$
\min_{\theta} \; \frac{1}{2}\sum_{i=1}^{N}\left\lVert y(t_i;\theta)-\hat{y}_i \right\rVert^2
\qquad \text{s.t.} \quad \dot{x}=f(x,\theta,t),\quad y=g(x,\theta,t)
$$
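For concreteness, here is a minimal sketch of evaluating such an objective in Python. The scalar exponential-decay model $\dot{x} = -\theta x$, the data names, and the helper functions are illustrative assumptions, not this repository's API:

```python
import numpy as np
from scipy.integrate import solve_ivp

def residuals(theta, t_obs, y_obs, x0=1.0):
    """Residuals y(t_i; theta) - y_i for the toy model x' = -theta[0] * x."""
    sol = solve_ivp(lambda t, x: -theta[0] * x, (t_obs[0], t_obs[-1]), [x0],
                    t_eval=t_obs, rtol=1e-8, atol=1e-10)
    return sol.y[0] - y_obs

def objective(theta, t_obs, y_obs):
    """Least-squares objective J(theta) = 0.5 * ||residuals||^2."""
    r = residuals(theta, t_obs, y_obs)
    return 0.5 * float(r @ r)
```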
There are two approaches to computing the gradient:
- Pontryagin's adjoint method
Compute the gradient by solving the adjoint differential equations backward in time. This method provides accurate and efficient gradient estimates; however, deriving the adjoint equations is not easy for general dynamical systems (a scalar sketch follows this list).
- Numerical differentiation
Estimate the gradient with finite differences. This approach is easy to implement, but the computational cost grows with the number of parameters and quickly becomes unacceptable; moreover, because numerical differentiation is quite sensitive to noise, feeding finite-difference gradients directly into quasi-Newton methods can make them fail to converge on ill-conditioned problems (a sketch of the estimator follows this list).
- Some stochastic optimization methods, such as SGDM and Adam, two prevalent optimizers in the deep learning community, can tolerate noisy gradients.
- If the gradient estimate is accurate, quasi-Newton methods are far superior to first-order methods (see the BFGS example after this list).
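To make the adjoint method concrete, below is a sketch for the scalar toy model above with a terminal cost $J = \frac{1}{2}(x(T) - y_T)^2$. For $\dot{x} = -\theta x$ the adjoint equation is $\dot{\lambda} = \theta\lambda$ with $\lambda(T) = x(T) - y_T$, and $\mathrm{d}J/\mathrm{d}\theta = \int_0^T \lambda\,\partial f/\partial\theta\,\mathrm{d}t = -\int_0^T \lambda x\,\mathrm{d}t$. The model, the loss, and all names are illustrative assumptions, not the repository's implementation:

```python
import numpy as np
from scipy.integrate import solve_ivp

def adjoint_gradient(theta, x0, y_T, T):
    """dJ/dtheta for J = 0.5*(x(T) - y_T)**2, x' = -theta*x (adjoint method)."""
    # Forward pass: solve the state equation and keep a dense interpolant.
    fwd = solve_ivp(lambda t, x: -theta * x, (0.0, T), [x0],
                    dense_output=True, rtol=1e-10, atol=1e-12)
    x_T = fwd.y[0, -1]

    # Backward pass: adjoint lam' = theta*lam with lam(T) = x(T) - y_T,
    # augmented with the running integral g' = lam * df/dtheta = -lam * x.
    def backward(t, z):
        lam = z[0]
        x_t = fwd.sol(t)[0]
        return [theta * lam, -lam * x_t]

    bwd = solve_ivp(backward, (T, 0.0), [x_T - y_T, 0.0],
                    rtol=1e-10, atol=1e-12)
    # Integrating from T down to 0 accumulates the negative of the integral.
    return -bwd.y[1, -1]

# Sanity check against the closed form for this toy model:
theta, x0, y_T, T = 0.7, 1.0, 0.2, 3.0
x_T = x0 * np.exp(-theta * T)
print(adjoint_gradient(theta, x0, y_T, T))  # adjoint estimate
print((x_T - y_T) * (-T) * x_T)             # analytic dJ/dtheta
```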
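A sketch of the finite-difference estimator shows where its cost comes from: each parameter requires two extra objective evaluations, i.e. two full simulations (the `objective` callable, assumed to map a parameter vector to a scalar, is illustrative):

```python
import numpy as np

def fd_gradient(objective, theta, eps=1e-6):
    """Central-difference gradient: 2 * len(theta) objective evaluations,
    each a full ODE solve, so the cost grows with the parameter count.
    Any solver noise in the objective is amplified by the 1/(2*eps) factor."""
    theta = np.asarray(theta, dtype=float)
    grad = np.empty_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = eps
        grad[i] = (objective(theta + step) - objective(theta - step)) / (2 * eps)
    return grad
```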
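Finally, a self-contained sketch of the last point: fitting the toy model with SciPy's BFGS (a quasi-Newton method) using an accurate gradient. The closed-form gradient below stands in for the adjoint result; all constants are illustrative:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

X0, Y_T, T = 1.0, 0.2, 3.0  # illustrative data: x(0), observed x(T), horizon

def objective(theta):
    """Terminal-cost least squares J(theta) = 0.5*(x(T; theta) - Y_T)**2."""
    sol = solve_ivp(lambda t, x: -theta[0] * x, (0.0, T), [X0],
                    rtol=1e-10, atol=1e-12)
    return 0.5 * (sol.y[0, -1] - Y_T) ** 2

def gradient(theta):
    """Exact gradient for this model (stand-in for the adjoint computation)."""
    x_T = X0 * np.exp(-theta[0] * T)
    return np.array([(x_T - Y_T) * (-T) * x_T])

res = minimize(objective, x0=np.array([0.5]), jac=gradient, method="BFGS")
print(res.x)  # converges to theta = ln(X0 / Y_T) / T ≈ 0.536
```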