NewOptPy

Newton’s second-order Optimization methods in Python


A Python implementation of Newton and quasi-Newton second-order optimization methods

[demo animations: firstorder vs. newton optimization trajectories]

🏫 Disclaimer

This project was carried out as a term project for the course [CSED490Y]: Optimization for Machine Learning @ POSTECH.

📝 Main features

  • Implemented from scratch with minimal dependencies (numpy and matplotlib only)
  • Four Newton-type methods supported, plus a first-order baseline (see the sketch after this list)
  • Two-variate convex functions supported
  • Visualization of the function landscape and the optimization trajectory
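
For intuition, here is a minimal sketch of one damped-Newton update on a two-variate function, using numpy only. It illustrates the update rule the Newton-type optimizers share, not the repository's actual code; newton_step, grad_fn, and hess_fn are hypothetical names.

import numpy as np

def newton_step(x, grad_fn, hess_fn, stepsize=1.0):
    # One damped-Newton update: x <- x - stepsize * H(x)^-1 g(x)
    g = grad_fn(x)   # gradient, shape (2,)
    H = hess_fn(x)   # Hessian, shape (2, 2)
    return x - stepsize * np.linalg.solve(H, g)

# Illustrative paraboloid f(x, y) = x^2 + y^2 (assumed, not the repo's exact function)
grad = lambda p: 2.0 * p
hess = lambda p: 2.0 * np.eye(2)

x = np.array([100.0, 10.0])
x = newton_step(x, grad, hess)  # a full Newton step solves a quadratic exactly
print(x)                        # -> [0. 0.], the minimizer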

⚙️ Requirements
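
Per the feature list above, the only third-party dependencies should be numpy and matplotlib; installing them with pip (pip install numpy matplotlib) should suffice.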

📌 Quick start

Important arguments (an illustrative sketch of their declaration follows this list):

  • optim: choice of optimizer among {firstorder, newton, davidon, dfp, bfgs}.
  • func: choice of two-variate function among {paraboloid, skewedparaboloid, steepsidedvalley}.
  • maxiter: int: maximum number of iterations.
  • stepsize: float: step size (or learning rate).
  • init: initial two-dimensional coordinate.
  • viz: flag to visualize the landscape.
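
These flags map naturally onto argparse; the following is a sketch of how they might be declared (the defaults are assumptions, not the repository's actual main.py):

import argparse

parser = argparse.ArgumentParser(description="NewOptPy (illustrative sketch)")
parser.add_argument("--optim", choices=["firstorder", "newton", "davidon", "dfp", "bfgs"],
                    default="newton", help="choice of optimizer")
parser.add_argument("--func", choices=["paraboloid", "skewedparaboloid", "steepsidedvalley"],
                    default="paraboloid", help="two-variate objective function")
parser.add_argument("--maxiter", type=int, default=1000, help="maximum number of iterations")
parser.add_argument("--stepsize", type=float, default=0.01, help="step size (learning rate)")
parser.add_argument("--init", type=float, nargs=2, default=[100.0, 10.0],
                    help="initial two-dimensional coordinate")
parser.add_argument("--viz", action="store_true", help="visualize the landscape")
args = parser.parse_args()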

🏃 Example run:

python main.py --optim dfp \
               --func skewedparaboloid \
               --stepsize 0.002 \
               --init 100 10 \
               --viz
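
This runs the Davidon-Fletcher-Powell optimizer on the skewed paraboloid, starting from the coordinate (100, 10) with a step size of 0.002; since --viz is set, the landscape and trajectory should also be rendered with matplotlib.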

⏰ Time complexity comparison

method                             steps until convergence           computation time per iteration
                                   skewed paraboloid  steep valley   skewed paraboloid  steep valley
First-order gradient method        311                4431           0.231              0.267
Vanilla Newton's method            126                5              0.582              0.549
Davidon's method                   292                4              0.489              0.538
Davidon-Fletcher-Powell method     134                4              0.537              0.747
Broyden-Fletcher-Goldfarb-Shanno   77                 4              0.564              0.556
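
The two halves of the table show the expected trade-off: on the steep-sided valley the Newton-type methods converge in 4-5 steps versus 4431 for the first-order method, at roughly twice the per-iteration computation time.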