Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?

Implementation of the experiment from the paper "Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?" by Samet Oymak and Mahdi Soltanolkotabi, using PyTorch.

MNIST Experiment

The implementation of this experiment is in the 'mnist_experiment' folder.

The experiment setup is taken from pages 12-13 of the paper (under 6.2 MNIST Experiment).
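As a rough illustration of the quantities this experiment records, here is a minimal sketch that tracks the misfit (training loss) and the Euclidean distance of the parameters from their initialization at each gradient step. The model, data shapes, and hyperparameters below are hypothetical stand-ins, not the paper's actual setup:

```python
import torch

torch.manual_seed(0)

# Hypothetical stand-ins for the data and the network; the real experiment
# follows the setup described on pages 12-13 of the paper.
X = torch.randn(64, 20)
y = torch.randn(64, 1)
model = torch.nn.Sequential(
    torch.nn.Linear(20, 100),
    torch.nn.ReLU(),
    torch.nn.Linear(100, 1),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Snapshot of the initial parameters, flattened into a single vector.
init = torch.cat([p.detach().flatten().clone() for p in model.parameters()])

misfit, distance = [], []
for step in range(100):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(X), y)
    loss.backward()
    optimizer.step()
    current = torch.cat([p.detach().flatten() for p in model.parameters()])
    misfit.append(loss.item())                       # training misfit
    distance.append((current - init).norm().item())  # distance from initialization
```

The two recorded lists correspond to the misfit/distance curves that the experiment saves as JSON.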

To save the resulting JSON files, the 'misfit_distance_500_json' and 'misfit_distance_5000_json' folders need to be created first.
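For example, the output folders can be created up front before running the scripts (the folder names are the ones listed above; the result dictionary and file name here are hypothetical):

```python
import json
import os

# Output folders expected by the MNIST experiment (names from this README).
for folder in ("misfit_distance_500_json", "misfit_distance_5000_json"):
    os.makedirs(folder, exist_ok=True)

# Hypothetical result dictionary and file name; the actual scripts
# choose their own keys and file names.
results = {"misfit": [0.9, 0.5], "distance": [0.0, 1.2]}
with open(os.path.join("misfit_distance_500_json", "example.json"), "w") as f:
    json.dump(results, f)
```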

Low-rank Regression

The implementation of this experiment is in the 'low_rank_regression' folder.

The gradient of the low-rank regression loss function is given in the paper on page 32 (in the Appendix); the implementation follows that notation.
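A hand-coded gradient like this can be checked against PyTorch's autograd. The sketch below assumes a generic factorized low-rank regression loss of the form L(U) = 0.5 * sum_i (y_i - <X_i, U U^T>)^2, which may differ in detail from the paper's formulation; the problem instance is hypothetical:

```python
import torch

torch.manual_seed(0)
n, d, r = 10, 5, 2

# Hypothetical instance: measurement matrices X_i, targets y_i, and a
# factor U parameterizing the low-rank matrix U @ U.T.
X = torch.randn(n, d, d)
y = torch.randn(n)
U = torch.randn(d, r, requires_grad=True)

# Assumed loss: 0.5 * sum_i (y_i - <X_i, U U^T>)^2.
preds = torch.einsum('nij,ij->n', X, U @ U.T)
loss = 0.5 * ((y - preds) ** 2).sum()
loss.backward()  # autograd gradient lands in U.grad

# Closed-form gradient for this assumed loss:
#   grad = -sum_i r_i * (X_i + X_i^T) @ U,  with r_i = y_i - <X_i, U U^T>.
res = (y - preds).detach()
manual = -torch.einsum('n,nij,jk->ik', res, X + X.transpose(1, 2), U.detach())
```

Agreement between `U.grad` and `manual` confirms the derivation for this assumed loss.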

The experiment setup is taken from page 13 of the paper (under 6.2 Low-rank Regression).

To save the resulting JSON files, the 'misfit_distance_json' folder needs to be created first.
