
pytorCH OPtimize (CHOP): a library for continuous and constrained optimization built on PyTorch

...with applications to adversarially attacking and training neural networks.


⚠️ This library is not actively maintained anymore, and I won't be handling new issues in a timely manner. Contact me if you'd like to contribute. ⚠️

Stochastic Algorithms

We define stochastic optimizers in the chop.stochastic module. These follow PyTorch Optimizer conventions, similar to the torch.optim module, and can be used (see the sketch below) to

  • train structured models;
  • compute universal adversarial perturbations over a dataset.
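
As a minimal sketch of what such a training loop looks like: the loop below follows the standard torch.optim pattern described above. The optimizer class name chop.stochastic.PGD, its constraint keyword, and the chop.constraints.L1Ball constructor are assumptions for illustration, not a checked API; see the examples directory for the actual signatures.

```python
import torch
import chop

# Toy data and a small linear model.
X = torch.randn(128, 10)
y = torch.randint(0, 2, (128,)).float()
model = torch.nn.Linear(10, 1)
criterion = torch.nn.BCEWithLogitsLoss()

# Hypothetical setup: constrain the weights to an L1 ball to induce sparsity.
# Both the constraint constructor and the optimizer arguments are assumptions.
constraint = chop.constraints.L1Ball(alpha=1.0)
optimizer = chop.stochastic.PGD(model.parameters(), constraint=constraint, lr=0.1)

for epoch in range(10):
    optimizer.zero_grad()   # standard PyTorch Optimizer API
    loss = criterion(model(X).squeeze(dim=-1), y)
    loss.backward()
    optimizer.step()        # stochastic update that respects the constraint set
```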

Full Gradient Algorithms

We also define full-gradient algorithms, which operate on a batch of optimization problems, in the chop.optim module. These are used for adversarial attacks through the chop.Adversary wrapper (see the sketch below).
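
For illustration, an attack could be wired up roughly as follows. The solver name chop.optim.minimize_pgd, the Adversary.perturb call signature, and the chop.constraints.LinfBall constructor are assumptions based on the description above, not a checked API; the examples directory shows the real usage.

```python
import torch
import chop

# A small classifier and a batch of images to attack.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
# One loss value per image, since each image defines its own optimization problem.
criterion = torch.nn.CrossEntropyLoss(reduction='none')
data = torch.rand(32, 1, 28, 28)
target = torch.randint(0, 10, (32,))

# Hypothetical usage: wrap a full-gradient solver from chop.optim with the
# chop.Adversary helper. The solver name and perturb() arguments are assumptions.
constraint = chop.constraints.LinfBall(alpha=0.3)
adversary = chop.Adversary(chop.optim.minimize_pgd)
losses, delta = adversary.perturb(data, target, model, criterion,
                                  prox=constraint.prox, max_iter=20)

adversarial_examples = data + delta
```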

Installing

Run the following:

pip install chop-pytorch

or

pip install git+https://github.com/openopt/chop.git

for the latest development version.

Welcome to chop!

Examples:

See the examples directory and our webpage.

Tests

Run the tests with pytest tests.

Citing

If this software is useful to your research, please consider citing it as

@article{chop,
  author       = {Negiar, Geoffrey and Pedregosa, Fabian},
  title        = {CHOP: continuous optimization built on Pytorch},
  year         = 2020,
  url          = {https://github.com/openopt/chop}
}

Affiliations

Geoffrey Négiar was in the Mahoney lab and the El Ghaoui lab at UC Berkeley at the time this package was developed.

Fabian Pedregosa is at Google Research.
