
Ensure ONNX support #1015

Open
johnnv1 opened this issue Dec 26, 2024 · 1 comment

Comments


johnnv1 commented Dec 26, 2024

I would like to request support for exporting models to the ONNX (Open Neural Network Exchange) format within the library. ONNX has become a widely adopted format for model interoperability, and supporting it would significantly improve the library's flexibility and usability across deployment environments.

Why This Feature is Important

  1. Interoperability: ONNX allows models to be transferred across different frameworks such as PyTorch, TensorFlow, and Scikit-learn. By adding ONNX export support, users will be able to seamlessly transition models between these frameworks and utilize them in diverse production environments.

  2. Platform Support: ONNX is supported on various platforms, including cloud services, edge devices, and hardware accelerators. Exporting models in ONNX format would enable deployment on a wider range of devices and systems.

  3. Ecosystem Integration: Many tools and services are built to work with ONNX models, such as the ONNX Runtime, which is optimized for speed and efficiency. Having the ability to export models to ONNX would allow users to integrate with these powerful tools out of the box.

Proposed Solution

  • Implement a function (e.g., model.to_onnx() or similar), or a dedicated exporter module, to export trained models to the ONNX format.
  • Ensure compatibility with common ONNX features, such as support for model optimization, quantization, and other transformations.
  • Provide clear documentation and examples on how to use the new feature effectively.
  • Ensure that the ONNX export functionality is integrated with the library's test suite, with tests that validate the correctness and performance of the exported ONNX models across different environments.

If this feature aligns with the project's goals, I am open to contributing some time to help design and implement it.

Collaborator

qubvel commented Dec 26, 2024

Hi @johnnv1! Thanks for requesting an ONNX feature. That sounds super aligned with the project, and I would be glad to collaborate on this feature with you!

I don't have much experience with ONNX, so I wrote a simple example here. Please let me know if you have any feedback. I would love to learn more about it and add ONNX export as a core functionality covered with tests to the library.

Adding a model.to_onnx() sounds good to me, similar to other library APIs such as PyTorch Lightning (model.to_onnx) and Ultralytics (model.export).

Let me know if you can put together a feature draft; I can help with scaling the approach to other models, adding tests, and fixing model architectures if any issues come up.

Thank you 🤗
