I would like to request the addition of a feature to support exporting models to the ONNX (Open Neural Network Exchange) format within the library. As ONNX is becoming a widely adopted format for model interoperability, having support for exporting models to ONNX would significantly enhance the flexibility and usability of the library in various deployment environments.
Why This Feature is Important
Interoperability: ONNX allows models to be transferred across different frameworks such as PyTorch, TensorFlow, and Scikit-learn. By adding ONNX export support, users will be able to seamlessly transition models between these frameworks and utilize them in diverse production environments.
Platform Support: ONNX is supported on various platforms, including cloud services, edge devices, and hardware accelerators. Exporting models in ONNX format would enable deployment on a wider range of devices and systems.
Ecosystem Integration: Many tools and services are built to work with ONNX models, such as the ONNX Runtime, which is optimized for speed and efficiency. Having the ability to export models to ONNX would allow users to integrate with these powerful tools out of the box.
Proposed Solution
Implement a function (e.g., model.to_onnx() or similar) or a dedicated exporter module to export trained models to the ONNX format.
Ensure compatibility with common ONNX features, such as support for model optimization, quantization, and other transformations.
Provide clear documentation and examples on how to use the new feature effectively.
Ensure that the ONNX export functionality is integrated with the library's test suite, with tests that validate the correctness and performance of the exported ONNX models across different environments.
If this feature aligns with the project's goals, I am open to contributing time to help write the feature and assist with the implementation.
Hi @johnnv1! Thanks for requesting an ONNX feature. That sounds super aligned with the project, and I would be glad to collaborate on this feature with you!
I don't have much experience with ONNX, so I wrote a simple example here. Please let me know if you have any feedback. I would love to learn more about it and add ONNX export as a core functionality covered with tests to the library.
Adding a model.to_onnx() sounds good to me, similar to other library APIs such as PyTorch Lightning (model.to_onnx) and Ultralytics (model.export).
Let me know if you can put together a feature draft. I can help with scaling the approach to other models, adding tests, and fixing model architectures if any issues come up.