Question: are there any examples of inference with Python models written with numpy (instead of PyTorch models)? #296
Comments
The main limitation to using non-PyTorch models is actually how we receive and send values to the interpreter. We use IValue, which is native to PyTorch. If you can get numpy to convert to and from IValue, you should be able to accomplish this. You can also add a converter with our plugin registry https://github.com/pytorch/multipy/blob/main/multipy/runtime/interpreter/plugin_registry.h (and then add in support for the plugin). Eventually we do hope to add a more generic interface for the interpreters. However, due to staffing issues this is a ways away :(
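As a sketch of the workaround described above (the class and method names here are hypothetical, not part of MultiPy), one way to avoid writing a converter is to keep numpy internal to the model and pass only plain Python ints, floats, and lists across the interpreter boundary, since IValue already covers those types:

```python
import numpy as np

class NumpyLinearModel:
    """Toy linear model implemented with numpy only (no PyTorch).

    Inputs and outputs cross the interpreter boundary as plain
    Python lists, which IValue already knows how to represent.
    """

    def __init__(self, weights, bias):
        self.w = np.asarray(weights, dtype=np.float64)
        self.b = float(bias)

    def predict(self, rows):
        # rows: list of lists of floats -> list of floats
        x = np.asarray(rows, dtype=np.float64)  # list -> ndarray
        y = x @ self.w + self.b                 # numpy-only compute
        return y.tolist()                       # ndarray -> list (IValue-friendly)

model = NumpyLinearModel(weights=[2.0, -1.0], bias=0.5)
print(model.predict([[1.0, 1.0], [3.0, 0.0]]))  # -> [1.5, 6.5]
```

This keeps all numpy objects on the Python side of the boundary; only whether list round-tripping is fast enough for your payload sizes needs checking.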
Thanks for the answer! There is still one thing I'm a little confused about. Intuitively, it seems that a Python script can still have a numpy dependency (for example, here). Does this mean that as long as I have the numpy package installed, the Python interpreters in MultiPy will load it at runtime and still be able to do multi-threaded inference? Also, I indeed found that NumPy is somewhat supported by MultiPy; for example, it can be imported with the following code: multipy/multipy/runtime/test_deploy.cpp, lines 494 to 504 in 19617b9.

Given this, I'm wondering why we still need to register the NumPy interface and add a converter, as you mentioned above.
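To make the "multi-threaded inference" part of the question concrete, here is a plain-CPython sketch (not using MultiPy) of several threads running a numpy-only predict function. Note the caveat: in a single stock interpreter all Python bytecode shares one GIL, and parallelism comes only from numpy releasing the GIL inside large array operations, whereas MultiPy's point is to run separate interpreters:

```python
import threading
import numpy as np

def numpy_predict(w, x):
    # Pure-numpy inference step; BLAS-backed matmul releases
    # the GIL while it runs, so threads can overlap here.
    return x @ w

w = np.ones((4, 1))
inputs = [np.full((2, 4), i, dtype=np.float64) for i in range(8)]
results = [None] * len(inputs)

def worker(i):
    results[i] = numpy_predict(w, inputs[i])

threads = [threading.Thread(target=worker, args=(i,)) for i in range(len(inputs))]
for t in threads:
    t.start()
for t in threads:
    t.join()

# results[i] is a (2, 1) array whose entries are 4.0 * i
```

Whether the same pattern holds inside MultiPy's interpreters hasn't been verified here; this only illustrates what the numpy side of the workload looks like.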
Sorry for closing this; here's the response I made to the re-ask of the question in #301:

Yup, this is exactly it. IValue isn't needed for the internals of the interpreter; we just use the type to interact with the interpreters. For numpy we haven't done thorough testing, so we can't provide any guarantees, though you're right that things should generally just work (IValue does cover a lot haha, just not everything). For the plugins/convertors (the interface I think you're referring to), currently we use IValue as an intermediary to convert a PyObject to something usable in C++. For example, on line 501 you go from PyObject -> IValue -> int. However, eventually we'd like to create a custom convertor to get more coverage. Sorry, to be more clear: if IValue works for your use cases, feel free to use it. However, if there are objects which you can't get out of IValue, you'd want to write your own convertor.
I notice in the introduction that
Also, most of the examples provided are with PyTorch. Are there any examples of inference with Python-written models (instead of PyTorch models)? For example, can I do inference here with xgboost or lightgbm, or a simple decision tree written in Python with numpy?
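For reference, a "decision tree written in Python with numpy" of the kind asked about can be as small as the toy depth-1 tree (a stump) below; it has no PyTorch dependency at all. Whether it loads and runs under MultiPy's interpreters is exactly what the thread above discusses and has not been verified here:

```python
import numpy as np

def tree_predict(x, feature, threshold, left_value, right_value):
    """Inference for a depth-1 decision tree (a stump) in pure numpy.

    x: (n_samples, n_features) array.
    Returns left_value where x[:, feature] <= threshold, else right_value.
    """
    x = np.asarray(x, dtype=np.float64)
    return np.where(x[:, feature] <= threshold, left_value, right_value)

x = np.array([[0.2, 1.0], [0.9, 1.0]])
preds = tree_predict(x, feature=0, threshold=0.5, left_value=0.0, right_value=1.0)
print(preds.tolist())  # -> [0.0, 1.0]
```

A deeper tree or a gradient-boosted ensemble (xgboost/lightgbm style) is just a loop over such splits, so the interpreter-boundary question is the same.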