Does cppflow::model have support for TensorFlow Lite models? #238
Comments
I tested cppflow to load and run inference on a model saved in the SavedModel format. Model loading and prediction turned out to be slow, taking up to 4 seconds. While looking for ways to optimize it, I came across the idea of converting the model into a TensorFlow Lite model, which is an optimized FlatBuffer format identified by the .tflite file extension. The conversion can be done following the method from the official TensorFlow page. My issue is: does such a model have support in cppflow::model? Can it be loaded and inferred from? Or is there any tip to get better inference speed, such as the possibility of freezing the model? Any help is very much appreciated!
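For context, cppflow wraps the TensorFlow C API, which loads SavedModel directories; a .tflite FlatBuffer is instead read by TensorFlow Lite's own C++ interpreter, a separate library. A minimal sketch of that interpreter API is shown below; it assumes the TensorFlow Lite library is linked and a hypothetical `model.tflite` file with a single float input and output.

```cpp
// Minimal TensorFlow Lite C++ inference sketch.
// Assumes linking against the TensorFlow Lite library and a
// hypothetical "model.tflite" with one float input and one float output.
#include <cstdio>
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
    // Load the FlatBuffer model from disk.
    auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
    if (!model) { std::fprintf(stderr, "failed to load model\n"); return 1; }

    // Build an interpreter with the built-in op resolver.
    tflite::ops::builtin::BuiltinOpResolver resolver;
    std::unique_ptr<tflite::Interpreter> interpreter;
    tflite::InterpreterBuilder(*model, resolver)(&interpreter);
    interpreter->AllocateTensors();

    // Fill the first input tensor, run, and read the first output.
    float* input = interpreter->typed_input_tensor<float>(0);
    input[0] = 1.0f;  // example value
    interpreter->Invoke();
    float* output = interpreter->typed_output_tensor<float>(0);
    std::printf("output[0] = %f\n", output[0]);
    return 0;
}
```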
Do you know the answer, please?
Hi, as you can see above, I didn't get a reply to that issue. I still don't know how to use a Lite model in cppflow (perhaps there has been effort towards this, but I am not aware of any yet). So I went ahead with using a frozen model. In the process, I followed some suggestions given here to solve problems that came up while loading the frozen model.
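For reference, the usual cppflow v2 pattern is to load the SavedModel once up front (so the multi-second load cost is not paid per prediction) and call it with explicit serving-signature tensor names. The sketch below assumes a hypothetical model directory `my_model` and the default Keras export names `serving_default_input:0` and `StatefulPartitionedCall:0`; the actual names depend on how the model was exported and can be inspected with the `saved_model_cli` tool.

```cpp
// Sketch of SavedModel inference with cppflow (v2 API).
// "my_model" and the tensor names are assumptions; inspect your model
// with `saved_model_cli show --dir my_model --all` to find the real ones.
#include <iostream>
#include "cppflow/cppflow.h"

int main() {
    // Load the model once; reuse it across predictions.
    cppflow::model model("my_model");

    // A dummy 1x3 float input filled with ones.
    auto input = cppflow::fill({1, 3}, 1.0f);

    // Call the serving signature with explicit input/output tensor names.
    auto output = model({{"serving_default_input:0", input}},
                        {"StatefulPartitionedCall:0"});
    std::cout << output[0] << std::endl;
    return 0;
}
```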