[tfjs-react-native] tf.loadGraphModel with bundled files slow #5475
Comments
Any chance of a follow-up?
@Caundy Thank you for the detailed report. Is the slowness happening on the first load or every time? If it only happens the first time, that is expected, and you may need to write warm-up code to optimize it before loading.
@rthadur Thanks for responding :) To answer your question, I built the APK, ran it on the Nexus and ran a couple of consecutive model loads using the tf.loadGraphModel method to test the times. The resulting load times are inconsistent, with the first load not taking significantly longer than the later ones. In general I would be aiming to load the model just once per application session and store it in some service property to be used when needed, without the necessity of reloading it for each use. That, along with the fact that loading the model blocks users from interacting with the app, is why the initial load time is important to me :) I am aware that the first inference might be much slower than the consecutive ones due to caching that happens on that first run, but does that also apply to loading the model itself? How would I go about writing the code to optimize it before loading?
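For context, "warm-up" in tfjs usually means running one dummy inference after the model has loaded, so that backend shader compilation and kernel caching happen before the first real prediction; it speeds up subsequent inference, not loadGraphModel itself. A minimal sketch, assuming a 224×224 RGB input (an assumption; check your model's actual input shape):

```ts
import * as tf from '@tensorflow/tfjs';

// Run one throwaway prediction so backend kernels/shaders are compiled
// before the first real inference. The [1, 224, 224, 3] shape is assumed.
async function warmUp(model: tf.GraphModel): Promise<void> {
  const dummy = tf.zeros([1, 224, 224, 3]);
  const result = model.predict(dummy) as tf.Tensor;
  await result.data(); // force execution to finish
  tf.dispose([dummy, result]);
}
```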
@rthadur I believe the similarities end with both issues using the tfjs-react-native package and bundled model files. I'm not experiencing any issues using the bundleResourceIO method, as that handler resolves successfully and quickly.
I'm seeing the very same issue (the difference is that the graph model NEVER loads), on iOS in release mode. In debug mode everything works like a charm. I am using Expo.
Hi @Caundy, apologies for the delayed response. We're revisiting our older issues and checking whether they have since been resolved. Are you still looking for a solution, or has your issue been resolved? If the issue still persists after trying the latest version of TFJS, please let us know, along with an error log and a code snippet so we can replicate the issue on our end. Could you please confirm whether this issue is resolved for you, and feel free to close it if so? Thank you!
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.
This issue has been marked stale because it has had no recent activity for 7 days. It will be closed if no further activity occurs. Thank you.
This issue was closed due to lack of activity after being marked stale for the past 7 days.
I'm facing a similar issue. Is there a solution for the slow Android setup? How did you solve it?
Packages installed
Describe the current behavior
Loading a graph model (13MB) bundled in a (bare) react-native application takes an excessive amount of time on Android devices, upwards of 60s.
Model
We have trained an image classification model using Google Vision and exported it as a TensorFlow.js package from the Google Vision dashboard. We haven't modified the model in any way since downloading it.
The model.json file weighs 167KB. The weights are sharded into 4 files (named group1-shardXof4.bin), three of which weigh 4.2MB each and the last 400KB, totalling 13MB.
The three top lines of the model.json file read:
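For reference, a tfjs-converter graph model's model.json typically opens like the following; this is a generic reconstruction, not the author's actual file, and the exact values will differ:

```json
{
  "format": "graph-model",
  "generatedBy": "...",
  "convertedBy": "TensorFlow.js Converter",
```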
Init code
As for model initialization in the application, we make sure TensorFlow is ready by running and awaiting tf.ready() early in the application's lifecycle and confirming it resolves successfully before loading the model.
In the file where classification happens, we then import the necessary libraries, require the bundled model files, and call loadGraphModel, such as:
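A minimal sketch of the kind of init code described above, assuming the standard bundleResourceIO flow from @tensorflow/tfjs-react-native; the asset paths are hypothetical placeholders:

```ts
import * as tf from '@tensorflow/tfjs';
import { bundleResourceIO } from '@tensorflow/tfjs-react-native';

// Metro resolves these require() calls at build time, so the model ships
// inside the app bundle (metro.config.js must list .bin as an asset extension).
// The paths below are hypothetical placeholders.
const modelJson = require('../assets/model/model.json');
const modelWeights = [
  require('../assets/model/group1-shard1of4.bin'),
  require('../assets/model/group1-shard2of4.bin'),
  require('../assets/model/group1-shard3of4.bin'),
  require('../assets/model/group1-shard4of4.bin'),
];

export async function initModel(): Promise<tf.GraphModel> {
  await tf.ready(); // the backend must be initialized before loading
  return tf.loadGraphModel(bundleResourceIO(modelJson, modelWeights));
}
```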
graphModel is later used to create and store an automl.ImageClassificationModel, which isn't relevant here.
The issue
The issue is the tf.loadGraphModel method, which takes upwards of 60s to resolve on slightly older Android devices, such as the Nexus 5X, while making the app interface completely unresponsive in the meantime.
Running the application as a built release APK resulted in:
Xiaomi Redmi 7: ~24s to load the model,
Nexus 5X: ~60s to load the model,
Samsung A10: ~78s to load the model.
For comparison:
when run locally on an iPhone 7 or 8 Plus: ~3s to load the model,
when archived, downloaded from TestFlight and run on an iPhone 7 or 8 Plus: ~9s,
when serving the model files from a locally run Node server and loading the model over HTTP using automl.loadImageClassification(modelUrl) (see the sketch after this list): ~17s to load the model on a Nexus 5X.
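A sketch of that HTTP comparison path, assuming the @tensorflow/tfjs-automl package; the server URL is a hypothetical placeholder:

```ts
import * as automl from '@tensorflow/tfjs-automl';

// Load the AutoML image classification model over HTTP from a local dev
// server. The URL below stands in for the author's actual LAN address.
async function loadFromServer(): Promise<automl.ImageClassificationModel> {
  const modelUrl = 'http://192.168.0.10:8080/model.json';
  return automl.loadImageClassification(modelUrl);
}
```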
Describe the expected behavior
I would expect the loadGraphModel method to resolve faster than the reported times.
The model is pretty simple, with only ~3k images used to train it, but it's hard for me to judge whether the observed load times are reasonable. I'd love for someone to let me know what kind of load performance could be expected from a 13MB model.
Also, if anyone could point me to how the model load time could be optimized in react-native, any steps that might have been missed that would affect the load time, or a better approach to using the Google Vision-trained (automl) model in react-native, I'd greatly appreciate it 🙂
Additionally, is there any proven way of loading the model without completely blocking the JS thread while it happens?
Let me know if any additional information would be helpful to resolving the issue 🙂