frame drop while running imagenet.predict #67

Closed
micuat opened this issue Feb 25, 2018 · 4 comments
micuat commented Feb 25, 2018

I noticed during the workshop last week that the p5.js canvas frame rate drops significantly while running video classification. Here's example output from this code, with console.log(frameRate()) in draw():

44.2477876172749
62.11180126747634
58.139534876795196
60.240963845789416
4.321521175375056
14.684287814305357
59.52380948074774
61.72839506855966
58.479532172979155
59.171597624902525
54.34782608448077
65.78947370859808
60.240963845789416
59.8802395465879
60.97560972744388
7.41839762612301
64.93506491876796
60.975609781548265
56.17977527634279
63.29113926550177
62.499999987267074
56.17977527634279
60.606060613757485
60.97560972744388
57.803468233998
65.78947364561368
56.497175151796085
63.29113926550177
60.606060613757485
58.47953212321371
58.479532172979155
57.14285713715517
63.69426756050796
62.8930817669005
59.171597624902525

imagenet.predict(...) happens asynchronously, but I think deeplearn.js is using the GPU, which is blocking canvas rendering. Is this a technical limitation, or can we somehow keep updating the canvas while a classification job is running?
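For reference, the sketch in question looks roughly like this (a minimal reconstruction, not the linked example verbatim; the ml5.ImageNet constructor name, the 'SqueezeNet' argument, and the predict() argument order are assumptions based on the early-2018 ml5 API):

```javascript
// Minimal reconstruction of the kind of sketch that shows the frame drop.
let imagenet;
let video;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  imagenet = new ml5.ImageNet('SqueezeNet'); // assumed constructor/arguments
  classifyFrame();
}

function classifyFrame() {
  // predict() is asynchronous from JavaScript's point of view, but the GPU
  // work it queues still competes with canvas rendering on the main thread.
  imagenet.predict(video.elt, 10, (results) => {
    classifyFrame(); // start on the next frame as soon as this one finishes
  });
}

function draw() {
  image(video, 0, 0, width, height);
  console.log(frameRate()); // mostly ~60 fps, with drops like the output above
}
```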

micuat changed the title from "frame drop while running" to "frame drop while running imagenet.predict" on Feb 25, 2018
cvalenzuela (Member) commented:

The video classification example is an intensive task for the browser. I'm not 100% sure, but I think we might get past this using web workers.
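For what it's worth, the web-worker idea would look roughly like this (an illustrative sketch only: the worker file name and message shape are made up, and at the time deeplearn.js needed a WebGL context that wasn't available inside workers, so the model would likely fall back to slower CPU execution there):

```javascript
// Main thread: grab the current video frame and hand its raw pixels to a
// worker, so the classification work stays off the thread that renders
// the canvas.
const worker = new Worker('classify-worker.js'); // hypothetical file name

worker.onmessage = (e) => {
  console.log('classification results:', e.data);
};

function sendFrame(videoElement) {
  const buffer = document.createElement('canvas');
  buffer.width = videoElement.videoWidth;
  buffer.height = videoElement.videoHeight;
  const ctx = buffer.getContext('2d');
  ctx.drawImage(videoElement, 0, 0);
  const pixels = ctx.getImageData(0, 0, buffer.width, buffer.height);
  // Transfer the pixel buffer to the worker instead of copying it.
  worker.postMessage(
    { width: pixels.width, height: pixels.height, data: pixels.data.buffer },
    [pixels.data.buffer]
  );
}

// classify-worker.js (hypothetical): receive the pixels, run the model,
// post the results back.
// self.onmessage = (e) => {
//   const { width, height, data } = e.data;
//   // ...run the classifier here and postMessage(results) when done...
// };
```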


Versatilus commented Feb 26, 2018 via email

dariusk commented Feb 27, 2018

I profiled the given code, and at least on my MacBook Pro it does seem like rendering is being blocked somehow in the ~100 ms after a predict() is fired.

[Profiler screenshot: the predict() call stack on the left, followed by an Animation Frame entry taking ~76 ms]

The big spike on the left is the predict() call stack, and then we can see that the Animation Frame call in the middle is taking ~76 ms because drawImage takes ~75 ms (it usually takes only ~1 ms). Same with the readPixels call. It definitely seems like predict() is causing ordinary tasks to choke up, which I can totally believe given the monstrous size of the call stack! This seems like the kind of thing that would require a GPU-experienced programmer to make upstream changes to deeplearn.js and do some math magic to issue fewer GPU calls during a predict(). I'm not sure web workers would help on our end, but... maybe worth trying?
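A quick way to see the same stall without the DevTools timeline (a sketch, not the profiling setup used above; the 16 ms threshold is just one frame's budget at 60 fps):

```javascript
// Log any draw() call that overruns a 60 fps frame budget, to spot the
// frames that get stuck behind the predict() work.
function draw() {
  const t0 = performance.now();
  image(video, 0, 0, width, height);
  const dt = performance.now() - t0;
  if (dt > 16) {
    console.warn(`drawImage blocked for ${dt.toFixed(1)} ms`);
  }
}
```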

cvalenzuela (Member) commented:

This is now resolved. See #142.
