Browser support (discussion) #3
Comments
Very cool! I'll need to read up more on how that would work, but the fact that it loads is a good start. Also, the Colab notebook seems to work fairly well for training with the split_data param I added; I need to test that more, though.
Sharing some great progress on the browser runtime:
The prediction runs slowly on my 13-inch MacBook Pro (iGPU), but I will make a test on a PC with a GPU. In the zip file you'll find the …
Actually spent the whole night getting that to work, but totally worth it.
Great stuff! Yeah, once you get on a roll it's hard to stop. Get some sleep!
Kept my promise and made the web browser version possible. Get the example tensor here (it's so big I had to use git-lfs): https://github.com/mishushakov/GuitarLSTM-browser/raw/master/samples/tensor.json
If you're interested in how it all works, feel free to fork the repo: https://github.com/mishushakov/GuitarLSTM-browser
Thank you everyone for your help in making this possible 😃
So cool, it even works on my iPhone lol
@mishushakov Can't wait to try it out! Nice work!
On the training side, it is possible to convert the TF.js models back to Keras (h5), as described at https://github.com/tensorflow/tfjs/tree/master/tfjs-converter#javascript-to-python
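For reference, the converter page linked above documents a CLI invocation along these lines; the paths here are placeholders, and the `tensorflowjs` pip package must be installed:

```shell
# Convert a TF.js Layers model (model.json + weight shards)
# back into a single Keras HDF5 file.
# "tfjs_model/model.json" and "keras_model.h5" are example paths.
tensorflowjs_converter \
    --input_format=tfjs_layers_model \
    --output_format=keras \
    tfjs_model/model.json \
    keras_model.h5
```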
I think so, but for the plugin I was planning on using the JSON format for the models anyway. H5 is just another data format; as long as all the weights are in the JSON file, we can make it work. And we can work together to make sure the models trained from the browser can be loaded into the future plugin. H5 is nice because it compresses the data, but I think JSON is the better option because it's readable in any text editor, and I'm more familiar with loading JSON in C++.
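As a minimal sketch of the JSON idea: the layer names and weight values below are made up for illustration, but they show how a plain-JSON weight file stays human-readable and round-trips with the standard library alone, so any text-based loader (e.g. a C++ plugin) could parse it.

```python
import json

# Hypothetical weight tensors, stored as nested lists keyed by layer name.
# These names and numbers are invented for the example.
weights = {
    "lstm/kernel": [[0.1, -0.2], [0.3, 0.05]],  # 2x2 weight matrix
    "lstm/bias": [0.0, 0.01],
    "dense/kernel": [[0.7], [-0.4]],
    "dense/bias": [0.2],
}

# Write the weights to a JSON file (human-readable, unlike HDF5).
with open("weights.json", "w") as f:
    json.dump(weights, f, indent=2)

# Read them back, as the plugin would on load.
with open("weights.json") as f:
    restored = json.load(f)

assert restored == weights
print(len(restored))  # → 4 weight tensors
```

The trade-off mentioned above is real: JSON is larger on disk than compressed HDF5, but it needs no extra parsing library on the C++ side beyond a JSON reader.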
@GuitarML Not sure if that's a good idea. Once you have the h5, you can load the model in every version of TensorFlow (you read it right: TensorFlow, not just Keras). If this still isn't enough, you could use tensorflow-onnx to convert your model to ONNX and load it with ONNX Runtime. I've actually been wondering whether we can make both networks run in your plugin.
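A hedged sketch of the tensorflow-onnx route suggested above; `model.h5` is a placeholder path, and the `tf2onnx` pip package is assumed to be installed:

```shell
# Convert a Keras .h5 model to ONNX with tf2onnx.
# The resulting model.onnx can then be loaded by ONNX Runtime
# from C++ or Python.
python -m tf2onnx.convert \
    --keras model.h5 \
    --output model.onnx
```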
Hey there,
I was able to load the model successfully in the browser using TensorFlow.js.
To convert the models, you can use the TensorFlow.js converter.
This issue is a backlog for inferencing and (possibly) training models in browsers.