Add OffsetScaling predictor #87
Comments
Admittedly, this is a simple affine transformation, but it is one that is almost always needed when embedding an ML model into an optimization problem, so automating it adds a convenience factor. Moreover, I find it best practice to ship trained ML models with the preprocessing layers that encode and decode the inputs and outputs, respectively. See https://www.tensorflow.org/guide/keras/preprocessing_layers#benefits_of_doing_preprocessing_inside_the_model_at_inference_time. Supporting these types of layers helps to simplify the workflow and reduce the chance of modelling errors. For instance, if I train a Keras or PyTorch NN model and embed the normalization as a layer for inference, it would be ideal to have MathOptAI just read in that model so that I wouldn't need to worry about normalizing the variables. Otherwise, I would have to manually look up the scaling values in Keras or PyTorch and then input them manually as transformations in MathOptAI.
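For concreteness, the encode/decode steps are just elementwise affine maps. A minimal sketch, assuming a standard mean/std normalization (the values `μ` and `σ` below are placeholders, not taken from any particular trained model):

```julia
# Sketch of the affine pre-/post-processing (assumed mean/std normalization;
# all values are placeholders).
x = [1.0, 2.0, 3.0]              # raw inputs
μ = [0.5, 1.5, 2.5]              # per-feature offsets (means)
σ = [2.0, 2.0, 2.0]              # per-feature scales (standard deviations)

x_scaled = (x .- μ) ./ σ         # encode: normalize inputs before the network
# ... the trained network maps x_scaled to some scaled output y_scaled ...
y_scaled = x_scaled              # stand-in for the network output
y = y_scaled .* σ .+ μ           # decode: undo the normalization on the outputs
```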
What PyTorch normalization layers do you want support for?
Let's follow (F)lux and call this
That works. I have mostly used Keras in the past, so I am not sure what the equivalent layer is in PyTorch.
It probably just needs to be:
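As a minimal sketch, and not MathOptAI's actual API: the predictor only has to introduce output variables and the corresponding affine constraints. The helper name `add_offset_scaling` and the arguments `offset` and `factor` below are hypothetical placeholders.

```julia
# Hypothetical sketch in plain JuMP: add variables y with the affine
# constraints y = (x .- offset) ./ factor, written in multiplied-out form
# to avoid dividing by the scaling factors.
using JuMP

function add_offset_scaling(
    model::Model,
    x::Vector{VariableRef},
    offset::Vector{Float64},
    factor::Vector{Float64},
)
    n = length(x)
    y = @variable(model, [1:n], base_name = "scaled")
    @constraint(model, [i in 1:n], factor[i] * y[i] == x[i] - offset[i])
    return y
end

model = Model()
@variable(model, x[1:3])
y = add_offset_scaling(model, x, [0.5, 1.5, 2.5], [2.0, 2.0, 2.0])
```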
But @pulsipher thinks this is useful in #82.