Add loaders for ELU, RELU activation layers (#78) #115
Conversation
kokorins
commented
Jun 14, 2021
- missing activation layer loaders (ELU, ReLU) added (a loader sketch is shown below)
- two distinct examples with save/load added to the examples folder (trying to reach 0.7 accuracy)
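For context, here is a minimal sketch of what one of the missing loaders could look like, modeled on the createReLULayer helper changed later in this PR. The `config.alpha` field, the 1.0f fallback, and the exact ELU constructor parameters are assumptions for illustration, not the merged code:

```kotlin
// Hypothetical ELU loader sketch (not the merged implementation).
private fun createELULayer(config: LayerConfig, name: String): Layer {
    return ELU(
        alpha = config.alpha?.toFloat() ?: 1.0f, // fall back to the common default when the config omits alpha
        name = name
    )
}
```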
Really great and interesting examples
import org.jetbrains.kotlinx.dl.dataset.Dataset
import java.io.File

object SaveLoadExample {
Could you please add some KDoc here with a high-level description of this object (is it a helper object?)
@kokorins SaveLoadExample should imo be a class, since, as we can see, it can be parametrized
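Taken together, the two suggestions above could look roughly like this; the class name is kept, but the constructor parameters, default path, and KDoc wording are illustrative only:

```kotlin
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.dataset.Dataset

/**
 * Helper that trains a small LeNet-5 variant with the given [activation],
 * saves it under [modelPath], reloads it, and evaluates the restored model.
 */
class SaveLoadExample(
    private val activation: Activations = Activations.Relu,
    private val modelPath: String = "savedmodels/lenet_saveload"
) {
    fun run(dataset: Dataset) {
        // train, save, reload and evaluate here, as in the PR's example code
    }
}
```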
@@ -27,7 +27,7 @@ private val biasInitializer = GlorotUniform(SEED)
 /**
  * Returns classic LeNet-5 model with minor improvements (Sigmoid activation -> ReLU activation, AvgPool layer -> MaxPool layer).
  */
-fun lenet5(): Sequential = Sequential.of(
+fun lenet5(sigmoidActivations:Activations = Activations.Relu): Sequential = Sequential.of(
Is it a helpful change? Are you going to run it with another Activation function somewhere?
btw its name is misleading, as it is an activation, not a sigmoid
Agreed, I just followed the comment in the line above (Sigmoid activation -> ReLU activation) ;-)
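If the rename suggested in this thread were applied, the signature and the call sites in the two save/load examples might look like the sketch below. The parameter name, the elided layer list, and the call sites are illustrative only, not the merged code:

```kotlin
import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.activation.Activations

// Illustrative rename only; the layer list is elided, the real body stays as in the PR.
fun lenet5(activation: Activations = Activations.Relu): Sequential = Sequential.of(
    /* same layers as before, each using `activation` where the model previously hard-coded ReLU */
)

// Possible call sites for the two save/load examples:
val reluLenet = lenet5()                            // default ReLU variant
val eluLenet = lenet5(activation = Activations.Elu) // ELU variant added by this PR
```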
Just my thoughts, think about them @kokorins. Btw @zaleslaw, we can see that a linting specification for code style is needed: e.g. some files have no newlines at the end (while they are present in most of the files), and there are other inconsistencies in this area that should be caught by some automatic rules.
@@ -398,7 +398,7 @@ private fun createActivationLayer(config: LayerConfig, name: String): Layer {

 private fun createReLULayer(config: LayerConfig, name: String): Layer {
     return ReLU(
-        maxValue = config.max_value!!.toFloat(),
+        maxValue = config.max_value?.toFloat(),
shouldn't it be left as it was, since here the config is supposed to have this value?
I don't fully understand the notation: maxValue is nullable, whereas the other fields are not.
I was thinking that it shouldn't be nullable in the config (and then the exception on !! would be expected, as before), but now I see that it might be configured as null and that null has a special meaning. Btw @zaleslaw is the author of ReLU, so he would be the best source of an answer here on how a nullable value from the config should be interpreted.
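For reference, a minimal sketch of the two readings discussed here, assuming (as in Keras) that a missing max_value means an unbounded ReLU and that KotlinDL's ReLU accepts a nullable maxValue; the field and class names simply mirror the diff above:

```kotlin
// Sketch only, not the merged code.
private fun createReLULayer(config: LayerConfig, name: String): Layer {
    // (a) Keras-style: a missing max_value means an unbounded ReLU, so propagate null.
    val maxValue = config.max_value?.toFloat()

    // (b) Strict alternative: require the config to carry the value and fail loudly otherwise.
    // val maxValue = requireNotNull(config.max_value) { "max_value missing for ReLU layer $name" }.toFloat()

    return ReLU(maxValue = maxValue, name = name)
}
```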
import java.io.File

private const val PATH_TO_MODEL = "savedmodels/elu_lenet_saveload"
imo it's MODEL_SAVE_PATH, as at the beginning this file does not exist and it's going to be created by the save process (and then read, but saving is as important as reading)
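A minimal sketch of the suggested rename; the constant name is the reviewer's proposal, and the save/load calls are kept as hypothetical comments since the exact API is whatever KotlinDL exposes in this version:

```kotlin
// Illustrative rename: the path is where the model is *written* first, then read back.
private const val MODEL_SAVE_PATH = "savedmodels/elu_lenet_saveload"

// Later in the example (hypothetical call shape, not verified against the PR):
// model.save(File(MODEL_SAVE_PATH), writingMode = WritingMode.OVERRIDE)
// val restored = TensorFlowInferenceModel.load(File(MODEL_SAVE_PATH))
```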
Can't re-open this one, please continue with: #128