
Add loaders for ELU, RELU activation layers (#78) #115

Closed · wants to merge 1 commit

Conversation

kokorins (Contributor)

  • missing activation layer loaders added (a short sketch of what such a loader looks like follows below)
  • two distinct examples with save/load added to the examples folder (trying to reach 0.7 accuracy)
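For illustration, a minimal sketch of an ELU loader written in the style of the createReLULayer helper quoted further down; the LayerConfig field `alpha` and the ELU constructor parameters are assumptions, not code from this PR:

private fun createELULayer(config: LayerConfig, name: String): Layer {
    return ELU(
        // Keras defaults ELU's alpha to 1.0 when it is absent from the config
        alpha = config.alpha?.toFloat() ?: 1.0f,
        name = name
    )
}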

@zaleslaw (Collaborator) left a comment:

Really great and interesting examples

import org.jetbrains.kotlinx.dl.dataset.Dataset
import java.io.File

object SaveLoadExample {
Collaborator:

Could you please add some KDocs here with a high-level description of this object (is it a helper object?)

Contributor:

@kokorins SaveLoadExample should imo be a class, as we can see it can be parametrized
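A minimal sketch of how the two remarks above (add KDoc, make it a parametrized class) could be combined; the constructor parameters, KDoc wording, and import paths are assumptions, not the actual code from this PR:

import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.dataset.Dataset

/**
 * Helper that trains [model] on [dataset], saves it to [modelPath],
 * then restores the saved copy and evaluates it again.
 */
class SaveLoadExample(private val modelPath: String, private val epochs: Int = 3) {
    fun run(model: Sequential, dataset: Dataset) {
        // compile / fit / save to modelPath / load / evaluate would go here
    }
}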

@@ -27,7 +27,7 @@ private val biasInitializer = GlorotUniform(SEED)
/**
 * Returns classic LeNet-5 model with minor improvements (Sigmoid activation -> ReLU activation, AvgPool layer -> MaxPool layer).
 */
-fun lenet5(): Sequential = Sequential.of(
+fun lenet5(sigmoidActivations:Activations = Activations.Relu): Sequential = Sequential.of(
Collaborator:

Is it a helpful change? Are you going to run it with another Activation function somewhere?

Contributor:

btw its name is misleading, as it is an activation, not a sigmoid

@kokorins (Contributor, Author) · Jun 15, 2021:

Agreed, I just followed the comment on the line above: Sigmoid activation -> ReLU activation ;-)
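A possible follow-up on the naming remark, as a sketch (only the parameter name changes relative to the diff above; the layer stack itself is elided):

// The parameter now describes what it is (an activation), not the
// activation it replaces (sigmoid).
fun lenet5(activation: Activations = Activations.Relu): Sequential = Sequential.of(
    // ... same layer stack as before, with `activation` used where
    // Activations.Relu was hard-coded
)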

@zaleslaw linked an issue on Jun 15, 2021 that may be closed by this pull request (5 tasks)
@zaleslaw added the "Review" (This PR is under review) label on Jun 15, 2021
@avan1235 (Contributor) left a comment:

Just my thoughts, think about them @kokorins. Btw @zaleslaw, we can see that a linting specification for code style is needed, as e.g. there are no newlines at the end of some files (while present in most of the files) and other inconsistencies in this area are present (which should be caught by some automatic rules).

@@ -398,7 +398,7 @@ private fun createActivationLayer(config: LayerConfig, name: String): Layer {

private fun createReLULayer(config: LayerConfig, name: String): Layer {
    return ReLU(
-       maxValue = config.max_value!!.toFloat(),
+       maxValue = config.max_value?.toFloat(),
@avan1235 (Contributor) · Jun 15, 2021:

shouldn't it be left as it was, since here the config is supposed to have this value?

@kokorins (Contributor, Author):

I don't fully understand the notation: maxValue is nullable, whereas the other fields are not.

Contributor:

I was thinking that it shouldn't be nullable in the config (and then the exception on !! would be expected, as it was), but now I see that it might be configured as null, and that this has a special meaning. Btw, @zaleslaw is the author of ReLU, so he would be the best source for an answer on how a nullable value from the config should be interpreted.
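For reference, a self-contained sketch of how a nullable maxValue is commonly interpreted (plain Floats rather than the actual TensorFlow ops, and negative_slope is ignored): null simply means "no upper cap".

fun reluValue(x: Float, maxValue: Float? = null, threshold: Float = 0f): Float {
    // Standard ReLU below the threshold ...
    val activated = if (x > threshold) x else 0f
    // ... optionally capped from above; a null maxValue leaves it uncapped.
    return if (maxValue != null) minOf(activated, maxValue) else activated
}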



import java.io.File


private const val PATH_TO_MODEL = "savedmodels/elu_lenet_saveload"
Contributor:

imo it's MODEL_SAVE_PATH, as at the beginning this file does not exist and it's going to be created by the save process (and then read, but saving is as important as reading)
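A sketch of the suggested rename; the mkdirs() helper is illustrative and not taken from the PR:

// The constant now says where the model will be written, not where an
// existing model lives.
private const val MODEL_SAVE_PATH = "savedmodels/elu_lenet_saveload"

// Make sure the target directory exists before the model is saved into it.
fun ensureSaveDir(): File = File(MODEL_SAVE_PATH).also { it.mkdirs() }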

Commit by @kokorins:
- missing activation layer loaders added
- two distinct examples with save/load added to the examples folder (trying to reach 0.7 accuracy)

@kokorins (Contributor, Author):

Can't re-open this one, please continue with: #128

Labels: Review (This PR is under review)

Successfully merging this pull request may close these issues: Add missed loaders for the ReLU and ELU activation layers

3 participants