Hello, what about moving some aspects of the Layer logic into separate interfaces, to offload the Layer class?
For example, introduce ParametrizedLayer/TrainableLayer interfaces to encapsulate the isTrainable/paramCount logic; they would be implemented only by layers that actually have variables/can be trained:
public interface ParametrizedLayer {
    /**
     * Layer's variables
     */
    public val variables: List<Variable>
}

/**
 * Returns the number of parameters in the layer.
 * Declared as a top-level extension so every implementor gets it for free.
 */
public val ParametrizedLayer.paramCount: Int
    get() = variables.sumOf { it.shape.numElements() }.toInt()
public interface TrainableLayer : ParametrizedLayer {
    /**
     * True if the layer's weights can be changed during training;
     * false if the weights are frozen and cannot be updated.
     */
    public var isTrainable: Boolean
}
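For illustration, here is a minimal usage sketch, under the assumption that a model exposes its layers as a List<Layer> (the countParams and freezeAll helpers below are hypothetical, not part of this proposal):

// Count all parameters: only layers that actually hold variables participate.
fun countParams(layers: List<Layer>): Int =
    layers.filterIsInstance<ParametrizedLayer>().sumOf { it.paramCount }

// Freeze everything that opted into training via TrainableLayer.
fun freezeAll(layers: List<Layer>) {
    layers.filterIsInstance<TrainableLayer>().forEach { it.isTrainable = false }
}

Because the filtering happens on the static type, no NoGradient marker or runtime checks are needed.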
Benefits:
- The Layer class itself will become thinner and hence easier to grasp.
- Layers that are untrainable by nature (for example Flatten, Input, ActivationLayer) will not even have an isTrainable flag or a weight field.
- The main idea here is that the code Flatten().isTrainable = true logically doesn't make sense, and with these interfaces it will not compile, since Flatten would not implement TrainableLayer (see the sketch after this list).
- Instead of marking layers with NoGradient, it will be possible simply not to implement TrainableLayer, which works as whitelisting for training (instead of the current blacklisting) and is more explicit.
- It will be possible to remove the KGraph parameter from the Layer::build method.
- It will be possible to move the weight setter/getter to the model (example: knok16@4dc0286).

Reference: knok16@f5244b6
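To make the compile-time argument concrete, here is a sketch with simplified stand-in declarations (Flatten and Dense below are illustrations, not the real Kotlin-DL classes):

// Implements neither interface: no parameters, nothing to train.
class Flatten

// Opts into trainability by implementing TrainableLayer.
class Dense(private val kernel: Variable, private val bias: Variable) : TrainableLayer {
    override val variables: List<Variable> get() = listOf(kernel, bias)
    override var isTrainable: Boolean = true
}

fun example(dense: Dense, flatten: Flatten) {
    dense.isTrainable = false     // compiles: Dense implements TrainableLayer
    // flatten.isTrainable = true // does not compile: Flatten has no such member
}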
I like this idea in general, but I suggest postponing it to the 0.4 release.
What about arranging a call to discuss the proposals? Could you write to me on the kotlinlang Slack if you are ready to talk about it?
zaleslaw changed the title from "Additional typing for layers" to "[API] Additional typing for layers" on Sep 15, 2021
knok16 changed the title from "[API] Additional typing for layers" to "[API] [Layers] ParametrizedLayer/TrainableLayer interfaces to represent presence of parameters and trainability of the layer" on Sep 18, 2021