Improving the layer Description. #734
Conversation
I have added clear information about what the layer does. It was not specified earlier, so users had trouble understanding what it does and when to use it.
@@ -22,7 +22,9 @@
 @keras.utils.register_keras_serializable(package="keras_nlp")
 class TokenAndPositionEmbedding(keras.layers.Layer):
-    """A layer which sums a token and position embedding.
+    """Token and position embeddings are ways of representing words and their order in a sentence
The first line of a docstring should always be a single sentence fragment <80 characters. Let's leave this unchanged
We could add a description to the next paragraph though! Maybe....
Token and position embeddings give a dense representation of the
input tokens, and their position in the sequence respectively. This
layer creates a `keras.layers.Embedding` token embedding and a
`keras_nlp.layers.PositionEmbedding` position embedding and sums
their output when called. This layer assumes that the last dimension in
the input corresponds to the sequence dimension.
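As a rough illustration of the behavior the suggested docstring describes (a token embedding and a position embedding summed along the sequence dimension), here is a minimal NumPy sketch. The table names and sizes are hypothetical stand-ins, not the keras_nlp implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, seq_len, dim = 100, 8, 16
# Stand-in for the `keras.layers.Embedding` token embedding table.
token_table = rng.normal(size=(vocab_size, dim))
# Stand-in for the `keras_nlp.layers.PositionEmbedding` table.
position_table = rng.normal(size=(seq_len, dim))

def token_and_position_embed(token_ids):
    # token_ids: (batch, seq_len) integer array; the last axis is the
    # sequence axis, matching the docstring's assumption.
    token_emb = token_table[token_ids]               # (batch, seq_len, dim)
    position_emb = position_table[np.newaxis, :, :]  # broadcast over batch
    return token_emb + position_emb                  # outputs are summed

ids = rng.integers(0, vocab_size, size=(2, seq_len))
out = token_and_position_embed(ids)
print(out.shape)  # (2, 8, 16)
```

Each output vector is simply the token's embedding plus the embedding for its position in the sequence, which is why the layer can describe both what a token is and where it sits.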
Updated the docstring as suggested by the reviewer.
@@ -23,7 +23,11 @@
 @keras.utils.register_keras_serializable(package="keras_nlp")
 class TokenAndPositionEmbedding(keras.layers.Layer):
     """A layer which sums a token and position embedding.
+
+    Token and position embeddings are ways of representing words and their order in a sentence
- Add a newline between the docstring summary (the first line) and this one.
- Let's delete "so that a machine learning model like a neural network can understand them" that feels a little under technical.
- Please format this so line lengths are <80 characters.
    Token and position embeddings are ways of representing words and their order in a sentence
    so that a machine learning model like a neural network can understand them. This
    layer create a `keras.layers.Embedding` token embedding and a
create -> creates
I made some changes as the reviewer suggested.
Thanks!
This broke on keras-team#734