
Added TokenAndPositionEmbedding Class #91

Merged
merged 4 commits into from
Apr 16, 2022

Conversation

Contributor

@adhadse adhadse commented Apr 7, 2022

Resolves #85.
Hey @mattdangerw, this PR is open for review and feedback.
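For readers skimming the thread, the idea of the proposed layer can be sketched in plain NumPy (a hedged illustration only, not the actual keras_nlp implementation; the tables here are random, whereas the real layer learns them):

```python
import numpy as np

rng = np.random.default_rng(0)

vocabulary_size, sequence_length, embedding_dim = 100, 8, 16

# Learnable weight tables in the real layer; random here for illustration.
token_table = rng.normal(size=(vocabulary_size, embedding_dim))
position_table = rng.normal(size=(sequence_length, embedding_dim))

def token_and_position_embedding(token_ids):
    """Sum a token embedding lookup with a position embedding lookup.

    token_ids: integer array of shape (batch, sequence).
    Returns a float array of shape (batch, sequence, embedding_dim).
    """
    token_emb = token_table[token_ids]  # (batch, seq, emb)
    position_emb = position_table[np.arange(token_ids.shape[1])]  # (seq, emb)
    return token_emb + position_emb  # position_emb broadcasts over the batch

token_ids = rng.integers(0, vocabulary_size, size=(2, sequence_length))
out = token_and_position_embedding(token_ids)
print(out.shape)  # (2, 8, 16)
```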

Member

@mattdangerw mattdangerw left a comment

Thanks! Left some initial comments.

Review comments left on keras_nlp/layers/token_and_position_embedding.py and keras_nlp/layers/token_and_position_embedding_test.py (outdated, resolved).
@adhadse adhadse requested a review from mattdangerw April 8, 2022 11:02
Member

@mattdangerw mattdangerw left a comment

Thanks! Few more comments.

Review comments left on keras_nlp/layers/token_and_position_embedding.py and keras_nlp/layers/token_and_position_embedding_test.py (outdated, resolved).
@adhadse
Contributor Author

adhadse commented Apr 12, 2022

Hey @mattdangerw I have made the proposed changes.

@adhadse changed the title from "Added TokenAndPositionEmbedding Class and test for it's config" to "Added TokenAndPositionEmbedding Class" on Apr 12, 2022
Contributor

@haifeng-jin haifeng-jin left a comment

Please ignore this comment.

Member

@mattdangerw mattdangerw left a comment

Thanks! Last few comments from me I think!

keras_nlp/layers/token_and_position_embedding.py (quoted docstring context):

    sequence.

    Args:
        `vocabulary_size`: The size of the vocabulary (should be no larger
Member

don't backtick the arg names themselves

keras_nlp/layers/token_and_position_embedding.py (quoted context):

    class TokenAndPositionEmbedding(keras.layers.Layer):
        """A layer which sums a token and position embedding.

        This class assumes that in the input tensor, the last dimension corresponds
Member

I think this description is incorrect. In general the assumption should be an input of shape (batch_dim, sequence_dim), right? That is, the last dimension is the sequence dim.

And the output will be (batch_dim, sequence_dim, embedding_dim).

Contributor Author

Yes, after I updated the tests, this does seem to be incorrect. This will be fixed.
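The shape contract agreed on above can be checked concretely (a small NumPy sketch; the dimension names come from the comment and the sizes are arbitrary, not taken from the library):

```python
import numpy as np

batch_dim, sequence_dim, embedding_dim = 4, 10, 32

# Input to the layer: integer token ids of shape (batch_dim, sequence_dim).
token_ids = np.random.default_rng(0).integers(0, 200, size=(batch_dim, sequence_dim))

# An embedding lookup maps each id to a vector of length embedding_dim,
# so the output gains a trailing embedding axis.
embedding_table = np.zeros((200, embedding_dim))
output = embedding_table[token_ids]

print(token_ids.shape)  # (4, 10)
print(output.shape)     # (4, 10, 32)
```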

Review comment on keras_nlp/layers/token_and_position_embedding.py (outdated, resolved).
@adhadse
Contributor Author

adhadse commented Apr 14, 2022

Hey @mattdangerw, I hope these are the final changes :) I tried formatting by running the scripts, but for some reason the isort lint check fails for layers/__init__.py even after running the command from the terminal. Might need a deeper look into that (#104).

@mattdangerw
Member

@adhadse thanks! LGTM! I'll look at the linting issue later today; if I see a fix, I can just add it directly here and merge.

Looks like you have a merge conflict, can you rebase over latest master?

@mattdangerw
Member

Formatting issue fixed in #117. If you rebase and possibly reformat, this should be good to go.

@adhadse
Contributor Author

adhadse commented Apr 15, 2022

Went through some hiccups; first time doing a rebase.
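For anyone else hitting the same hiccups, the usual rebase-over-upstream flow looks like the following (a command sketch only; the remote name `upstream` and the `<branch-name>` placeholder are assumptions about the local setup, not taken from this thread):

```shell
# Fetch the latest upstream master (assumes a remote named `upstream`
# pointing at the keras-team repository).
git fetch upstream

# Replay this branch's commits on top of upstream/master.
git rebase upstream/master

# If conflicts appear, resolve them, then:
#   git add <resolved-files> && git rebase --continue

# Force-push the rebased branch to the fork backing the PR.
git push --force-with-lease origin <branch-name>
```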

@mattdangerw
Member

Thanks! Looks like there were a few comments marked resolved but unaddressed. Cleaned this up, but please keep an eye out for that in the future.

@mattdangerw mattdangerw merged commit 185de9c into keras-team:master Apr 16, 2022
adhadse added a commit to adhadse/keras-nlp that referenced this pull request Sep 17, 2022
* Added TokenAndPositionEmbedding Class and tests

* Added ragged tensor test, and model save test for TokenAndPositionEmbedding and bug fixes

* Added dense input tests for TokenAndPositionEmbedding

* Fix last few comments

Co-authored-by: Matt Watson <1389937+mattdangerw@users.noreply.github.com>
grasskin added a commit that referenced this pull request Jun 27, 2024
Add Gemma2 building blocks and presets.

---------

Co-authored-by: Matt Watson <mattdangerw@gmail.com>
grasskin added a commit that referenced this pull request Jun 27, 2024
Add Gemma2 building blocks and presets.

---------

Co-authored-by: Matt Watson <1389937+mattdangerw@users.noreply.github.com>
grasskin added a commit that referenced this pull request Jun 27, 2024
* Add Gemma2 to Keras (#91)

Add Gemma2 building blocks and presets.

---------

Co-authored-by: Matt Watson <1389937+mattdangerw@users.noreply.github.com>

* Set presets to one

* Remove extra preset test

* Pin Keras version to 3.3.3

---------

Co-authored-by: Matt Watson <1389937+mattdangerw@users.noreply.github.com>
Labels: None yet
Projects: None yet
Development

Successfully merging this pull request may close these issues.

Add a token and position embedding layer
3 participants