
Gemma: Add logit soft-capping to score function. #1712

Merged (1 commit) Jul 26, 2024

Conversation

RyanMullins
Contributor

Gemma 2 added logit soft-capping to .call_with_cache() in #1673, but this was not mirrored in the .score() function, so the logits, loss, and derived quantities (e.g., gradients) differ from those returned by .generate(). This PR brings the two back into parity.
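For context, logit soft-capping in Gemma 2 squashes logits into a bounded range with a scaled tanh, which keeps attention and final-layer logits from growing without bound. A minimal NumPy sketch of the idea (the function name and the cap value of 30.0 here are illustrative, not the exact KerasNLP implementation):

```python
import numpy as np

def soft_cap_logits(logits, cap):
    """Soft-cap logits into the open interval (-cap, cap) via a scaled tanh.

    Small logits pass through almost unchanged; large logits saturate
    smoothly toward +/-cap, so gradients stay finite everywhere.
    """
    logits = np.asarray(logits, dtype=np.float64)
    return cap * np.tanh(logits / cap)

logits = np.array([-100.0, -5.0, 0.0, 5.0, 100.0])
capped = soft_cap_logits(logits, cap=30.0)
```

Because soft-capping is applied before the loss is computed, a scoring path that skips it produces different logits (and therefore different losses and gradients) than the generation path for the same inputs, which is the inconsistency this PR fixes.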

@github-actions github-actions bot added the Gemma Gemma model specific issues label Jul 26, 2024
Member

@mattdangerw mattdangerw left a comment


lgtm! will pull in when tests finish

@mattdangerw mattdangerw merged commit fa0fbb7 into keras-team:master Jul 26, 2024
8 checks passed
2 participants