Improve error handling for non-keras model loading attempts #1577
Conversation
LGTM
LGTM, thank you!
keras_nlp/models/task_test.py (Outdated)

@@ -69,6 +69,9 @@ def test_from_preset_errors(self):
        with self.assertRaises(ValueError):
            # No loading on an incorrect class.
            BertClassifier.from_preset("gpt2_base_en", load_weights=False)
        with self.assertRaises(FileNotFoundError):
In general, as a best practice, unit tests should use assertRaisesRegex, because sometimes a different exception of the same type can be raised (not the one you expect). In this case it doesn't matter much because the exception class is fairly specific.
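The reviewer's point can be illustrated with a small standalone unittest sketch (the class and test names here are illustrative, not taken from the KerasNLP test suite):

```python
import unittest


class ExceptionMessageTest(unittest.TestCase):
    def test_plain_assert_raises(self):
        # assertRaises passes for ANY ValueError, even one raised for a
        # completely different reason than the test intended to cover.
        with self.assertRaises(ValueError):
            int("not a number")

    def test_assert_raises_regex(self):
        # assertRaisesRegex additionally matches the error message against
        # a regex, so an unrelated ValueError would fail the test.
        with self.assertRaisesRegex(ValueError, "invalid literal"):
            int("not a number")
```

Matching on a fragment of the message pins the test to the specific failure path, which is why it is preferred when the exception type alone is too broad.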
Makes sense! I updated these unit tests to use assertRaisesRegex.
This PR addresses #1574. At the moment, from_preset can only load Keras models, so if a user attempts to load a non-Keras model, we should inform them that only Keras models are valid.
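The shape of the fix can be sketched as a pre-load check that fails with a descriptive FileNotFoundError. This is a minimal illustration, not the actual KerasNLP implementation; the function name, directory layout, and message wording are assumptions:

```python
import os

# Hypothetical config filename; KerasNLP presets store model metadata in a
# JSON config, which a non-Keras checkpoint directory would not contain.
CONFIG_FILE = "config.json"


def check_keras_preset(preset_dir):
    """Raise a descriptive error if `preset_dir` is not a Keras preset.

    Illustrative sketch: verify the expected config file exists before
    attempting to load, so the user gets an actionable message instead of
    a generic failure deep inside the loading code.
    """
    config_path = os.path.join(preset_dir, CONFIG_FILE)
    if not os.path.exists(config_path):
        raise FileNotFoundError(
            f"`{CONFIG_FILE}` is missing from the preset directory "
            f"`{preset_dir}`. `from_preset()` can only load models saved "
            "with Keras; make sure the preset is a Keras model."
        )
    return config_path
```

A test would then assert on a fragment of this message with assertRaisesRegex, as discussed in the review above.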