Remove warnings pytest models #6593
Conversation
ambujpawar commented Sep 16, 2022 (edited)
Closes #6155: Fix warnings in the tests
@ambujpawar Thanks for the PR. The tests are failing and look related. Could you please have a look? Thanks!
Thanks for the review. I tried to resolve those errors by modifying only the tests, but apparently that was not possible for the inceptionv3 model. Please have a look at the solution :)
Co-authored-by: Philip Meier <github.pmeier@posteo.de>
LGTM if CI is green. Thanks @ambujpawar! The current failing jobs
- binary_linux_conda_py3.9_cu116
- cmake_macos_cpu
- cmake_windows_cpu
- cmake_windows_gpu
are unrelated.
The failing tests seem unrelated to the PR, and I see that the related tasks are already listed in issues.
The following could be related, but I'll leave that up to @datumbox since he knows the model tests better than I do.
I think it is unrelated, as it was already happening in #6589 before this PR was even submitted.
The highlighted failure on the fasterrcnn model is not related. The changes LGTM.
Only one question which we could resolve elsewhere.
```
@@ -142,6 +134,8 @@ def __init__(
            QuantizableInceptionE,
            QuantizableInceptionAux,
        ],
        *args,
```
@pmeier Since you are the most Pythonic person I know, any idea why this works? We pass the `inception_blocks` keyword parameter before `*args`. We do exactly the same thing on `QuantizableGoogLeNet`. Could this be because we don't actually have any positional args in these classes? I was expecting the call to emit an error for putting positional arguments after named parameters, but it doesn't.
This is one of the Python quirks that is syntactically allowed but should never be used because of its very weird behavior. TL;DR: `foo(baz="baz", "bar")` is not valid syntax, while `foo(baz="baz", *["bar"])` is. See python/cpython#82741.
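The quirk is easy to verify in a few lines (a minimal sketch with a hypothetical `foo`, not torchvision code):

```python
# Demonstration: a literal positional after a keyword is a SyntaxError,
# but *-unpacking after a keyword is accepted and still binds positionally.

def foo(bar, baz):
    return (bar, baz)

# foo(baz="baz", "bar")  # SyntaxError: positional argument follows keyword argument
result = foo(baz="baz", *["bar"])  # valid: the unpacked value binds to `bar`
print(result)  # → ('bar', 'baz')
```

The unpacked arguments are bound to the positional parameters in order, no matter where the `*` appears in the call, which is exactly why the pattern reads so confusingly.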
I would advise putting `*args` before the `inception_blocks=...` parameter to avoid confusion. Same for all other occurrences of this pattern. I don't know of a linter that checks this for us, though.
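The suggested reordering could look like this (a sketch with hypothetical stand-in classes, not the actual torchvision signatures):

```python
# Preferred ordering: unpack *args first, then pass explicit keyword arguments.

class Base:
    def __init__(self, num_classes=10, blocks=None):
        self.num_classes = num_classes
        self.blocks = blocks


class Child(Base):
    def __init__(self, *args, **kwargs):
        # *args comes before the explicit keyword, avoiding the confusing
        # keyword-before-unpacking pattern discussed above
        super().__init__(*args, blocks=["A", "B"], **kwargs)


model = Child(5)
print(model.num_classes, model.blocks)  # → 5 ['A', 'B']
```

Both orderings behave the same at runtime here; the reordering is purely for readability.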
Done! 👍
LGTM now?
Not sure if we want to do it in this PR or split it off into a follow-up, cc @datumbox. Nevertheless, this bothered me a bit too much, TBH. I wrote a small flake8 plugin to find all occurrences of this. Put the following in a `nuakw.py` file in the root of the project:
```python
import ast
import contextlib


class NodeVisitor(ast.NodeVisitor):
    def __init__(self):
        self.errors = []

    def visit_Call(self, node: ast.Call):
        with contextlib.suppress(StopIteration):
            starred = next(starred for starred in node.args if isinstance(starred, ast.Starred))
            if any(
                (keyword.value.lineno, keyword.value.col_offset) < (starred.lineno, starred.col_offset)
                for keyword in node.keywords
            ):
                self.errors.append(
                    (node.lineno, node.col_offset, "NUAKW Argument unpacking after keyword argument", type(self))
                )
        self.generic_visit(node)


class NoUnpackingAfterKeyword:
    name = "no_unpacking_after_keyword"
    version = "0.0.1"

    def __init__(self, tree: ast.AST):
        self._tree = tree

    def run(self):
        visitor = NodeVisitor()
        visitor.visit(self._tree)
        yield from visitor.errors
```
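The same AST check can also be run standalone, without flake8 (a minimal sketch; `find_unpacking_after_keyword` is a hypothetical helper mirroring the plugin's logic):

```python
# Standalone version of the plugin's check: walk the AST of a source string
# and flag every call where a keyword argument appears before a *-unpacking.
import ast


def find_unpacking_after_keyword(source: str):
    """Return (lineno, col_offset) of each call with *-unpacking after a keyword."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, ast.Call):
            continue
        starred = [a for a in node.args if isinstance(a, ast.Starred)]
        if starred and any(
            (kw.value.lineno, kw.value.col_offset) < (starred[0].lineno, starred[0].col_offset)
            for kw in node.keywords
        ):
            hits.append((node.lineno, node.col_offset))
    return hits


print(find_unpacking_after_keyword("f(baz='baz', *args)"))  # flagged: [(1, 0)]
print(find_unpacking_after_keyword("f(*args, baz='baz')"))  # clean: []
```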
Now, put the following at the bottom of `setup.cfg`:

```ini
[flake8:local-plugins]
extension =
    NUAKW = nuakw:NoUnpackingAfterKeyword
paths =
    ./
```
Running `flake8 torchvision` yields the following errors:
```
torchvision/models/quantization/inception.py:44:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/inception.py:55:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/inception.py:66:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/inception.py:77:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/inception.py:88:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/inception.py:122:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/googlenet.py:42:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/googlenet.py:53:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/googlenet.py:77:9: NUAKW Argument unpacking after keyword argument
torchvision/models/quantization/mobilenetv3.py:86:9: NUAKW Argument unpacking after keyword argument
```
If we fix one, we should fix all.
This seemed like a short fix, so I have implemented it in this PR itself :)
LGTM (again) @ambujpawar. Thank you very much for being ultra patient and flexible with us. You did end up fixing many more issues than originally described. I do apologise for this, but I think your PR leaves our codebase in a much better state than before.
@pmeier Thanks a lot for the explanation you gave on the thread above. I agree that the aforementioned syntax is problematic and should be avoided. We can review offline whether its avoidance should be enforced via a linter or via our code reviews, to avoid further delays on this work.
I just kicked the CI once more, we should be able to merge this once everything looks green. :)
Hey @datumbox! You merged this PR, but no labels were added. The list of valid labels is available at https://github.com/pytorch/vision/blob/main/.github/process_commit.py
Thanks for the kind words. Looking forward to solving more issues!
Summary:
- ADD: init_weights config for googlenet
- Fix: Inception and googlenet warnings
- Fix: warning in test_datasets.py
- Fix: Formatting error with ufmt
- Fix: Failing tests in quantized_classification_model
- Update test/test_models.py to make googlenet in 1 line
- Refactor: Change inception quantisation class initialization to use args/kwargs
- Resolve mypy issue
- Move *args before inception_blocks
- Move args keywords before other arguments

Reviewed By: NicolasHug
Differential Revision: D39765307
fbshipit-source-id: 2b52fefd4c6c71a07d180247f098e44458e66e74
Co-authored-by: Philip Meier <github.pmeier@posteo.de>
Co-authored-by: Ambuj Pawar <your_email@abc.example>
Co-authored-by: Vasilis Vryniotis <datumbox@users.noreply.github.com>