
[ADD] Stricter checks mypy #240

Merged · 19 commits · Jun 18, 2021

Conversation

@ravinkohli (Contributor) commented on May 25, 2021

This PR partially addresses #230.

  1. Added the mypy checks warn-redundant-casts, warn-return-any, and warn-unreachable (a short sketch of what these flags catch follows below).
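For context, here is a rough, hypothetical illustration (not code from this repository) of the kinds of issues each of these flags reports once they are enabled in the mypy configuration or on the command line:

```python
# Hypothetical examples, not autoPyTorch code: what each stricter flag catches
# when mypy runs with --warn-redundant-casts --warn-return-any --warn-unreachable.
from typing import cast


def redundant_cast(x: int) -> int:
    # warn-redundant-casts: x is already an int, so this cast is reported.
    return cast(int, x)


def returns_any(config: dict) -> int:
    # warn-return-any: indexing an unparameterised dict yields Any, and
    # returning Any from a function annotated "-> int" is reported.
    return config["epochs"]


def has_dead_code(flag: bool) -> str:
    if flag:
        return "on"
    return "off"
    # warn-unreachable: this statement can never execute and is reported.
    return "never"
```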

@ravinkohli changed the title from "Stricter checks mypy, flake and black" to "Stricter checks mypy" on May 25, 2021
@ravinkohli changed the title from "Stricter checks mypy" to "[ADD] Stricter checks mypy" on May 28, 2021
@nabenabe0928 (Collaborator) left a comment


Thanks for the PR. Let me list some good practices for mypy typing:

  1. Prefer a class over Dict[str, Any]
    Dict[str, Any] gives very little type safety, because mypy cannot check which keys exist or which types their values have (see the sketch after this list).

  2. It is sometimes useful to declare a type alias as a module-level variable, so that inputs that belong together are typed as one named group.
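A minimal sketch of points 1 and 2; the names below are hypothetical and not taken from autoPyTorch:

```python
# Hypothetical sketch of points 1 and 2; none of these names come from autoPyTorch.
from dataclasses import dataclass
from typing import List, Union

import numpy as np
import pandas as pd


# (1) A small class instead of Dict[str, Any]: mypy now knows exactly which
# fields exist and which type each field has.
@dataclass
class TrainerSettings:
    learning_rate: float
    epochs: int
    use_cuda: bool


# (2) A module-level type alias groups related inputs under one name, so every
# signature that accepts "array-like targets" refers to the same type.
TargetLike = Union[np.ndarray, pd.Series, pd.DataFrame, List]


def n_targets(y: TargetLike) -> int:
    return len(y)
```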

For Python >= 3.8:

  1. Generic classes
    Comparable to templates in C++; they also let us narrow what would otherwise be one large Union on a per-file (per-use) basis (see the sketch after this list).

  2. Literal
    Restricts an argument to specific values rather than a whole type.

  3. Final
    For variables that should not be reassigned.

  4. TypedDict
    Defines a typed schema (fixed keys and value types) for a dict.
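A sketch of these Python >= 3.8 features, again with illustrative names rather than actual autoPyTorch code:

```python
# Illustrative sketch of the typing features above; the names are hypothetical.
from typing import Final, Generic, List, Literal, TypedDict, TypeVar

T = TypeVar("T")


# (1) Generic class: roughly comparable to a C++ template.
class Result(Generic[T]):
    def __init__(self, value: T) -> None:
        self.value = value


# (2) Literal: only these exact strings are accepted by the type checker.
TaskType = Literal["tabular_classification", "tabular_regression"]


def build_pipeline(task: TaskType) -> str:
    return f"building a pipeline for {task}"


# (3) Final: mypy reports any reassignment of this constant.
DEFAULT_SEED: Final = 42


# (4) TypedDict: a dict with a fixed set of typed keys.
class HyperparameterUpdate(TypedDict):
    name: str
    value_range: List[float]
    default_value: float
```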

We also need to discuss (with Frank as well) when to drop support for older Python versions. For reference, Python 3.6 reaches its official end of life this December, and Python 3.7 will follow in 2023.

Review threads (resolved):
  autoPyTorch/pipeline/base_pipeline.py (3 threads, 1 outdated)
  autoPyTorch/pipeline/image_classification.py (outdated)
  autoPyTorch/pipeline/tabular_regression.py (outdated)
@codecov bot commented on Jun 11, 2021

Codecov Report

Merging #240 (64ce7e5) into development (1818445) will increase coverage by 0.79%.
The diff coverage is 83.02%.


@@               Coverage Diff               @@
##           development     #240      +/-   ##
===============================================
+ Coverage        80.82%   81.62%   +0.79%     
===============================================
  Files              148      150       +2     
  Lines             8563     8625      +62     
  Branches          1323     1325       +2     
===============================================
+ Hits              6921     7040     +119     
+ Misses            1163     1108      -55     
+ Partials           479      477       -2     
Impacted Files Coverage Δ
autoPyTorch/api/tabular_regression.py 96.87% <ø> (ø)
autoPyTorch/data/base_target_validator.py 97.95% <ø> (-0.05%) ⬇️
autoPyTorch/data/tabular_feature_validator.py 86.30% <ø> (-0.09%) ⬇️
...ts/setup/network_backbone/base_network_backbone.py 88.63% <0.00%> (ø)
...components/setup/network_head/base_network_head.py 96.15% <0.00%> (ø)
...Torch/pipeline/components/training/metrics/base.py 65.67% <ø> (ø)
...PyTorch/pipeline/traditional_tabular_regression.py 25.86% <25.86%> (ø)
...ine/components/setup/network_embedding/__init__.py 64.89% <44.44%> (+0.76%) ⬆️
autoPyTorch/evaluation/abstract_evaluator.py 75.89% <46.87%> (-2.28%) ⬇️
...h/pipeline/components/training/trainer/__init__.py 70.81% <60.00%> (+0.48%) ⬆️
... and 102 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update eda89f4...64ce7e5.

@franchuterivera (Contributor) left a comment


Thanks for the PR. I left some questions. Let me know what you think about them.

@@ -194,8 +194,8 @@ def _check_data(
             A set of features whose dimensionality and data type is going to be checked
         """

-        if not isinstance(y, (np.ndarray, pd.DataFrame, typing.List, pd.Series)) and not \
-                scipy.sparse.issparse(y):  # type: ignore[misc]
+        if not isinstance(y, (np.ndarray, pd.DataFrame,  # type: ignore[misc]
@nabenabe0928 (Collaborator) commented on Jun 15, 2021


You could declare a temporary variable for this tuple of types. The variable scope here is small enough, and it would improve readability.

@ravinkohli (Contributor, Author) replied

We already have a variable for these types, SUPPORTED_TARGET_TYPE, which we can use directly in isinstance checks. However, to check for sparsity we use scipy.sparse.issparse rather than isinstance, so I did not think it was appropriate to introduce another variable just for this one type check.
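For illustration, a hedged sketch of the two options being discussed; apart from scipy.sparse.issparse and the tuple of array-like types from the diff, the helper names here are hypothetical and not the actual autoPyTorch implementation:

```python
# Hedged sketch of the two options in this thread; helper names are hypothetical.
import typing

import numpy as np
import pandas as pd
import scipy.sparse

# Reviewer's suggestion: a short-lived, locally scoped alias for the tuple of
# types. (The original code uses typing.List inside isinstance(), which is what
# triggers mypy's [misc] error and the "# type: ignore" seen in the diff above;
# plain list is used here to keep the sketch mypy-clean.)
_ARRAY_LIKE = (np.ndarray, pd.DataFrame, list, pd.Series)


def _is_supported_target(y: typing.Any) -> bool:
    # isinstance() covers the array-like types, while sparsity needs a separate
    # scipy.sparse.issparse() check; this split is why a single shared constant
    # such as SUPPORTED_TARGET_TYPE cannot express the whole condition on its own.
    return isinstance(y, _ARRAY_LIKE) or scipy.sparse.issparse(y)
```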

@ravinkohli ravinkohli marked this pull request as draft June 16, 2021 10:34
@ravinkohli ravinkohli marked this pull request as ready for review June 16, 2021 10:34
@franchuterivera franchuterivera merged commit 999f3c3 into automl:development Jun 18, 2021
github-actions bot pushed a commit that referenced this pull request Jun 18, 2021