chore: enable RUF ruleset for ruff (#2677)
# Description
This PR proposes to enable the
[`RUF`](https://docs.astral.sh/ruff/rules/#ruff-specific-rules-ruf)
ruleset.

```
ruff check .
```

returned:

```
warning: The top-level linter settings are deprecated in favour of their counterparts in the `lint` section. Please update the following options in `pyproject.toml`:
  - 'ignore' -> 'lint.ignore'
  - 'select' -> 'lint.select'
  - 'isort' -> 'lint.isort'
deltalake/table.py:52:26: RUF100 [*] Unused `noqa` directive (unused: `F811`)
deltalake/writer.py:63:26: RUF100 [*] Unused `noqa` directive (unused: `F811`)
tests/pyspark_integration/test_write_to_pyspark.py:109:37: RUF010 [*] Use explicit conversion flag
Found 3 errors.
[*] 3 fixable with the `--fix` option.
```

These three findings were fixed automatically with `ruff check . --fix`. The deprecated top-level
linter settings are addressed separately in #2673.
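For context, the RUF010 "explicit conversion flag" fix replaces a `str()` call inside an f-string with the equivalent `!s` conversion. A minimal sketch (the path value here is made up for illustration):

```python
from pathlib import Path

tmp_path = Path("/tmp/delta-table")  # hypothetical value for illustration

# Before: explicit str() call inside the f-string, flagged by RUF010
before = f"ALTER TABLE delta.`{str(tmp_path)}` ADD CONSTRAINT x CHECK (c1 > 2)"

# After: ruff's auto-fix rewrites it to the `!s` conversion flag
after = f"ALTER TABLE delta.`{tmp_path!s}` ADD CONSTRAINT x CHECK (c1 > 2)"

# Both render identically; the change is purely stylistic
assert before == after
```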

---------

Co-authored-by: R. Tyler Croy <rtyler@brokenco.de>
fpgmaas and rtyler authored Jul 18, 2024
1 parent 640ee6e commit ab977e3
Showing 4 changed files with 6 additions and 4 deletions.
python/deltalake/table.py (1 addition, 1 deletion)

@@ -49,7 +49,7 @@
 from deltalake.schema import Schema as DeltaSchema
 
 try:
-    import pandas as pd  # noqa: F811
+    import pandas as pd
 except ModuleNotFoundError:
     _has_pandas = False
 else:
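For readers unfamiliar with the surrounding code, the `try`/`except` block touched here is the usual optional-dependency import pattern. A self-contained sketch (names mirror the module; behavior is assumed, not copied from the repo):

```python
# Optional-dependency import pattern, as in deltalake/table.py.
# The `# noqa: F811` removed above suppressed a redefinition warning
# that ruff no longer emits here, which is why RUF100 flagged it as unused.
try:
    import pandas as pd
except ModuleNotFoundError:
    _has_pandas = False  # pandas-dependent code paths are disabled
else:
    _has_pandas = True
```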
python/deltalake/writer.py (1 addition, 1 deletion)

@@ -61,7 +61,7 @@
 )
 
 try:
-    import pandas as pd  # noqa: F811
+    import pandas as pd
 except ModuleNotFoundError:
     _has_pandas = False
 else:
python/pyproject.toml (3 additions, 1 deletion)

@@ -83,7 +83,9 @@ select = [
     # pyflakes
     "F",
     # isort
-    "I"
+    "I",
+    # ruff-specific rules
+    "RUF"
 ]
 ignore = ["E501"]
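The deprecation warning quoted above concerns config layout rather than these rules: newer ruff versions expect the linter settings under a `[tool.ruff.lint]` table (that migration is handled in #2673). A sketch of what the updated section might look like, assuming the rest of the file is unchanged:

```toml
[tool.ruff.lint]
select = [
    # pyflakes
    "F",
    # isort
    "I",
    # ruff-specific rules
    "RUF"
]
ignore = ["E501"]
```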
python/tests/pyspark_integration/test_write_to_pyspark.py (1 addition, 1 deletion)

@@ -106,7 +106,7 @@ def test_checks_min_writer_version(tmp_path: pathlib.Path):
 )
 
 # Add a constraint upgrades the minWriterProtocol
-spark.sql(f"ALTER TABLE delta.`{str(tmp_path)}` ADD CONSTRAINT x CHECK (c1 > 2)")
+spark.sql(f"ALTER TABLE delta.`{tmp_path!s}` ADD CONSTRAINT x CHECK (c1 > 2)")
 
 with pytest.raises(
     DeltaProtocolError, match="This table's min_writer_version is 3, but"
