feat: use ruff formatter #2536

Merged (7 commits) on Nov 11, 2023
3 changes: 1 addition & 2 deletions .devcontainer/devcontainer.json
@@ -29,12 +29,11 @@
"mhutchie.git-graph",
"eamodio.gitlens",
"github.vscode-github-actions",
"ms-python.black-formatter",
"ms-python.mypy-type-checker",
"charliermarsh.ruff"
],
"settings": {
"python.editor.defaultFormatter": "ms-python.black-formatter",
"python.editor.defaultFormatter": "charliemarsh.ruff",
"python.defaultInterpreterPath": "${workspaceFolder}/.venv",
"python.terminal.activateEnvInCurrentTerminal": true,
"python.testing.unittestEnabled": false,
10 changes: 3 additions & 7 deletions .pre-commit-config.yaml
@@ -23,22 +23,18 @@ repos:
- id: unasyncd
additional_dependencies: ["ruff"]
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: "v0.1.2"
rev: "v0.1.4"
hooks:
- id: ruff
args: ["--fix"]
- id: ruff-format
- repo: https://github.com/codespell-project/codespell
rev: v2.2.6
hooks:
- id: codespell
exclude: "tests/openapi/typescript_converter/test_converter|README.md"
additional_dependencies:
- tomli
- repo: https://github.com/psf/black
rev: 23.10.1
hooks:
- id: black
args: [--config=./pyproject.toml]
- repo: https://github.com/asottile/blacken-docs
rev: 1.16.0
hooks:
@@ -55,7 +51,7 @@
exclude: "test*|examples*|tools"
args: ["--use-tuple"]
- repo: https://github.com/ariebovenberg/slotscheck
rev: v0.17.0
rev: v0.17.1
hooks:
- id: slotscheck
exclude: "test_*|docs"
8 changes: 7 additions & 1 deletion Makefile
@@ -80,6 +80,12 @@ mypy: ## Run mypy
@$(ENV_PREFIX)dmypy run
@echo "=> mypy complete"

.PHONY: mypy-nocache
mypy-nocache: ## Run Mypy without cache
@echo "=> Running mypy without a cache"
@$(ENV_PREFIX)dmypy run -- --cache-dir=/dev/null
@echo "=> mypy complete"

.PHONY: pyright
pyright: ## Run pyright
@echo "=> Running pyright"
@@ -90,7 +96,7 @@ pyright: ## Run pyright
type-check: mypy pyright ## Run all type checking

.PHONY: pre-commit
pre-commit: ## Runs pre-commit hooks; includes ruff linting, codespell, black
pre-commit: ## Runs pre-commit hooks; includes ruff linting, codespell
@echo "=> Running pre-commit process"
@$(ENV_PREFIX)pre-commit run --all-files
@echo "=> Pre-commit complete"
14 changes: 7 additions & 7 deletions README.md

Large diffs are not rendered by default.

14 changes: 7 additions & 7 deletions docs/PYPI_README.md

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion docs/examples/parameters/path_parameters_3.py
@@ -29,7 +29,7 @@ def get_product_version(
url="https://mywebsite.com/documentation/product#versions", # type: ignore[arg-type]
),
),
]
],
) -> Version:
return VERSIONS[version]

3 changes: 2 additions & 1 deletion litestar/_asgi/routing_trie/mapping.py
@@ -191,7 +191,8 @@ def build_route_middleware_stack(

# we wrap the route.handle method in the ExceptionHandlerMiddleware
asgi_handler = wrap_in_exception_handler(
app=route.handle, exception_handlers=route_handler.resolve_exception_handlers() # type: ignore[arg-type]
app=route.handle, # type: ignore[arg-type]
exception_handlers=route_handler.resolve_exception_handlers(),
)

if app.csrf_config:
18 changes: 12 additions & 6 deletions litestar/_openapi/responses.py
@@ -226,9 +226,12 @@ def create_error_responses(exceptions: list[type[HTTPException]]) -> Iterator[tu
with contextlib.suppress(Exception):
group_description = HTTPStatus(status_code).description

yield str(status_code), OpenAPIResponse(
description=group_description,
content={MediaType.JSON: OpenAPIMediaType(schema=schema)},
yield (
str(status_code),
OpenAPIResponse(
description=group_description,
content={MediaType.JSON: OpenAPIMediaType(schema=schema)},
),
)


@@ -245,9 +248,12 @@ def create_additional_responses(
schema = schema_creator.for_field_definition(
FieldDefinition.from_annotation(additional_response.data_container)
)
yield str(status_code), OpenAPIResponse(
description=additional_response.description,
content={additional_response.media_type: OpenAPIMediaType(schema=schema)},
yield (
str(status_code),
OpenAPIResponse(
description=additional_response.description,
content={additional_response.media_type: OpenAPIMediaType(schema=schema)},
),
)


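A note on the hunk above: `yield a, b` and `yield (a, b)` both yield a single two-element tuple, so the added parentheses change nothing at runtime; they only give the formatter a place to break the long `OpenAPIResponse(...)` call across lines. A minimal sketch (hypothetical values, not litestar code) showing the equivalence:

```python
from typing import Iterator, Tuple


def implicit() -> Iterator[Tuple[str, int]]:
    # Bare tuple yield: the comma alone builds the tuple.
    yield "500", 1


def explicit() -> Iterator[Tuple[str, int]]:
    # Parenthesized form: the same tuple, but the parentheses let a
    # formatter wrap long element expressions onto their own lines.
    yield (
        "500",
        1,
    )


assert next(implicit()) == next(explicit()) == ("500", 1)
```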
4 changes: 3 additions & 1 deletion litestar/_openapi/schema_generation/constrained_fields.py
@@ -80,7 +80,9 @@ def create_string_constrained_field_schema(
schema.max_length = kwarg_definition.max_length
if kwarg_definition.pattern:
schema.pattern = (
kwarg_definition.pattern.pattern if isinstance(kwarg_definition.pattern, Pattern) else kwarg_definition.pattern # type: ignore[attr-defined,unreachable]
kwarg_definition.pattern.pattern # type: ignore[attr-defined]
if isinstance(kwarg_definition.pattern, Pattern) # type: ignore[unreachable]
else kwarg_definition.pattern
)
if kwarg_definition.lower_case:
schema.description = "must be in lower case"
3 changes: 2 additions & 1 deletion litestar/app.py
@@ -791,7 +791,8 @@ def _create_asgi_handler(self) -> ASGIApp:
asgi_handler = CORSMiddleware(app=asgi_handler, config=self.cors_config)

return wrap_in_exception_handler(
app=asgi_handler, exception_handlers=self.exception_handlers or {} # pyright: ignore
app=asgi_handler,
exception_handlers=self.exception_handlers or {}, # pyright: ignore
)

def _wrap_send(self, send: Send, scope: Scope) -> Send:
12 changes: 9 additions & 3 deletions litestar/connection/request.py
@@ -82,7 +82,9 @@ def content_type(self) -> tuple[str, dict[str, str]]:
A tuple with the parsed value and a dictionary containing any options send in it.
"""
if self._content_type is Empty:
self._content_type = self.scope["_content_type"] = parse_content_header(self.headers.get("Content-Type", "")) # type: ignore[typeddict-unknown-key]
self._content_type = self.scope["_content_type"] = parse_content_header( # type: ignore[typeddict-unknown-key]
self.headers.get("Content-Type", "")
)
return cast("tuple[str, dict[str, str]]", self._content_type)

@property
@@ -104,7 +106,9 @@ async def json(self) -> Any:
"""
if self._json is Empty:
body = await self.body()
self._json = self.scope["_json"] = decode_json(body or b"null", type_decoders=self.route_handler.resolve_type_decoders()) # type: ignore[typeddict-unknown-key]
self._json = self.scope["_json"] = decode_json( # type: ignore[typeddict-unknown-key]
body or b"null", type_decoders=self.route_handler.resolve_type_decoders()
)
return self._json

async def msgpack(self) -> Any:
@@ -115,7 +119,9 @@ async def msgpack(self) -> Any:
"""
if self._msgpack is Empty:
body = await self.body()
self._msgpack = self.scope["_msgpack"] = decode_msgpack(body or b"\xc0", type_decoders=self.route_handler.resolve_type_decoders()) # type: ignore[typeddict-unknown-key]
self._msgpack = self.scope["_msgpack"] = decode_msgpack( # type: ignore[typeddict-unknown-key]
body or b"\xc0", type_decoders=self.route_handler.resolve_type_decoders()
)
return self._msgpack

async def stream(self) -> AsyncGenerator[bytes, None]:
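On the relocated `# type: ignore[typeddict-unknown-key]` comments above: mypy honors an ignore only on the physical line it reports the error for, so once the assignment is wrapped, the comment stays on the opening line that contains the offending `self.scope[...]` key access. A small, self-contained sketch of the pattern, with made-up names:

```python
from typing import Any, TypedDict


class Scope(TypedDict):
    path: str


def parse(raw: bytes) -> Any:
    # Stand-in for decode_json / parse_content_header in the real code.
    return raw


def cache_parsed(scope: Scope, raw: bytes) -> Any:
    # "_parsed" is not declared on the TypedDict, so mypy flags this opening
    # line; the ignore comment therefore rides here rather than on the
    # continuation lines produced by the formatter.
    parsed = scope["_parsed"] = parse(  # type: ignore[typeddict-unknown-key]
        raw or b"null"
    )
    return parsed
```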
3 changes: 2 additions & 1 deletion litestar/contrib/pydantic/pydantic_dto_factory.py
@@ -75,7 +75,8 @@ def generate_field_definitions(
model_fields = dict(model_type.model_fields) # type: ignore[union-attr]
except AttributeError:
model_fields = {
k: model_field.field_info for k, model_field in model_type.__fields__.items() # type: ignore[union-attr]
k: model_field.field_info
for k, model_field in model_type.__fields__.items() # type: ignore[union-attr]
}

for field_name, field_info in model_fields.items():
8 changes: 2 additions & 6 deletions litestar/contrib/pydantic/pydantic_schema_plugin.py
@@ -231,9 +231,7 @@ def to_openapi_schema(self, field_definition: FieldDefinition, schema_creator: S
return PYDANTIC_TYPE_MAP[field_definition.annotation] # pragma: no cover

@classmethod
def for_pydantic_model(
cls, field_definition: FieldDefinition, schema_creator: SchemaCreator
) -> Schema: # pyright: ignore
def for_pydantic_model(cls, field_definition: FieldDefinition, schema_creator: SchemaCreator) -> Schema: # pyright: ignore
"""Create a schema object for a given pydantic model class.

Args:
@@ -265,9 +263,7 @@ def for_pydantic_model(
}

field_definitions = {
f.alias
if f.alias and schema_creator.prefer_alias
else k: FieldDefinition.from_kwarg(
f.alias if f.alias and schema_creator.prefer_alias else k: FieldDefinition.from_kwarg(
annotation=Annotated[annotation_hints[k], f, f.metadata] # type: ignore[union-attr]
if is_v2_model
else Annotated[annotation_hints[k], f], # pyright: ignore
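The collapsed comprehension above works because the key position of a dict comprehension may be any expression, including a conditional, so the alias-or-name choice can sit inline before the colon. A simplified illustration using made-up field objects rather than the pydantic API:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Field:
    name: str
    alias: Optional[str] = None


fields = [Field("user_id", alias="userId"), Field("email")]
prefer_alias = True

# The key expression "f.alias if ... else f.name" is evaluated per item,
# mirroring the alias-or-name choice in the hunk above.
by_key = {f.alias if f.alias and prefer_alias else f.name: f for f in fields}

assert set(by_key) == {"userId", "email"}
```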
8 changes: 5 additions & 3 deletions litestar/dto/dataclass_dto.py
@@ -47,9 +47,11 @@ def generate_field_definitions(
default=default,
)

yield replace(field_defintion, default=Empty, kwarg_definition=default) if isinstance(
default, (KwargDefinition, DependencyKwarg)
) else field_defintion
yield (
replace(field_defintion, default=Empty, kwarg_definition=default)
if isinstance(default, (KwargDefinition, DependencyKwarg))
else field_defintion
)

@classmethod
def detect_nested_field(cls, field_definition: FieldDefinition) -> bool:
4 changes: 1 addition & 3 deletions litestar/handlers/base.py
@@ -148,9 +148,7 @@ def __init__(
self.type_encoders = type_encoders

self.paths = (
{normalize_path(p) for p in path}
if path and isinstance(path, list)
else {normalize_path(path or "/")} # type: ignore
{normalize_path(p) for p in path} if path and isinstance(path, list) else {normalize_path(path or "/")} # type: ignore
)

def __call__(self, fn: AsyncAnyCallable) -> Self:
5 changes: 4 additions & 1 deletion litestar/handlers/http_handlers/_utils.py
@@ -147,7 +147,10 @@ def create_response_handler(
cookie_list = list(cookies)

async def handler(
data: Response, app: Litestar, request: Request, **kwargs: Any # kwargs is for return dto
data: Response,
app: Litestar,
request: Request,
**kwargs: Any, # kwargs is for return dto
) -> ASGIApp:
response = await after_request(data) if after_request else data # type:ignore[arg-type,misc]
return response.to_asgi_response( # type: ignore
12 changes: 9 additions & 3 deletions litestar/handlers/http_handlers/base.py
@@ -367,7 +367,9 @@ def resolve_before_request(self) -> AsyncCallable | None:
"""
if self._resolved_before_request is Empty:
before_request_handlers: list[AsyncCallable] = [
layer.before_request for layer in self.ownership_layers if layer.before_request # type: ignore[misc]
layer.before_request # type: ignore[misc]
for layer in self.ownership_layers
if layer.before_request
]
self._resolved_before_request = before_request_handlers[-1] if before_request_handlers else None
return cast("AsyncCallable | None", self._resolved_before_request)
@@ -383,7 +385,9 @@ def resolve_after_response(self) -> AsyncCallable | None:
"""
if self._resolved_after_response is Empty:
after_response_handlers: list[AsyncCallable] = [
layer.after_response for layer in self.ownership_layers if layer.after_response # type: ignore[misc]
layer.after_response # type: ignore[misc]
for layer in self.ownership_layers
if layer.after_response
]
self._resolved_after_response = after_response_handlers[-1] if after_response_handlers else None

@@ -419,7 +423,9 @@ def get_response_handler(self, is_response_type_data: bool = False) -> Callable[
"""
if self._response_handler_mapping["default_handler"] is Empty:
after_request_handlers: list[AsyncCallable] = [
layer.after_request for layer in self.ownership_layers if layer.after_request # type: ignore[misc]
layer.after_request # type: ignore[misc]
for layer in self.ownership_layers
if layer.after_request
]
after_request = cast(
"AfterRequestHookHandler | None",
6 changes: 5 additions & 1 deletion litestar/serialization/msgspec_hooks.py
@@ -232,7 +232,11 @@ def decode_msgpack(value: bytes, target_type: type[T], type_decoders: TypeDecode
...


def decode_msgpack(value: bytes, target_type: type[T] | EmptyType = Empty, type_decoders: TypeDecodersSequence | None = None) -> Any: # type: ignore[misc]
def decode_msgpack( # type: ignore[misc]
value: bytes,
target_type: type[T] | EmptyType = Empty, # pyright: ignore
type_decoders: TypeDecodersSequence | None = None,
) -> Any:
"""Decode a MessagePack string/bytes into an object.

Args:
4 changes: 3 additions & 1 deletion litestar/testing/request_factory.py
@@ -291,7 +291,9 @@ def _create_request_with_data(
if request_media_type == RequestEncodingType.JSON:
encoding_headers, stream = httpx_encode_json(data)
elif request_media_type == RequestEncodingType.MULTI_PART:
encoding_headers, stream = encode_multipart_data(cast("dict[str, Any]", data), files=files or [], boundary=None) # type: ignore[assignment]
encoding_headers, stream = encode_multipart_data( # type: ignore[assignment]
cast("dict[str, Any]", data), files=files or [], boundary=None
)
else:
encoding_headers, stream = encode_urlencoded_data(decode_json(value=encode_json(data)))
headers.update(encoding_headers)
8 changes: 2 additions & 6 deletions litestar/testing/transport.py
@@ -83,18 +83,14 @@ async def receive() -> ReceiveMessage:
def create_send(request: Request, context: SendReceiveContext) -> Send:
async def send(message: Message) -> None:
if message["type"] == "http.response.start":
assert not context[ # noqa: S101
"response_started"
], 'Received multiple "http.response.start" messages.'
assert not context["response_started"], 'Received multiple "http.response.start" messages.' # noqa: S101
context["raw_kwargs"]["status_code"] = message["status"]
context["raw_kwargs"]["headers"] = [
(k.decode("utf-8"), v.decode("utf-8")) for k, v in message.get("headers", [])
]
context["response_started"] = True
elif message["type"] == "http.response.body":
assert context[ # noqa: S101
"response_started"
], 'Received "http.response.body" without "http.response.start".'
assert context["response_started"], 'Received "http.response.body" without "http.response.start".' # noqa: S101
assert not context[ # noqa: S101
"response_complete"
].is_set(), 'Received "http.response.body" after response completed.'
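The `# noqa: S101` markers follow the same rule as the type-ignore comments earlier in this diff: a noqa suppression applies only to the physical line it sits on, so when the formatter joins a wrapped assert back onto a single line, the comment moves with it. A tiny illustrative sketch:

```python
def check_started(response_started: bool) -> None:
    # Before formatting, this assert was split over three lines with the noqa
    # on the first of them; on one line, the statement and its suppression
    # share the same physical line.
    assert response_started, 'Received "http.response.body" without "http.response.start".'  # noqa: S101
```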
4 changes: 1 addition & 3 deletions litestar/types/serialization.py
@@ -56,6 +56,4 @@
)
EncodableMsgSpecType: TypeAlias = "Ext | Raw | Struct"
LitestarEncodableType: TypeAlias = "EncodableBuiltinType | EncodableBuiltinCollectionType | EncodableStdLibType | EncodableStdLibIPType | EncodableMsgSpecType | BaseModel | AttrsInstance" # pyright: ignore
DataContainerType: TypeAlias = (
"Struct | BaseModel | AttrsInstance | TypedDictClass | DataclassProtocol" # pyright: ignore
)
DataContainerType: TypeAlias = "Struct | BaseModel | AttrsInstance | TypedDictClass | DataclassProtocol" # pyright: ignore
3 changes: 1 addition & 2 deletions litestar/utils/signature.py
@@ -43,8 +43,7 @@ def _get_defaults(_: Any) -> Any:
for namespace, export in chain(
tuple(getmembers(types)), tuple(getmembers(connection)), tuple(getmembers(datastructures))
)
if namespace[0].isupper()
and namespace in chain(types.__all__, connection.__all__, datastructures.__all__) # pyright: ignore
if namespace[0].isupper() and namespace in chain(types.__all__, connection.__all__, datastructures.__all__) # pyright: ignore
}
"""A mapping of names used for handler signature forward-ref resolution.

14 changes: 7 additions & 7 deletions pyproject.toml
@@ -120,7 +120,6 @@ docs = [
]
linting = [
"ruff",
"black",
"mypy",
"pre-commit",
"slotscheck",
@@ -172,10 +171,6 @@ docs-serve = "sphinx-autobuild docs docs/_build/ -j auto --watch litestar --watc
lint = "pre-commit run --all-files"
test = "pytest tests docs/examples"

[tool.black]
include = '\.pyi?$'
line-length = 120

[tool.codespell]
ignore-words-list = "selectin"
skip = 'pdm.lock,docs/examples/contrib/sqlalchemy/us_state_lookup.json'
@@ -297,7 +292,7 @@ select = [
"G", # flake8-logging-format
"I", # isort
"ICN", # flake8-import-conventions
"ISC", # flake8-implicit-str-concat
"ISC", # flake8-implicit-str-concat # Ruff Formatter conflicts with this
"N", # pep8-naming
"PIE", # flake8-pie
"PLC", # pylint - convention
@@ -332,9 +327,10 @@ ignore = [
"D202", # pydocstyle - no blank lines allowed after function docstring
"D205", # pydocstyle - 1 blank line required between summary line and description
"D415", # pydocstyle - first line should end with a period, question mark, or exclamation point
"E501", # pycodestyle line too long, handled by black
"E501", # pycodestyle line too long, handled by ruff-fmt
"PLW2901", # pylint - for loop variable overwritten by assignment target
"RUF012", # Ruff-specific rule - annotated with classvar
"ISC001", # flake8-implicit-str-concat - implicit string concatenation - ruff formatter has issue with this
]
line-length = 120
src = ["litestar", "tests", "docs/examples"]
@@ -359,6 +355,10 @@ classmethod-decorators = [
[tool.ruff.isort]
known-first-party = ["litestar", "tests", "examples"]

[tool.ruff.format]
quote-style = "double"
indent-style = "space"

[tool.ruff.per-file-ignores]
"docs/**/*.*" = ["S", "B", "DTZ", "A", "TCH", "ERA", "D", "RET"]
"docs/examples/**" = ["T201"]
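For context on the newly ignored ISC001 rule: it flags implicit string concatenation written on a single line, and ruff warns that the rule can conflict with its formatter, which matches the comment added in the hunk above and explains why ISC001 moves to the ignore list while the broader ISC group stays selected. A short, illustrative example of the pattern the rule targets:

```python
# Adjacent string literals are concatenated at compile time. ISC001 flags the
# single-line form, which is easy to write by accident:
TAGLINE = "Litestar is an ASGI framework " "for building performant APIs."

# The parenthesized multi-line form is the usual way to wrap a long string;
# ruff reports ISC001 as potentially conflicting with its formatter's handling
# of such concatenations, hence the ignore entry in the config above.
DESCRIPTION = (
    "Litestar is an ASGI framework "
    "for building performant APIs."
)

assert TAGLINE == DESCRIPTION
```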