
Ruff 2024.2 style #9639

Merged
1 commit merged into main from ruff-2024-style on Feb 29, 2024

Conversation

@MichaReiser (Member) commented Jan 25, 2024

Summary

This PR promotes Ruff's preview formatter styles to stable (a brief before/after sketch of two of them follows the list below):

  • fix_power_op_line_length
  • prefer_splitting_right_hand_side_of_assignments
  • parenthesize_long_type_hints
  • no_blank_line_before_class_docstring
  • wrap_multiple_context_managers_in_parens
  • blank_line_after_nested_stub_class
  • module_docstring_newlines
  • dummy_implementations
  • hex_codes_in_unicode_sequences
  • multiline_string_handling
  • format_module_docstring
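
For illustration, a minimal before/after sketch of two of these styles, based on the ecosystem diffs further down in this thread (the class and variable names are hypothetical):

```python
from typing import Any


# dummy_implementations: a body consisting only of `...` now hugs the header
# line. What the previous stable style wrote as
#
#     def __getattr__(self, item: str) -> Any:
#         ...
#
# is now formatted as a single line (hypothetical Proxy class):
class Proxy:
    def __getattr__(self, item: str) -> Any: ...


# prefer_splitting_right_hand_side_of_assignments: when an assignment does not
# fit on one line, the previous style split the subscript on the left,
#
#     settings[
#         "base_backup"
#     ] = f"S3('{path}', '{key}', '{secret}')"
#
# while the new style parenthesizes the right-hand side instead:
path, key, secret = "s3://bucket/backup", "access-key", "secret-key"
settings: dict[str, str] = {}
settings["base_backup"] = (
    f"S3('{path}', '{key}', '{secret}')"
)
```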

Differences from Black:

  • Black didn't stabilize multiline-string
  • Black didn't stabilize is_hug_parens_with_braces_and_square_brackets
  • Black didn't stabilize hex_codes_in_unicode_sequences

#8678

Test Plan

@MichaReiser added the formatter label (Related to the formatter) on Jan 25, 2024

github-actions bot commented Jan 25, 2024

ruff-ecosystem results

Linter (stable)

ℹ️ ecosystem check encountered linter errors. (no lint changes; 1 project error)

indico/indico (error)

ruff failed
  Cause: Rule `S410` was removed and cannot be selected.

Linter (preview)

ℹ️ ecosystem check encountered linter errors. (no lint changes; 1 project error)

indico/indico (error)

ruff check --no-cache --exit-zero --ignore RUF9 --output-format concise --preview

ruff failed
  Cause: Rule `S410` was removed and cannot be selected.

Formatter (stable)

ℹ️ ecosystem check detected format changes. (+2653 -3676 lines in 1013 files in 27 projects; 2 project errors; 14 projects unchanged)

DisnakeDev/disnake (+74 -144 lines across 6 files)

disnake/audit_logs.py~L302

 
     if TYPE_CHECKING:
 
-        def __getattr__(self, item: str) -> Any:
-            ...
+        def __getattr__(self, item: str) -> Any: ...
 
-        def __setattr__(self, key: str, value: Any) -> Any:
-            ...
+        def __setattr__(self, key: str, value: Any) -> Any: ...
 
 
 Transformer = Callable[["AuditLogEntry", Any], Any]

disnake/channel.py~L329

         category: Optional[Snowflake] = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     # only passing `sync_permissions` may or may not return a channel,
     # depending on whether the channel is in a category

disnake/channel.py~L340

         *,
         sync_permissions: bool,
         reason: Optional[str] = ...,
-    ) -> Optional[TextChannel]:
-        ...
+    ) -> Optional[TextChannel]: ...
 
     @overload
     async def edit(

disnake/channel.py~L360

         overwrites: Mapping[Union[Role, Member], PermissionOverwrite] = ...,
         flags: ChannelFlags = ...,
         reason: Optional[str] = ...,
-    ) -> TextChannel:
-        ...
+    ) -> TextChannel: ...
 
     async def edit(
         self,

disnake/channel.py~L926

         auto_archive_duration: Optional[AnyThreadArchiveDuration] = None,
         slowmode_delay: Optional[int] = None,
         reason: Optional[str] = None,
-    ) -> Thread:
-        ...
+    ) -> Thread: ...
 
     @overload
     async def create_thread(

disnake/channel.py~L939

         invitable: Optional[bool] = None,
         slowmode_delay: Optional[int] = None,
         reason: Optional[str] = None,
-    ) -> Thread:
-        ...
+    ) -> Thread: ...
 
     async def create_thread(
         self,

disnake/channel.py~L1484

         category: Optional[Snowflake] = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     # only passing `sync_permissions` may or may not return a channel,
     # depending on whether the channel is in a category

disnake/channel.py~L1495

         *,
         sync_permissions: bool,
         reason: Optional[str] = ...,
-    ) -> Optional[VoiceChannel]:
-        ...
+    ) -> Optional[VoiceChannel]: ...
 
     @overload
     async def edit(

disnake/channel.py~L1515

         slowmode_delay: Optional[int] = ...,
         flags: ChannelFlags = ...,
         reason: Optional[str] = ...,
-    ) -> VoiceChannel:
-        ...
+    ) -> VoiceChannel: ...
 
     async def edit(
         self,

disnake/channel.py~L2301

         category: Optional[Snowflake] = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     # only passing `sync_permissions` may or may not return a channel,
     # depending on whether the channel is in a category

disnake/channel.py~L2312

         *,
         sync_permissions: bool,
         reason: Optional[str] = ...,
-    ) -> Optional[StageChannel]:
-        ...
+    ) -> Optional[StageChannel]: ...
 
     @overload
     async def edit(

disnake/channel.py~L2332

         slowmode_delay: Optional[int] = ...,
         flags: ChannelFlags = ...,
         reason: Optional[str] = ...,
-    ) -> StageChannel:
-        ...
+    ) -> StageChannel: ...
 
     async def edit(
         self,

disnake/channel.py~L2863

         *,
         position: int,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     @overload
     async def edit(

disnake/channel.py~L2876

         overwrites: Mapping[Union[Role, Member], PermissionOverwrite] = ...,
         flags: ChannelFlags = ...,
         reason: Optional[str] = ...,
-    ) -> CategoryChannel:
-        ...
+    ) -> CategoryChannel: ...
 
     async def edit(
         self,

disnake/channel.py~L2963

         offset: int = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     @overload
     async def move(

disnake/channel.py~L2974

         offset: int = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     @overload
     async def move(

disnake/channel.py~L2985

         offset: int = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     @overload
     async def move(

disnake/channel.py~L2996

         offset: int = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     @utils.copy_doc(disnake.abc.GuildChannel.move)
     async def move(self, **kwargs) -> None:

disnake/channel.py~L3406

         view: View = ...,
         components: Components = ...,
         reason: Optional[str] = None,
-    ) -> ThreadWithMessage:
-        ...
+    ) -> ThreadWithMessage: ...
 
     @overload
     async def create_thread(

disnake/channel.py~L3427

         view: View = ...,
         components: Components = ...,
         reason: Optional[str] = None,
-    ) -> ThreadWithMessage:
-        ...
+    ) -> ThreadWithMessage: ...
 
     @overload
     async def create_thread(

disnake/channel.py~L3448

         view: View = ...,
         components: Components = ...,
         reason: Optional[str] = None,
-    ) -> ThreadWithMessage:
-        ...
+    ) -> ThreadWithMessage: ...
 
     @overload
     async def create_thread(

disnake/channel.py~L3469

         view: View = ...,
         components: Components = ...,
         reason: Optional[str] = None,
-    ) -> ThreadWithMessage:
-        ...
+    ) -> ThreadWithMessage: ...
 
     async def create_thread(
         self,

disnake/channel.py~L3901

         category: Optional[Snowflake] = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     # only passing `sync_permissions` may or may not return a channel,
     # depending on whether the channel is in a category

disnake/channel.py~L3912

         *,
         sync_permissions: bool,
         reason: Optional[str] = ...,
-    ) -> Optional[ForumChannel]:
-        ...
+    ) -> Optional[ForumChannel]: ...
 
     @overload
     async def edit(

disnake/channel.py~L3936

         default_sort_order: Optional[ThreadSortOrder] = ...,
         default_layout: ThreadLayout = ...,
         reason: Optional[str] = ...,
-    ) -> ForumChannel:
-        ...
+    ) -> ForumChannel: ...
 
     async def edit(
         self,

disnake/channel.py~L4331

         category: Optional[Snowflake] = ...,
         sync_permissions: bool = ...,
         reason: Optional[str] = ...,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     # only passing `sync_permissions` may or may not return a channel,
     # depending on whether the channel is in a category

disnake/channel.py~L4342

         *,
         sync_permissions: bool,
         reason: Optional[str] = ...,
-    ) -> Optional[MediaChannel]:
-        ...
+    ) -> Optional[MediaChannel]: ...
 
     @overload
     async def edit(

disnake/channel.py~L4365

         default_reaction: Optional[Union[str, Emoji, PartialEmoji]] = ...,
         default_sort_order: Optional[ThreadSortOrder] = ...,
         reason: Optional[str] = ...,
-    ) -> MediaChannel:
-        ...
+    ) -> MediaChannel: ...
 
     async def edit(
         self,

disnake/ext/commands/core.py~L1265

         cls: Type[CommandT],
         *args: Any,
         **kwargs: Any,
-    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], CommandT]:
-        ...
+    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], CommandT]: ...
 
     @overload
     def command(

disnake/ext/commands/core.py~L1275

         *args: Any,
         cls: Type[CommandT],
         **kwargs: Any,
-    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], CommandT]:
-        ...
+    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], CommandT]: ...
 
     @overload
     def command(

disnake/ext/commands/core.py~L1284

         name: str = ...,
         *args: Any,
         **kwargs: Any,
-    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], Command[CogT, P, T]]:
-        ...
+    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], Command[CogT, P, T]]: ...
 
     def command(
         self,

disnake/ext/commands/core.py~L1318

         cls: Type[GroupT],
         *args: Any,
         **kwargs: Any,
-    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], GroupT]:
-        ...
+    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], GroupT]: ...
 
     @overload
     def group(

disnake/ext/commands/core.py~L1328

         *args: Any,
         cls: Type[GroupT],
         **kwargs: Any,
-    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], GroupT]:
-        ...
+    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], GroupT]: ...
 
     @overload
     def group(

disnake/ext/commands/core.py~L1337

         name: str = ...,
         *args: Any,
         **kwargs: Any,
-    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], Group[CogT, P, T]]:
-        ...
+    ) -> Callable[[CommandCallback[CogT, ContextT, P, T]], Group[CogT, P, T]]: ...
 
     def group(
         self,

disnake/ext/commands/core.py~L1485

         @overload
         def __call__(
             self, func: Callable[Concatenate[ContextT, P], Coro[T]]
-        ) -> Command[None, P, T]:
-            ...
+        ) -> Command[None, P, T]: ...
 
         @overload
         def __call__(
             self, func: Callable[Concatenate[CogT, ContextT, P], Coro[T]]
-        ) -> Command[CogT, P, T]:
-            ...
+        ) -> Command[CogT, P, T]: ...
 
     class GroupDecorator(Protocol):
         @overload
-        def __call__(self, func: Callable[Concatenate[ContextT, P], Coro[T]]) -> Group[None, P, T]:
-            ...
+        def __call__(
+            self, func: Callable[Concatenate[ContextT, P], Coro[T]]
+        ) -> Group[None, P, T]: ...
 
         @overload
         def __call__(
             self, func: Callable[Concatenate[CogT, ContextT, P], Coro[T]]
-        ) -> Group[CogT, P, T]:
-            ...
+        ) -> Group[CogT, P, T]: ...
 
 
 # Small explanation regarding these overloads:

disnake/ext/commands/core.py~L1520

     name: str,
     cls: Type[CommandT],
     **attrs: Any,
-) -> Callable[[CommandCallback[CogT, ContextT, P, T]], CommandT]:
-    ...
+) -> Callable[[CommandCallback[CogT, ContextT, P, T]], CommandT]: ...
 
 
 @overload

disnake/ext/commands/core.py~L1530

     *,
     cls: Type[CommandT],
     **attrs: Any,
-) -> Callable[[CommandCallback[CogT, ContextT, P, T]], CommandT]:
-    ...
+) -> Callable[[CommandCallback[CogT, ContextT, P, T]], CommandT]: ...
 
 
 @overload
 def command(
     name: str = ...,
     **attrs: Any,
-) -> CommandDecorator:
-    ...
+) -> CommandDecorator: ...
 
 
 def command(

disnake/ext/commands/core.py~L1592

     name: str,
     cls: Type[GroupT],
     **attrs: Any,
-) -> Callable[[CommandCallback[CogT, ContextT, P, T]], GroupT]:
-    ...
+) -> Callable[[CommandCallback[CogT, ContextT, P, T]], GroupT]: ...
 
 
 @overload

disnake/ext/commands/core.py~L1602

     *,
     cls: Type[GroupT],
     **attrs: Any,
-) -> Callable[[CommandCallback[CogT, ContextT, P, T]], GroupT]:
-    ...
+) -> Callable[[CommandCallback[CogT, ContextT, P, T]], GroupT]: ...
 
 
 @overload
 def group(
     name: str = ...,
     **attrs: Any,
-) -> GroupDecorator:
-    ...
+) -> GroupDecorator: ...
 
 
 def group(

disnake/ext/commands/core.py~L2051

     view_channel: bool = ...,
     view_creator_monetization_analytics: bool = ...,
     view_guild_insights: bool = ...,
-) -> Callable[[T], T]:
-    ...
+) -> Callable[[T], T]: ...
 
 
 @overload
 @_generated
-def has_permissions() -> Callable[[T], T]:
-    ...
+def has_permissions() -> Callable[[T], T]: ...
 
 
 @_overload_with_permissions

disnake/ext/commands/core.py~L2175

     view_channel: bool = ...,
     view_creator_monetization_analytics: bool = ...,
     view_guild_insights: bool = ...,
-) -> Callable[[T], T]:
-    ...
+) -> Callable[[T], T]: ...
 
 
 @overload
 @_generated
-def bot_has_permissions() -> Callable[[T], T]:
-    ...
+def bot_has_permissions() -> Callable[[T], T]: ...
 
 
 @_overload_with_permissions

disnake/ext/commands/core.py~L2277

     view_channel: bool = ...,
     view_creator_monetization_analytics: bool = ...,
     view_guild_insights: bool = ...,
-) -> Callable[[T], T]:
-    ...
+) -> Callable[[T], T]: ...
 
 
 @overload
 @_generated
-def has_guild_permissions() -> Callable[[T], T]:
-    ...
+def has_guild_permissions() -> Callable[[T], T]: ...
 
 
 @_overload_with_permissions

disnake/ext/commands/core.py~L2376

     view_channel: bool = ...,
     view_creator_monetization_analytics: bool = ...,
     view_guild_insights: bool = ...,
-) -> Callable[[T], T]:
-    ...
+) -> Callable[[T], T]: ...
 
 
 @overload
 @_generated
-def bot_has_guild_permissions() -> Callable[[T], T]:
-    ...
+def bot_has_guild_permissions() -> Callable[[T], T]: ...
 
 
 @_overload_with_permissions

disnake/guild.py~L2473

         description: str = ...,
         image: AssetBytes = ...,
         reason: Optional[str] = ...,
-    ) -> GuildScheduledEvent:
-        ...
+    ) -> GuildScheduledEvent: ...
 
     @overload
     async def create_scheduled_event(

disnake/guild.py~L2492

         description: str = ...,
         image: AssetBytes = ...,
         reason: Optional[str] = ...,
-    ) -> GuildScheduledEvent:
-        ...
+    ) -> GuildScheduledEvent: ...
 
     @overload
     async def create_scheduled_event(

disnake/guild.py~L2509

         description: str = ...,
         image: AssetBytes = ...,
         reason: Optional[str] = ...,
-    ) -> GuildScheduledEvent:
-        ...
+    ) -> GuildScheduledEvent: ...
 
     async def create_scheduled_event(
         self,

disnake/guild.py~L3605

     @overload
     async def get_or_fetch_member(
         self, member_id: int, *, strict: Literal[False] = ...
-    ) -> Optional[Member]:
-        ...
+    ) -> Optional[Member]: ...
 
     @overload
-    async def get_or_fetch_member(self, member_id: int, *, strict: Literal[True]) -> Member:
-        ...
+    async def get_or_fetch_member(self, member_id: int, *, strict: Literal[True]) -> Member: ...
 
     async def get_or_fetch_member(
         self, member_id: int, *, strict: bool = False

disnake/guild.py~L3665

         icon: AssetBytes = ...,
         emoji: str = ...,
         mentionable: bool = ...,
-    ) -> Role:
-        ...
+    ) -> Role: ...
 
     @overload
     async def create_role(

disnake/guild.py~L3680

         icon: AssetBytes = ...,
         emoji: str = ...,
         mentionable: bool = ...,
-    ) -> Role:
-        ...
+    ) -> Role: ...
 
     async def create_role(
         self,

disnake/guild.py~L3887

         *,
         clean_history_duration: Union[int, datetime.timedelta] = 86400,
         reason: Optional[str] = None,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     @overload
     async def ban(

disnake/guild.py~L3897

         *,
         delete_message_days: Literal[0, 1, 2, 3, 4, 5, 6, 7] = 1,
         reason: Optional[str] = None,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     async def ban(
         self,

disnake/guild.py~L4628

         *,
         duration: Optional[Union[float, datetime.timedelta]],
         reason: Optional[str] = None,
-    ) -> Member:
-        ...
+    ) -> Member: ...
 
     @overload
     async def timeout(

disnake/guild.py~L4638

         *,
         until: Optional[datetime.datetime],
         reason: Optional[str] = None,
-    ) -> Member:
-        ...
+    ) -> Member: ...
 
     async def timeout(
         self,

disnake/member.py~L301

         data: Union[MemberWithUserPayload, GuildMemberUpdateEvent],
         guild: Guild,
         state: ConnectionState,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     @overload
     def __init__(

disnake/member.py~L312

         guild: Guild,
         state: ConnectionState,
         user_data: UserPayload,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     def __init__(
         self,

disnake/member.py~L725

         *,
         clean_history_duration: Union[int, datetime.timedelta] = 86400,
         reason: Optional[str] = None,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     @overload
     async def ban(

disnake/member.py~L734

         *,
         delete_message_days: Literal[0, 1, 2, 3, 4, 5, 6, 7] = 1,
         reason: Optional[str] = None,
-    ) -> None:
-        ...
+    ) -> None: ...
 
     async def ban(
         self,

disnake/member.py~L1105

         *,
         duration: Optional[Union[float, datetime.timedelta]],
         reason: Optional[str] = None,
-    ) -> Member:
-        ...
+    ) -> Member: ...
 
     @overload
     async def timeout(

disnake/member.py~L1114

         *,
         until: Optional[datetime.datetime],
         reason: Optional[str] = None,
-    ) -> Member:
-        ...
+    ) -> Member: ...
 
     async def timeout(
         self,

disnake/state.py~L1376

     @overload
     async def chunk_guild(
         self, guild: Guild, *, wait: Literal[False], cache: Optional[bool] = None
-    ) -> asyncio.Future[List[Member]]:
-        ...
+    ) -> asyncio.Future[List[Member]]: ...
 
     @overload
     async def chunk_guild(
         self, guild: Guild, *, wait: Literal[True] = True, cache: Optional[bool] = None
-    ) -> List[Member]:
-        ...
+    ) -> List[Member]: ...
 
     async def chunk_guild(
         self, guild: Guild, *, wait: bool = True, cache: Optional[bool] = None

PostHog/HouseWatch (+9 -9 lines across 1 file)

housewatch/clickhouse/backups.py~L38

     for shard, node in nodes:
         params["shard"] = shard
         if base_backup:
-            query_settings[
-                "base_backup"
-            ] = f"S3('{base_backup}/{shard}', '{aws_key}', '{aws_secret}')"
+            query_settings["base_backup"] = (
+                f"S3('{base_backup}/{shard}', '{aws_key}', '{aws_secret}')"
+            )
         final_query = query % (params or {}) if substitute_params else query
         client = Client(
             host=node["host_address"],

housewatch/clickhouse/backups.py~L129

     TO S3('https://%(bucket)s.s3.amazonaws.com/%(path)s', '%(aws_key)s', '%(aws_secret)s')
     ASYNC"""
     if base_backup:
-        query_settings[
-            "base_backup"
-        ] = f"S3('{base_backup}', '{aws_key}', '{aws_secret}')"
+        query_settings["base_backup"] = (
+            f"S3('{base_backup}', '{aws_key}', '{aws_secret}')"
+        )
     return run_query(
         QUERY,
         {

housewatch/clickhouse/backups.py~L184

                 TO S3('https://%(bucket)s.s3.amazonaws.com/%(path)s', '%(aws_key)s', '%(aws_secret)s')
                 ASYNC"""
     if base_backup:
-        query_settings[
-            "base_backup"
-        ] = f"S3('{base_backup}', '{aws_key}', '{aws_secret}')"
+        query_settings["base_backup"] = (
+            f"S3('{base_backup}', '{aws_key}', '{aws_secret}')"
+        )
     return run_query(
         QUERY,
         {

RasaHQ/rasa (+85 -88 lines across 15 files)

rasa/core/channels/console.py~L103

 
 
 @overload
-async def _get_user_input(previous_response: None) -> Text:
-    ...
+async def _get_user_input(previous_response: None) -> Text: ...
 
 
 @overload
-async def _get_user_input(previous_response: Dict[str, Any]) -> Optional[Text]:
-    ...
+async def _get_user_input(previous_response: Dict[str, Any]) -> Optional[Text]: ...
 
 
 async def _get_user_input(

rasa/core/policies/rule_policy.py~L774

         trackers_as_actions = rule_trackers_as_actions + story_trackers_as_actions
 
         # negative rules are not anti-rules, they are auxiliary to actual rules
-        self.lookup[
-            RULES_FOR_LOOP_UNHAPPY_PATH
-        ] = self._create_loop_unhappy_lookup_from_states(
-            trackers_as_states, trackers_as_actions
+        self.lookup[RULES_FOR_LOOP_UNHAPPY_PATH] = (
+            self._create_loop_unhappy_lookup_from_states(
+                trackers_as_states, trackers_as_actions
+            )
         )
 
     def train(

rasa/core/policies/ted_policy.py~L1266

             )
             self._prepare_encoding_layers(name)
 
-        self._tf_layers[
-            f"transformer.{DIALOGUE}"
-        ] = rasa_layers.prepare_transformer_layer(
-            attribute_name=DIALOGUE,
-            config=self.config,
-            num_layers=self.config[NUM_TRANSFORMER_LAYERS][DIALOGUE],
-            units=self.config[TRANSFORMER_SIZE][DIALOGUE],
-            drop_rate=self.config[DROP_RATE_DIALOGUE],
-            # use bidirectional transformer, because
-            # we will invert dialogue sequence so that the last turn is located
-            # at the first position and would always have
-            # exactly the same positional encoding
-            unidirectional=not self.max_history_featurizer_is_used,
+        self._tf_layers[f"transformer.{DIALOGUE}"] = (
+            rasa_layers.prepare_transformer_layer(
+                attribute_name=DIALOGUE,
+                config=self.config,
+                num_layers=self.config[NUM_TRANSFORMER_LAYERS][DIALOGUE],
+                units=self.config[TRANSFORMER_SIZE][DIALOGUE],
+                drop_rate=self.config[DROP_RATE_DIALOGUE],
+                # use bidirectional transformer, because
+                # we will invert dialogue sequence so that the last turn is located
+                # at the first position and would always have
+                # exactly the same positional encoding
+                unidirectional=not self.max_history_featurizer_is_used,
+            )
         )
 
         self._prepare_label_classification_layers(DIALOGUE)

rasa/core/policies/ted_policy.py~L1308

         # Attributes with sequence-level features also have sentence-level features,
         # all these need to be combined and further processed.
         if attribute_name in SEQUENCE_FEATURES_TO_ENCODE:
-            self._tf_layers[
-                f"sequence_layer.{attribute_name}"
-            ] = rasa_layers.RasaSequenceLayer(
-                attribute_name, attribute_signature, config_to_use
+            self._tf_layers[f"sequence_layer.{attribute_name}"] = (
+                rasa_layers.RasaSequenceLayer(
+                    attribute_name, attribute_signature, config_to_use
+                )
             )
         # Attributes without sequence-level features require some actual feature
         # processing only if they have sentence-level features. Attributes with no
         # sequence- and sentence-level features (dialogue, entity_tags, label) are
         # skipped here.
         elif SENTENCE in attribute_signature:
-            self._tf_layers[
-                f"sparse_dense_concat_layer.{attribute_name}"
-            ] = rasa_layers.ConcatenateSparseDenseFeatures(
-                attribute=attribute_name,
-                feature_type=SENTENCE,
-                feature_type_signature=attribute_signature[SENTENCE],
-                config=config_to_use,
+            self._tf_layers[f"sparse_dense_concat_layer.{attribute_name}"] = (
+                rasa_layers.ConcatenateSparseDenseFeatures(
+                    attribute=attribute_name,
+                    feature_type=SENTENCE,
+                    feature_type_signature=attribute_signature[SENTENCE],
+                    config=config_to_use,
+                )
             )
 
     def _prepare_encoding_layers(self, name: Text) -> None:

rasa/engine/recipes/default_recipe.py~L150

             else:
                 unique_types = set(component_types)
 
-            cls._registered_components[
-                registered_class.__name__
-            ] = cls.RegisteredComponent(
-                registered_class, unique_types, is_trainable, model_from
+            cls._registered_components[registered_class.__name__] = (
+                cls.RegisteredComponent(
+                    registered_class, unique_types, is_trainable, model_from
+                )
             )
             return registered_class
 

rasa/graph_components/validators/default_recipe_validator.py~L291

         Both of these look for the same entities based on the same training data
         leading to ambiguity in the results.
         """
-        extractors_in_configuration: Set[
-            Type[GraphComponent]
-        ] = self._component_types.intersection(TRAINABLE_EXTRACTORS)
+        extractors_in_configuration: Set[Type[GraphComponent]] = (
+            self._component_types.intersection(TRAINABLE_EXTRACTORS)
+        )
         if len(extractors_in_configuration) > 1:
             rasa.shared.utils.io.raise_warning(
                 f"You have defined multiple entity extractors that do the same job "

rasa/nlu/classifiers/diet_classifier.py~L1446

         # everything using a transformer and optionally also do masked language
         # modeling.
         self.text_name = TEXT
-        self._tf_layers[
-            f"sequence_layer.{self.text_name}"
-        ] = rasa_layers.RasaSequenceLayer(
-            self.text_name, self.data_signature[self.text_name], self.config
+        self._tf_layers[f"sequence_layer.{self.text_name}"] = (
+            rasa_layers.RasaSequenceLayer(
+                self.text_name, self.data_signature[self.text_name], self.config
+            )
         )
         if self.config[MASKED_LM]:
             self._prepare_mask_lm_loss(self.text_name)

rasa/nlu/classifiers/diet_classifier.py~L1467

                 {SPARSE_INPUT_DROPOUT: False, DENSE_INPUT_DROPOUT: False}
             )
 
-            self._tf_layers[
-                f"feature_combining_layer.{self.label_name}"
-            ] = rasa_layers.RasaFeatureCombiningLayer(
-                self.label_name, self.label_signature[self.label_name], label_config
+            self._tf_layers[f"feature_combining_layer.{self.label_name}"] = (
+                rasa_layers.RasaFeatureCombiningLayer(
+                    self.label_name, self.label_signature[self.label_name], label_config
+                )
             )
 
             self._prepare_ffnn_layer(

rasa/nlu/featurizers/sparse_featurizer/lexical_syntactic_featurizer.py~L338

 
                 token = tokens[absolute_position]
                 for feature_name in self._feature_config[window_position]:
-                    token_features[
-                        (window_position, feature_name)
-                    ] = self._extract_raw_features_from_token(
-                        token=token,
-                        feature_name=feature_name,
-                        token_position=absolute_position,
-                        num_tokens=len(tokens),
+                    token_features[(window_position, feature_name)] = (
+                        self._extract_raw_features_from_token(
+                            token=token,
+                            feature_name=feature_name,
+                            token_position=absolute_position,
+                            num_tokens=len(tokens),
+                        )
                     )
 
             sentence_features.append(token_features)

rasa/nlu/selectors/response_selector.py~L430

         self, message: Message, prediction_dict: Dict[Text, Any], selector_key: Text
     ) -> None:
         message_selector_properties = message.get(RESPONSE_SELECTOR_PROPERTY_NAME, {})
-        message_selector_properties[
-            RESPONSE_SELECTOR_RETRIEVAL_INTENTS
-        ] = self.all_retrieval_intents
+        message_selector_properties[RESPONSE_SELECTOR_RETRIEVAL_INTENTS] = (
+            self.all_retrieval_intents
+        )
         message_selector_properties[selector_key] = prediction_dict
         message.set(
             RESPONSE_SELECTOR_PROPERTY_NAME,

rasa/nlu/selectors/response_selector.py~L793

             (self.text_name, self.config),
             (self.label_name, label_config),
         ]:
-            self._tf_layers[
-                f"sequence_layer.{attribute}"
-            ] = rasa_layers.RasaSequenceLayer(
-                attribute, self.data_signature[attribute], config
+            self._tf_layers[f"sequence_layer.{attribute}"] = (
+                rasa_layers.RasaSequenceLayer(
+                    attribute, self.data_signature[attribute], config
+                )
             )
 
         if self.config[MASKED_LM]:

rasa/shared/core/domain.py~L1500

                 if not response_text or "\n" not in response_text:
                     continue
                 # Has new lines, use `LiteralScalarString`
-                final_responses[utter_action][i][
-                    KEY_RESPONSES_TEXT
-                ] = LiteralScalarString(response_text)
+                final_responses[utter_action][i][KEY_RESPONSES_TEXT] = (
+                    LiteralScalarString(response_text)
+                )
 
         return final_responses
 

rasa/shared/nlu/training_data/formats/rasa_yaml.py~L529

             )
 
             if examples_have_metadata or example_texts_have_escape_chars:
-                intent[
-                    key_examples
-                ] = RasaYAMLWriter._render_training_examples_as_objects(converted)
+                intent[key_examples] = (
+                    RasaYAMLWriter._render_training_examples_as_objects(converted)
+                )
             else:
                 intent[key_examples] = RasaYAMLWriter._render_training_examples_as_text(
                     converted

rasa/utils/tensorflow/models.py~L324

                 # We only need input, since output is always None and not
                 # consumed by our TF graphs.
                 batch_in = next(data_iterator)[0]
-                batch_out: Dict[
-                    Text, Union[np.ndarray, Dict[Text, Any]]
-                ] = self._rasa_predict(batch_in)
+                batch_out: Dict[Text, Union[np.ndarray, Dict[Text, Any]]] = (
+                    self._rasa_predict(batch_in)
+                )
                 if output_keys_expected:
                     batch_out = {
                         key: output

rasa/utils/tensorflow/rasa_layers.py~L444

         for feature_type, present in self._feature_types_present.items():
             if not present:
                 continue
-            self._tf_layers[
-                f"sparse_dense.{feature_type}"
-            ] = ConcatenateSparseDenseFeatures(
-                attribute=attribute,
-                feature_type=feature_type,
-                feature_type_signature=attribute_signature[feature_type],
-                config=config,
+            self._tf_layers[f"sparse_dense.{feature_type}"] = (
+                ConcatenateSparseDenseFeatures(
+                    attribute=attribute,
+                    feature_type=feature_type,
+                    feature_type_signature=attribute_signature[feature_type],
+                    config=config,
+                )
             )
 
     def _prepare_sequence_sentence_concat(

rasa/utils/tensorflow/rasa_layers.py~L853

                 [not signature.is_sparse for signature in attribute_signature[SEQUENCE]]
             )
             if not expect_dense_seq_features:
-                self._tf_layers[
-                    self.SPARSE_TO_DENSE_FOR_TOKEN_IDS
-                ] = layers.DenseForSparse(
-                    units=2,
-                    use_bias=False,
-                    trainable=False,
-                    name=f"{self.SPARSE_TO_DENSE_FOR_TOKEN_IDS}.{attribute}",
+                self._tf_layers[self.SPARSE_TO_DENSE_FOR_TOKEN_IDS] = (
+                    layers.DenseForSparse(
+                        units=2,
+                        use_bias=False,
+                        trainable=False,
+                        name=f"{self.SPARSE_TO_DENSE_FOR_TOKEN_IDS}.{attribute}",
+                    )
                 )
 
     def _calculate_output_units(

scripts/evaluate_release_tag.py~L1

-"""Evaluate release tag for whether docs should be built or not.
+"""Evaluate release tag for whether docs should be built or not."""
 
-"""
 import argparse
 from subprocess import check_output
 from typing import List

scripts/prepare_nightly_release.py~L2

 
 - increases the version number to dev version provided
 """
+
 import argparse
 import os
 import sys

tests/engine/recipes/test_default_recipe.py~L412

     class MyClassGraphComponent(GraphComponent):
         def process(
             self, messages: List[Message], tracker: DialogueStateTracker
-        ) -> List[Message]:
-            ...
+        ) -> List[Message]: ...
 
     config = rasa.shared.utils.io.read_yaml(
         """

Snowflake-Labs/snowcli (+3 -3 lines across 1 file)

src/snowflake/cli/app/commands_registration/command_plugins_loader.py~L81

             )
             return None
         self._loaded_plugins[plugin_name] = loaded_plugin
-        self._loaded_command_paths[
-            loaded_plugin.command_spec.full_command_path
-        ] = loaded_plugin
+        self._loaded_command_paths[loaded_plugin.command_spec.full_command_path] = (
+            loaded_plugin
+        )
         return loaded_plugin
 
     def _load_plugin_spec(

aiven/aiven-client (+1 -2 lines across 1 file)

aiven/client/cli.py~L130

 
 
 class ClientFactory(Protocol):
-    def __call__(self, base_url: str, show_http: bool, request_timeout: int | None) -> client.AivenClient:
-        ...
+    def __call__(self, base_url: str, show_http: bool, request_timeout: int | None) -> client.AivenClient: ...
 
 
 class AivenCLI(argx.CommandLineTool):

aws/aws-sam-cli (+22 -26 lines across 8 files)

samcli/cli/cli_config_file.py~L406

     def decorator_configuration_setup(f):
         configuration_setup_params = ()
         configuration_setup_attrs = {}
-        configuration_setup_attrs[
-            "help"
-        ] = "This is a hidden click option whose callback function loads configuration parameters."
+        configuration_setup_attrs["help"] = (
+            "This is a hidden click option whose callback function loads configuration parameters."
+        )
         configuration_setup_attrs["is_eager"] = True
         configuration_setup_attrs["expose_value"] = False
         configuration_setup_attrs["hidden"] = True

samcli/cli/global_config.py~L155

         value_type: Type[bool],
         is_flag: bool,
         reload_config: bool = False,
-    ) -> bool:
-        ...
+    ) -> bool: ...
 
     # Overload for case where type is specified
     @overload

samcli/cli/global_config.py~L167

         value_type: Type[T] = T,  # type: ignore
         is_flag: bool = False,
         reload_config: bool = False,
-    ) -> Optional[T]:
-        ...
+    ) -> Optional[T]: ...
 
     # Overload for case where type is not specified and default to object
     @overload

samcli/cli/global_config.py~L179

         value_type: object = object,
         is_flag: bool = False,
         reload_config: bool = False,
-    ) -> Any:
-        ...
+    ) -> Any: ...
 
     def get_value(
         self,

samcli/commands/_utils/options.py~L768

     def hook_name_processer_wrapper(f):
         configuration_setup_params = ()
         configuration_setup_attrs = {}
-        configuration_setup_attrs[
-            "help"
-        ] = "This is a hidden click option whose callback function to run the provided hook package."
+        configuration_setup_attrs["help"] = (
+            "This is a hidden click option whose callback function to run the provided hook package."
+        )
         configuration_setup_attrs["is_eager"] = True
         configuration_setup_attrs["expose_value"] = False
         configuration_setup_attrs["hidden"] = True

samcli/hook_packages/terraform/hooks/prepare/resource_linking.py~L158

     cfn_resource_update_call_back_function: Callable[[Dict, List[ReferenceType]], None]
     linking_exceptions: ResourcePairExceptions
     # function to extract the terraform destination value from the linking field value
-    tf_destination_value_extractor_from_link_field_value_function: Callable[
-        [str], str
-    ] = _default_tf_destination_value_id_extractor
+    tf_destination_value_extractor_from_link_field_value_function: Callable[[str], str] = (
+        _default_tf_destination_value_id_extractor
+    )
 
 
 class ResourceLinker:

samcli/lib/list/endpoints/endpoints_producer.py~L469

             resource.get(RESOURCE_TYPE, "") == AWS_APIGATEWAY_DOMAIN_NAME
             or resource.get(RESOURCE_TYPE, "") == AWS_APIGATEWAY_V2_DOMAIN_NAME
         ):
-            response_domain_dict[
-                resource.get(LOGICAL_RESOURCE_ID, "")
-            ] = f'https://{resource.get(PHYSICAL_RESOURCE_ID, "")}'
+            response_domain_dict[resource.get(LOGICAL_RESOURCE_ID, "")] = (
+                f'https://{resource.get(PHYSICAL_RESOURCE_ID, "")}'
+            )
     return response_domain_dict
 
 

samcli/lib/utils/boto_utils.py~L38

 
 # Type definition of following boto providers, which is equal to Callable[[str], Any]
 class BotoProviderType(Protocol):
-    def __call__(self, service_name: str) -> Any:
-        ...  # pragma: no cover
+    def __call__(self, service_name: str) -> Any: ...  # pragma: no cover
 
 
 def get_boto_client_provider_from_session_with_config(session: Session, **kwargs) -> BotoProviderType:

samcli/yamlhelper.py~L119

         # json parser.
         return cast(Dict, json.loads(yamlstr, object_pairs_hook=OrderedDict))
     except ValueError:
-        yaml.constructor.SafeConstructor.yaml_constructors[
-            TIMESTAMP_TAG
-        ] = yaml.constructor.SafeConstructor.yaml_constructors[TAG_STR]
+        yaml.constructor.SafeConstructor.yaml_constructors[TIMESTAMP_TAG] = (
+            yaml.constructor.SafeConstructor.yaml_constructors[TAG_STR]
+        )
         yaml.SafeLoader.add_constructor(yaml.resolver.BaseResolver.DEFAULT_MAPPING_TAG, _dict_constructor)
         yaml.SafeLoader.add_multi_constructor("!", intrinsics_multi_constructor)
         return cast(Dict, yaml.safe_load(yamlstr))

tests/integration/buildcmd/test_build_terraform_applications.py~L80

             command_list_parameters["use_container"] = True
             command_list_parameters["build_image"] = self.docker_tag
             if self.override:
-                command_list_parameters[
-                    "container_env_var"
-                ] = "TF_VAR_HELLO_FUNCTION_SRC_CODE=./artifacts/HelloWorldFunction2"
+                command_list_parameters["container_env_var"] = (
+                    "TF_VAR_HELLO_FUNCTION_SRC_CODE=./artifacts/HelloWorldFunction2"
+                )
 
         environment_variables = os.environ.copy()
         if self.override:

bloomberg/pytest-memray (+1 -2 lines across 1 file)

src/pytest_memray/plugin.py~L55

         _config: Config,
         _test_id: str,
         **kwargs: Any,
-    ) -> SectionMetadata | None:
-        ...
+    ) -> SectionMetadata | None: ...
 
 
 MARKERS = {

bokeh/bokeh (+754 -1029 lines across 518 files)

examples/advanced/extensions/widget.py~L1

 """Example implementation of two double ended sliders as extension widgets"""
+
 from bokeh.core.properties import Bool, Float, Tuple
 from bokeh.io import show
 from bokeh.layouts import column

examples/basic/annotations/arrow.py~L1

-""" A demonstration of configuring different arrow types.
+"""A demonstration of configuring different arrow types.
 
 .. bokeh-example-metadata::
     :apis: bokeh.plotting.figure.circle, bokeh.plotting.figure.add_layout

examples/basic/annotations/arrow.py~L6

     :keywords: arrows
 
 """
+
 from bokeh.models import Arrow, NormalHead, OpenHead, VeeHead
 from bokeh.palettes import Muted3 as color
 from bokeh.plotting import figure, show

examples/basic/annotations/arrowheads.py~L1

-""" A display of available arrow head styles.
+"""A display of available arrow head styles.
 
 .. bokeh-example-metadata::
     :apis: bokeh.models.Plot, bokeh.models.Arrow, bokeh.models.Label

examples/basic/annotations/arrowheads.py~L6

     :keywords: arrows
 
 """
+
 from bokeh.models import Arrow, Label, NormalHead, OpenHead, Plot, Range1d, TeeHead, VeeHead
 from bokeh.plotting import show
 

examples/basic/annotations/band.py~L1

-""" An interactive numerical band plot based on simple Python array of data.
+"""An interactive numerical band plot based on simple Python array of data.
     It is a combination of scatter plots and line plots added with a band of covered area.
     The line passes through the mean of the area covered by the band.
 

examples/basic/annotations/band.py~L8

     :keywords: figure, scatter, line, band, layout
 
 """
+
 import numpy as np
 import pandas as pd
 

examples/basic/annotations/box_annotation.py~L1

-""" A timeseries plot of glucose data readings. This example demonstrates
+"""A timeseries plot of glucose data readings. This example demonstrates
 adding box annotations as well as a multi-line title.
 
 .. bokeh-example-metadata::

examples/basic/annotations/box_annotation.py~L8

     :keywords: box annotation, time series
 
 """
+
 from bokeh.models import BoxAnnotation
 from bokeh.plotting import figure, show
 from bokeh.sampledata.glucose import data

examples/basic/annotations/colorbar_log.py~L1

-""" A demonstration of a ColorBar with a log color scale.
+"""A demonstration of a ColorBar with a log color scale.
 
 .. bokeh-example-metadata::
     :apis: bokeh.models.ColorBar, bokeh.models.LogColorMapper

examples/basic/annotations/colorbar_log.py~L6

     :keywords: colorbar
 
 """
+
 import numpy as np
 
 from bokeh.models import LogColorMapper

examples/basic/annotations/label.py~L1

-""" A scatter plot that demonstrates different ways of adding labels.
+"""A scatter plot that demonstrates different ways of adding labels.
 
 .. bokeh-example-metadata::
     :apis: bokeh.models.ColumnDataSource, bokeh.models.Label, bokeh.models.LabelSet bokeh.plotting.figure.scatter

examples/basic/annotations/legend.py~L1

-""" Line and marker plots that demonstrate automatic legends.
+"""Line and marker plots that demonstrate automatic legends.
 
 .. bokeh-example-metadata::
     :apis: bokeh.layouts.gridplot, bokeh.plotting.figure.circle, bokeh.plotting.figure.line, bokeh.plotting.figure.square

examples/basic/annotations/legend.py~L6

     :keywords: gridplot
 
 """
+
 import numpy as np
 
 from bokeh.layouts import gridplot

examples/basic/annotations/legend_two_dimensions.py~L6

     :refs: :ref:`ug_basic_annotations_legends_two_dimensions`
     :keywords: legend
 """
+
 import numpy as np
 
 from bokeh.layouts import column

examples/basic/annotations/legends_item_visibility.py~L1

-""" Marker and line plots that demonstrate manual control of legend visibility of individual items
+"""Marker and line plots that demonstrate manual control of legend visibility of individual items
 
 .. bokeh-example-metadata::
     :apis: bokeh.models.LegendItem, bokeh.plotting.figure.circle, bokeh.plotting.figure.line

examples/basic/annotations/legends_item_visibility.py~L6

     :keywords: legend
 
 """
+
 import numpy as np
 
 from bokeh.plotting import figure, show

examples/basic/annotations/slope.py~L1

-""" A marker plot that demonstrates a slope.
+"""A marker plot that demonstrates a slope.
 
 .. bokeh-example-metadata::
     :apis: bokeh.models.slope, bokeh.plotting.figure.circle

examples/basic/annotations/slope.py~L6

     :keywords: slope
 
 """
+
 import numpy as np
 
 from bokeh.models import Slope

examples/basic/annotations/whisker.py~L1

-""" A marker plot that shows the relationship between car type and highway MPG from the autompg
+"""A marker plot that shows the relationship between car type and highway MPG from the autompg
 sample data. This example demonstrates the use of whiskers to display quantile ranges in the plot.
 
 .. bokeh-example-metadata::

examples/basic/areas/stacked_area.py~L1

-""" A stacked area plot using data from a pandas DataFrame.
+"""A stacked area plot using data from a pandas DataFrame.
 
 .. bokeh-example-metadata::
     :apis: bokeh.plotting.figure.varea_stack

examples/basic/areas/stacked_area.py~L6

     :keywords: area, pandas, stacked
 
 """
+
 import numpy as np
 import pandas as pd
 

examples/basic/axes/logplot.py~L1

-""" A log plot using functions with different growth rates. This example
+"""A log plot using functions with different growth rates. This example
 demonstrates using a log axis on a Bokeh plot. Various line styles and glyph
 combinations are automatically added to a legend.
 

examples/basic/axes/logplot.py~L8

     :keywords: lines, legend, log scale, scatter
 
 """
+
 import numpy as np
 
 from bokeh.plotting import figure, show

examples/basic/axes/twin_axes.py~L7

     :keywords: add_layout, axis, axis_label, axis_label_text_color, scatter, extra_y_ranges, LinearAxis
 
 """
+
 from numpy import arange, linspace, pi, sin
 
 from bokeh.layouts import column

examples/basic/bars/basic.py~L1

-""" A simple bar chart using plain Python lists.
+"""A simple bar chart using plain Python lists.
 
 .. bokeh-example-metadata::
     :apis: bokeh.plotting.figure.vbar

examples/basic/bars/basic.py~L6

     :keywords: bars, categorical
 
 """
+
 from bokeh.plotting import figure, show
 
 fruits = ["Apples", "Pears", "Nectarines", "Plums", "Grapes", "Strawberries"]

examples/basic/bars/colormapped.py~L1

-""" A bar chart based on simple Python lists of data. This example demonstrates
+"""A bar chart based on simple Python lists of data. This example demonstrates
 automatic colormapping.
 
 .. bokeh-example-metadata::

examples/basic/bars/colormapped.py~L7

     :keywords: bar, colormap, legend, palette, vbar
 
 """
+
 from bokeh.models import ColumnDataSource
 from bokeh.palettes import Bright6
 from bokeh.plotting import figure, show

examples/basic/bars/colors.py~L1

-""" A simple bar chart using plain Python lists. This example demonstrates
+"""A simple bar chart using plain Python lists. This example demons...*[Comment body truncated]*

@MichaReiser (Member, Author) commented:

I reviewed the changes introduced by the new style by formatting a few large projects. Most style changes show up infrequently (or even never). The following three changes affect many files and can be very disruptive when upgrading.

  • module_docstring_newlines: The change itself isn't significant, but it affects many files
    • Changes 2200 of the 4500 files in the Airflow repository
  • hug_parens_with_braces_and_square_brackets: It's a fairly invasive change because it touches many files and the individual diffs are large. The style is a clear improvement, but it might no longer be fair to say that Ruff is a drop-in replacement after shipping it. I can't think of a way to make the change less invasive without losing consistent formatting. (A rough sketch of the style follows below.)
    • Airflow: changes about 330 of the 4500 files, with 5954 insertions(+) and 7145 deletions(-). A clear win in terms of vertical spacing
    • HuggingFace (transformers): changes about 500 of the 3000 files, with 12k insertions and 14k deletions
  • Format module docstrings: This does the right thing, but it leads to many changes in some repositories.
    • HuggingFace: changes 813 files out of 3000.
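
To make the impact more concrete, a minimal sketch of the two most visible changes discussed above: module_docstring_newlines, which enforces a blank line after the module docstring (as seen in the bokeh diffs, where that blank line was inserted before the first import), and my reading of the hug_parens_with_braces_and_square_brackets preview style. The module, helper, and data names are hypothetical:

```python
"""Hypothetical module docstring.

module_docstring_newlines enforces the blank line between this docstring and
the first import below; that is the change inserted across hundreds of files
in the bokeh diffs above.
"""

import json


def process_batch(items: list[str]) -> str:
    """Uppercase the items and serialize them (hypothetical helper)."""
    return json.dumps([item.upper() for item in items])


# hug_parens_with_braces_and_square_brackets (preview, not part of this PR):
# without hugging, the sole collection argument gets its own indentation level
result = process_batch(
    [
        "alpha",
        "beta",
        "gamma",
    ]
)

# with hugging, the brackets hug the call parentheses, dropping one
# indentation level and two lines per call site
result = process_batch([
    "alpha",
    "beta",
    "gamma",
])

print(result)
```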

@MichaReiser force-pushed the ruff-2024-style branch 2 times, most recently from 9e0a769 to 18dab48, on February 12, 2024 18:15
@MichaReiser changed the title from "Promote formatter preview styles to stable" to "Ruff 2024 style" on Feb 12, 2024
@andersk (Contributor) commented Feb 12, 2024

It seems to me that this should be split in two: Ruff should quickly stabilize the features corresponding to the Black 2024 stable style, leaving more time to consider the other more controversial features. Although I’m personally a fan of all three of them, I think maintaining Black compatibility by default is more important for now.

@MichaReiser (Member, Author) commented Feb 14, 2024

We discussed this internally and agree with @andersk that we want to keep close compatibility with Black for now. We don't mind shipping improvements that produce small or no diffs when migrating from Black to Ruff (not necessarily the other way around). This includes the following preview changes that Black did not ship:

  • multiline_string
  • hex_codes_in_unicode_sequences (they intended to ship this but missed removing the feature flag)
  • format_module_docstring: skipping module docstrings is effectively a bug and prevents code blocks in the module docstring from being formatted (a short sketch follows below)

We don't plan on shipping is_hug_parens_with_braces_and_square_brackets because it produces a large diff. We'll revisit this decision when we plan to release our next style improvements (we hope to innovate on styles that go beyond what Black does).
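
As a small illustration of the format_module_docstring point above, a sketch based on the bokeh diffs in the ecosystem results (the docstring text is borrowed from examples/basic/annotations/arrow.py; only the leading-space trim shown there is claimed here):

```python
# Previously, the module docstring was passed through as written:
#
#     """ A demonstration of configuring different arrow types."""
#
# With format_module_docstring stabilized, the module docstring is formatted
# like any other docstring, so the stray space after the opening quotes is
# trimmed (the change repeated across the bokeh example files above); per the
# comment above, it also allows code blocks inside module docstrings to be
# formatted.
"""A demonstration of configuring different arrow types."""
```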

@MichaReiser added this to the v0.3.0 milestone on Feb 14, 2024
@MichaReiser marked this pull request as ready for review on February 28, 2024 18:20
 # leading function comment
-def decorated1():
-    ...
+def decorated1(): ...
@MichaReiser (Member, Author) commented on this diff:

We need to update our test snapshots to pull in the stable Black formatting.

@MichaReiser changed the title from "Ruff 2024 style" to "Ruff 2024.2 style" on Feb 29, 2024
@MichaReiser merged commit a6f32dd into main on Feb 29, 2024
17 checks passed
@MichaReiser deleted the ruff-2024-style branch on February 29, 2024 08:30
nkxxll pushed a commit to nkxxll/ruff that referenced this pull request on Mar 10, 2024
Labels: breaking (Breaking API change), formatter (Related to the formatter)