pythongh-102856: Add changes related to PEP 701 in 3.12 What's New docs (python#104824)

Co-authored-by: Pablo Galindo Salgado <Pablogsal@gmail.com>
Co-authored-by: Jelle Zijlstra <jelle.zijlstra@gmail.com>
3 people authored May 24, 2023
1 parent 7b00940 commit c45701e
Showing 1 changed file with 135 additions and 16 deletions: Doc/whatsnew/3.12.rst
@@ -66,6 +66,10 @@ Summary -- Release highlights
.. PEP-sized items next.
New grammar features:

* :pep:`701`: Syntactic formalization of f-strings

New typing features:

* :pep:`688`: Making the buffer protocol accessible in Python
@@ -136,22 +140,70 @@ Improved Error Messages
New Features
============

* Add :ref:`perf_profiling` through the new
environment variable :envvar:`PYTHONPERFSUPPORT`,
the new command-line option :option:`-X perf <-X>`,
as well as the new :func:`sys.activate_stack_trampoline`,
:func:`sys.deactivate_stack_trampoline`,
and :func:`sys.is_stack_trampoline_active` APIs.
(Design by Pablo Galindo. Contributed by Pablo Galindo and Christian Heimes
with contributions from Gregory P. Smith [Google] and Mark Shannon
in :gh:`96123`.)
* The extraction methods in :mod:`tarfile` and :func:`shutil.unpack_archive`
have a new *filter* argument that allows limiting tar features that may be
surprising or dangerous, such as creating files outside the destination
directory.
See :ref:`tarfile-extraction-filter` for details.
In Python 3.14, the default will switch to ``'data'``.
(Contributed by Petr Viktorin in :pep:`706`.)
.. _whatsnew312-pep701:

PEP 701: Syntactic formalization of f-strings
---------------------------------------------

:pep:`701` lifts some restrictions on the usage of f-strings. Expression components
inside f-strings can now be any valid Python expression, including ones that contain
backslashes, Unicode escape sequences, multi-line expressions, comments, and strings
reusing the same quote as the containing f-string. Let's cover these in detail:

* Quote reuse: in Python 3.11, reusing the same quotes as the containing f-string
raises a :exc:`SyntaxError`, forcing the user to use other available
quotes (such as double quotes or triple quotes if the f-string uses single
quotes). In Python 3.12, you can now do things like this:

>>> songs = ['Take me back to Eden', 'Alkaline', 'Ascensionism']
>>> f"This is the playlist: {", ".join(songs)}"
'This is the playlist: Take me back to Eden, Alkaline, Ascensionism'

Note that before this change there was no explicit limit on how deeply f-strings
could be nested, but the fact that string quotes could not be reused inside the
expression component of f-strings made it impossible to nest f-strings
arbitrarily. In fact, this was the most nested f-string that could be written:

>>> f"""{f'''{f'{f"{1+1}"}'}'''}"""
'2'

Now that f-strings can contain any valid Python expression inside their expression
components, it is possible to nest f-strings arbitrarily:

>>> f"{f"{f"{f"{f"{f"{1+1}"}"}"}"}"}"
'2'

* Multi-line expressions and comments: In Python 3.11, f-string expressions
must be defined on a single line, even if the same expression could span
multiple lines outside an f-string (like literal lists being defined over
multiple lines), making them harder to read. In Python 3.12 you can now define
expressions spanning multiple lines and include comments in them:

>>> f"This is the playlist: {", ".join([
... 'Take me back to Eden', # My, my, those eyes like fire
... 'Alkaline', # Not acid nor alkaline
... 'Ascensionism' # Take to the broken skies at last
... ])}"
'This is the playlist: Take me back to Eden, Alkaline, Ascensionism'

* Backslashes and Unicode characters: before Python 3.12, f-string expressions
couldn't contain any ``\`` character. This also affected Unicode escape
sequences (such as ``\N{snowman}``), as these contain the ``\N`` part that
previously could not be part of expression components of f-strings. Now, you
can define expressions like this:

>>> print(f"This is the playlist: {"\n".join(songs)}")
This is the playlist: Take me back to Eden
Alkaline
Ascensionism
>>> print(f"This is the playlist: {"\N{BLACK HEART SUIT}".join(songs)}")
This is the playlist: Take me back to Eden♥Alkaline♥Ascensionism

See :pep:`701` for more details.

(Contributed by Pablo Galindo, Batuhan Taskaya, Lysandros Nikolaou, Cristián
Maureira-Fredes and Marta Gómez in :gh:`102856`. PEP written by Pablo Galindo,
Batuhan Taskaya, Lysandros Nikolaou and Marta Gómez.)

.. _whatsnew312-pep709:

@@ -223,6 +275,24 @@ See :pep:`692` for more details.
Other Language Changes
======================

* Add :ref:`perf_profiling` through the new
environment variable :envvar:`PYTHONPERFSUPPORT`,
the new command-line option :option:`-X perf <-X>`,
as well as the new :func:`sys.activate_stack_trampoline`,
:func:`sys.deactivate_stack_trampoline`,
and :func:`sys.is_stack_trampoline_active` APIs.
(Design by Pablo Galindo. Contributed by Pablo Galindo and Christian Heimes
with contributions from Gregory P. Smith [Google] and Mark Shannon
in :gh:`96123`.)

* The extraction methods in :mod:`tarfile` and :func:`shutil.unpack_archive`
have a new *filter* argument that allows limiting tar features that may be
surprising or dangerous, such as creating files outside the destination
directory (see the usage sketch after this list).
See :ref:`tarfile-extraction-filter` for details.
In Python 3.14, the default will switch to ``'data'``.
(Contributed by Petr Viktorin in :pep:`706`.)

* :class:`types.MappingProxyType` instances are now hashable if the underlying
mapping is hashable.
(Contributed by Serhiy Storchaka in :gh:`87995`.)
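
Returning to the new *filter* argument described above, here is a minimal
usage sketch (not part of the original entry; the archive and destination
paths are placeholders)::

    import shutil
    import tarfile

    # Opt in to the 'data' filter explicitly; it rejects tar features that
    # could, for example, create files outside the destination directory.
    with tarfile.open("archive.tar") as tf:
        tf.extractall(path="dest", filter="data")

    # shutil.unpack_archive accepts the same keyword argument.
    shutil.unpack_archive("archive.tar", extract_dir="dest", filter="data")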
@@ -543,6 +613,14 @@ tkinter
like ``create_*()`` methods.
(Contributed by Serhiy Storchaka in :gh:`94473`.)

tokenize
--------

* The :mod:`tokenize` module includes the changes introduced in :pep:`701`.
(Contributed by Marta Gómez Macías and Pablo Galindo in :gh:`102856`.)
See :ref:`whatsnew312-porting-to-python312` for more information on the
changes to the :mod:`tokenize` module.
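
As a minimal illustration (the example string here is ours, not part of the
original entry), the new f-string tokens can be observed directly::

    import io
    import tokenize

    # Quote reuse from PEP 701: the inner ", " uses the same quotes as the
    # enclosing f-string.
    code = 'f"playlist: {", ".join(songs)}"'
    for tok in tokenize.generate_tokens(io.StringIO(code).readline):
        print(tokenize.tok_name[tok.type], repr(tok.string))

Among other tokens, this prints ``FSTRING_START``, ``FSTRING_MIDDLE`` and
``FSTRING_END`` entries for the literal parts of the f-string.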

types
-----

@@ -687,6 +765,11 @@ Optimizations
* Speed up :class:`asyncio.Task` creation by deferring expensive string formatting.
(Contributed by Itamar O in :gh:`103793`.)

* The :func:`tokenize.tokenize` and :func:`tokenize.generate_tokens` functions are
up to 64% faster as a side effect of the changes required to cover :pep:`701` in
the :mod:`tokenize` module. (Contributed by Marta Gómez Macías and Pablo Galindo
in :gh:`102856`.)
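
As a rough way to observe this on a file of your own (a sketch only, not the
benchmark behind the quoted figure; ``some_module.py`` is a placeholder)::

    import io
    import timeit
    import tokenize

    with open("some_module.py", encoding="utf-8") as f:
        source = f.read()

    def tokenize_source():
        # Consume the full token stream for the file.
        list(tokenize.generate_tokens(io.StringIO(source).readline))

    print(timeit.timeit(tokenize_source, number=100))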


CPython bytecode changes
========================
@@ -1130,6 +1213,8 @@ Removed
Iceape, Firebird, and Firefox versions 35 and below (:gh:`102871`).


.. _whatsnew312-porting-to-python312:

Porting to Python 3.12
======================

@@ -1201,6 +1286,40 @@ Changes in the Python API
that may be surprising or dangerous.
See :ref:`tarfile-extraction-filter` for details.

* The output of the :func:`tokenize.tokenize` and :func:`tokenize.generate_tokens`
functions has changed due to the changes introduced in :pep:`701`. This
means that ``STRING`` tokens are no longer emitted for f-strings and the
tokens described in :pep:`701` are produced instead: ``FSTRING_START``,
``FSTRING_MIDDLE`` and ``FSTRING_END`` are now emitted for f-string "string"
parts in addition to the appropriate tokens for the tokenization of the
expression components (a sketch reproducing the new output follows this list).
For example, for the f-string ``f"start {1+1} end"``
the old version of the tokenizer emitted::

1,0-1,18: STRING 'f"start {1+1} end"'

while the new version emits::

1,0-1,2: FSTRING_START 'f"'
1,2-1,8: FSTRING_MIDDLE 'start '
1,8-1,9: OP '{'
1,9-1,10: NUMBER '1'
1,10-1,11: OP '+'
1,11-1,12: NUMBER '1'
1,12-1,13: OP '}'
1,13-1,17: FSTRING_MIDDLE ' end'
1,17-1,18: FSTRING_END '"'

Additionally, there may be some minor behavioral changes as a consequence of the
changes required to support :pep:`701`. Some of these changes include:

* Some final ``DEDENT`` tokens are now emitted within the bounds of the
input. This means that for a file containing 3 lines, the old version of the
tokenizer returned a ``DEDENT`` token on line 4 while the new version returns
the token on line 3.

* The ``type`` attribute of the tokens emitted when tokenizing some invalid Python
characters such as ``!`` has changed from ``ERRORTOKEN`` to ``OP``.
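
A minimal sketch (not part of the original entry) that reproduces the kind of
listing shown above from Python code::

    import io
    import tokenize

    source = 'f"start {1+1} end"'
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        (srow, scol), (erow, ecol) = tok.start, tok.end
        print(f"{srow},{scol}-{erow},{ecol}:",
              tokenize.tok_name[tok.type], repr(tok.string))

    # Note: trailing NEWLINE and ENDMARKER tokens are also emitted after the
    # FSTRING_END token shown in the listing above.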

Build Changes
=============

