Commit

Merge python#23
23: warn for tokenize() r=ltratt a=nanjekyejoannah

The signature of `tokenize()` has changed in Python 3: it now behaves like the `tokenize.generate_tokens()` function on Python 2.
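The difference the warning points at can be seen on Python 3 directly; a minimal sketch using only the standard `tokenize` module (`generate_tokens()` takes a readline returning `str`, while Python 3's `tokenize()` takes one returning `bytes`):

```python
import io
import tokenize

# Python 3: generate_tokens() consumes str lines and yields named tuples,
# matching what tokenize.generate_tokens() did on Python 2.
str_toks = list(tokenize.generate_tokens(io.StringIO("1 + 2").readline))
print([tokenize.tok_name[t.type] for t in str_toks])
# ['NUMBER', 'OP', 'NUMBER', 'NEWLINE', 'ENDMARKER']
```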

Co-authored-by: Joannah Nanjekye <jnanjeky@unb.ca>
bors[bot] and nanjekyejoannah authored Feb 16, 2023
2 parents 1f12b38 + ab79d5c commit f28eb86
Showing 2 changed files with 14 additions and 0 deletions.
11 changes: 11 additions & 0 deletions Lib/test/test_py3kwarn.py
@@ -331,6 +331,17 @@ def test_file_open(self):
        with check_py3k_warnings() as w:
            self.assertWarning(f.read(), w, expected)

    def test_tokenize(self):
        import tokenize
        import io
        expected = "tokenize() changed in 3.x: use generate_tokens() instead."
        def helper_tok():
            for tok in tokenize.tokenize(io.BytesIO('1 + 2').readline):
                print tok
        with check_py3k_warnings() as w:
            self.assertWarning(helper_tok(), w, expected)


    def test_file(self):
        expected = ("The builtin 'file()'/'open()' function is not supported in 3.x, "
                    "use the 'io.open()' function instead with the encoding keyword argument")
3 changes: 3 additions & 0 deletions Lib/tokenize.py
@@ -29,6 +29,7 @@
from itertools import chain
import string, re
from token import *
import warnings

import token
__all__ = [x for x in dir(token) if not x.startswith("_")]
@@ -166,6 +167,8 @@ def tokenize(readline, tokeneater=printtoken):
    called once for each token, with five arguments, corresponding to the
    tuples generated by generate_tokens().
    """
    warnings.warnpy3k_with_fix("tokenize() changed in 3.x", "use generate_tokens() instead.",
                               stacklevel=2)
    try:
        tokenize_loop(readline, tokeneater)
    except StopTokenizing:
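For context on the fix the warning suggests: on Python 3, `tokenize.tokenize()` is a generator over a `bytes` readline, playing the role that `generate_tokens()` played on Python 2. A sketch runnable on Python 3 (not on the Python 2 fork this diff targets):

```python
import io
import tokenize

# Python 3: tokenize() consumes bytes and yields tokens, starting with an
# ENCODING token detected from the source (e.g. a coding cookie or UTF-8).
toks = list(tokenize.tokenize(io.BytesIO(b"1 + 2").readline))
print(tokenize.tok_name[toks[0].type])  # ENCODING
```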
