
Fix an erroneous logging format string and pylint pragma #1630

Merged: 8 commits merged into pytorch:master on Aug 16, 2022

Conversation

bradlarsen
Contributor

Description

This pull request fixes an incorrect logging format string, along with a pylint pragma that incorrectly suppressed the warning about it.

I noticed this issue when using TorchServe's profiling mechanism via the ENABLE_TORCH_PROFILER=1 environment variable, which resulted in surprising non-fatal stack traces. For example:

Backend received inference at: 1652395807
--- Logging error ---
Traceback (most recent call last):
  File "/usr/lib/python3.8/logging/__init__.py", line 1081, in emit
    msg = self.format(record)
  File "/usr/lib/python3.8/logging/__init__.py", line 925, in format
    return fmt.format(record)
  File "/usr/lib/python3.8/logging/__init__.py", line 664, in format
    record.message = record.getMessage()
  File "/usr/lib/python3.8/logging/__init__.py", line 369, in getMessage
    msg = msg % self.args
TypeError: not all arguments converted during string formatting
Call stack:
  File "/home/venv/lib/python3.8/site-packages/ts/model_service_worker.py", line 189, in <module>
    worker.run_server()
  File "/home/venv/lib/python3.8/site-packages/ts/model_service_worker.py", line 161, in run_server
    self.handle_connection(cl_socket)
  File "/home/venv/lib/python3.8/site-packages/ts/model_service_worker.py", line 120, in handle_connection
    resp = service.predict(msg)
  File "/home/venv/lib/python3.8/site-packages/ts/service.py", line 102, in predict
    ret = self._entry_point(input_batch, self.context)
  File "/home/venv/lib/python3.8/site-packages/ts/torch_handler/request_envelope/base.py", line 28, in handle
    results = self._handle_fn(data, context)
  File "/home/venv/lib/python3.8/site-packages/ts/torch_handler/base_handler.py", line 214, in handle
    output, _ = self._infer_with_profiler(data=data)
  File "/home/venv/lib/python3.8/site-packages/ts/torch_handler/base_handler.py", line 259, in _infer_with_profiler
    logger.info("Saving chrome trace to : ", result_path) # pylint: disable=logging-too-many-args
Message: 'Saving chrome trace to : '
Arguments: ('/tmp/pytorch_profiler/finetuned-classifier',)
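
Python's stdlib logging uses printf-style formatting: each extra positional argument needs a matching %s placeholder in the message, otherwise record.getMessage() raises the TypeError shown above. A minimal sketch of the shape of the fix (the exact diff isn't reproduced here; result_path is the trace output path from the handler's profiler code):

    # Before: no %s placeholder for result_path, so formatting the log record fails,
    # and the pragma silences the pylint warning (logging-too-many-args) that would
    # have flagged it.
    logger.info("Saving chrome trace to : ", result_path)  # pylint: disable=logging-too-many-args

    # After: pass the value through a placeholder so logging formats it lazily.
    logger.info("Saving chrome trace to : %s", result_path)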

Type of change

  • Bug fix (non-breaking change which fixes an issue)

Feature/Issue validation/testing

It's a one-line change that can be understood in isolation.

msaroufim self-requested a review on May 13, 2022 17:01
codecov bot commented on Aug 11, 2022

Codecov Report

Merging #1630 (b00559d) into master (62b9f22) will not change coverage.
The diff coverage is n/a.

❗ Current head b00559d differs from pull request most recent head 4f6716b. Consider uploading reports for the commit 4f6716b to get more accurate results

@@           Coverage Diff           @@
##           master    #1630   +/-   ##
=======================================
  Coverage   45.28%   45.28%           
=======================================
  Files          64       64           
  Lines        2597     2597           
  Branches       60       60           
=======================================
  Hits         1176     1176           
  Misses       1421     1421           
Impacted Files Coverage Δ
ts/torch_handler/base_handler.py 0.00% <ø> (ø)


msaroufim merged commit 55eff69 into pytorch:master on Aug 16, 2022