
Fix incorrect calculation of number of failed tests in stress-test.ipynb #82

Conversation

tlvu
Contributor

@tlvu tlvu commented Jul 15, 2021

Overview

Fix stress-test.ipynb incorrectly throwing an error when there are no failed tests.

Changes

tlvu added 3 commits July 15, 2021 14:06
Below is the result from a nightly run on production.  It found 4 failed tests
but in fact none were failing since failed_results was empty.

```
  =================================== FAILURES ===================================
  _____________________ notebooks/stress-tests.ipynb::Cell 2 _____________________
  Notebook cell execution failed
  Cell 2: Cell execution caused an exception

  Input:
  # NBVAL_IGNORE_OUTPUT

  test_statuses = []
  failed_results = ''
  for bird in TEST_WPS_BIRDS:
      bird_url = f"{TWITCHER_URL}/{bird}/wps?service=wps&request=getcapabilities"
      expect_status_code = 200
      results = stress_test_requests(bird_url, runs=TEST_RUNS, code=expect_status_code,
                                     max_err_code=TEST_MAX_ERR_CODE, max_avg_time=TEST_MAX_AVG_TIME,
                                     abort_retries=TEST_TIMEOUT_RETRY, abort_timeout=TEST_TIMEOUT_ABORT)
      test_statuses.append(results.status)
      print(results)
      if results.status:
          failed_results = f"{failed_results}\n{results}"
  failed_tests = len([status != 0 for status in test_statuses])
  assert not failed_tests, f"Failed {failed_tests} tests.  Failed results: {failed_results}"

  print("\nAll tests passed!")

  Traceback:

  ---------------------------------------------------------------------------
  AssertionError                            Traceback (most recent call last)
  <ipython-input-3-de267d95df0f> in <module>
       15         failed_results = f"{failed_results}\n{results}"
       16 failed_tests = len([status != 0 for status in test_statuses])
  ---> 17 assert not failed_tests, f"Failed {failed_tests} tests.  Failed results: {failed_results}"
       18
       19 print("\nAll tests passed!")

  AssertionError: Failed 4 tests.  Failed results:

  =========================== short test summary info ============================
  FAILED notebooks/stress-tests.ipynb::Cell 2
  ============= 1 failed, 174 passed, 1 skipped in 428.88s (0:07:08) =============
```
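The miscount comes from `len([status != 0 for status in test_statuses])`: the comprehension yields one boolean per test, so `len()` counts every test rather than only the failures. A minimal sketch of the corrected tally (sample statuses below are hypothetical, with 0 meaning success):

```python
# Hypothetical sample: four tests, all passing (status 0 means success).
test_statuses = [0, 0, 0, 0]

# Buggy version: the comprehension produces one boolean per status,
# so len() returns the total number of tests, not the number of failures.
buggy_count = len([status != 0 for status in test_statuses])

# Fixed version: True counts as 1 in Python, so summing the booleans
# counts only the non-zero (failed) statuses.
failed_tests = sum(status != 0 for status in test_statuses)

print(buggy_count, failed_tests)  # prints "4 0"
```

With all statuses zero, the buggy expression still reports 4 "failures", which is exactly the false alarm seen in the nightly run above.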
Hopefully fix …s no attribute 'skip_compare'

```
  ============================= test session starts ==============================
  platform linux -- Python 3.7.10, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
  rootdir: /home/jenkins/agent/workspace/ailed-tests-in-stress-test.ipynb
  plugins: anyio-3.1.0, dash-1.20.0, nbval-0.9.6, tornasync-0.6.0.post2
  collected 0 items
  INTERNALERROR> Traceback (most recent call last):
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/_pytest/main.py", line 269, in wrap_session
  INTERNALERROR>     session.exitstatus = doit(config, session) or 0
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/_pytest/main.py", line 322, in _main
  INTERNALERROR>     config.hook.pytest_collection(session=session)
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/hooks.py", line 286, in __call__
  INTERNALERROR>     return self._hookexec(self, self.get_hookimpls(), kwargs)
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/manager.py", line 93, in _hookexec
  INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/manager.py", line 87, in <lambda>
  INTERNALERROR>     firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/callers.py", line 208, in _multicall
  INTERNALERROR>     return outcome.get_result()
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/callers.py", line 80, in get_result
  INTERNALERROR>     raise ex[1].with_traceback(ex[2])
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/callers.py", line 187, in _multicall
  INTERNALERROR>     res = hook_impl.function(*args)
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/_pytest/main.py", line 333, in pytest_collection
  INTERNALERROR>     session.perform_collect()
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/_pytest/main.py", line 620, in perform_collect
  INTERNALERROR>     rep = collect_one_node(self)
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/_pytest/runner.py", line 457, in collect_one_node
  INTERNALERROR>     ihook.pytest_collectstart(collector=collector)
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/hooks.py", line 286, in __call__
  INTERNALERROR>     return self._hookexec(self, self.get_hookimpls(), kwargs)
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/manager.py", line 93, in _hookexec
  INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/manager.py", line 87, in <lambda>
  INTERNALERROR>     firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/callers.py", line 208, in _multicall
  INTERNALERROR>     return outcome.get_result()
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/callers.py", line 80, in get_result
  INTERNALERROR>     raise ex[1].with_traceback(ex[2])
  INTERNALERROR>   File "/opt/conda/envs/birdy/lib/python3.7/site-packages/pluggy/callers.py", line 187, in _multicall
  INTERNALERROR>     res = hook_impl.function(*args)
  INTERNALERROR>   File "/home/jenkins/agent/workspace/ailed-tests-in-stress-test.ipynb/conftest.py", line 3, in pytest_collectstart
  INTERNALERROR>     collector.skip_compare += 'text/html', 'application/javascript',
  INTERNALERROR> AttributeError: 'Session' object has no attribute 'skip_compare'

  ============================ no tests ran in 0.01s =============================
```
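The INTERNALERROR above occurs because pytest calls `pytest_collectstart` for every collector, including the top-level `Session`, while only nbval's notebook collectors carry a `skip_compare` attribute. A guarded sketch of the conftest.py hook (a `hasattr` check is one common workaround; the actual change in this PR may differ):

```python
# conftest.py sketch: only mutate skip_compare on collectors that have it
# (nbval notebook collectors do; pytest's Session does not).
def pytest_collectstart(collector):
    if hasattr(collector, "skip_compare"):
        collector.skip_compare += "text/html", "application/javascript"
```

The unguarded original (`collector.skip_compare += ...` on every collector) raises `AttributeError` as soon as collection starts, aborting the whole session before any notebook runs.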
@tlvu tlvu requested a review from fmigneault July 15, 2021 19:21
Contributor

@fmigneault fmigneault left a comment


all good

@tlvu tlvu merged commit ccac0b4 into master Jul 16, 2021
@tlvu tlvu deleted the fix-incorrect-calculation-of-number-of-failed-tests-in-stress-test-ipynb branch July 16, 2021 00:18