
Flaky failures in test_concurrent_lazy_init #50

Closed
bcmills opened this issue Oct 4, 2024 · 0 comments · Fixed by #51

bcmills commented Oct 4, 2024

make test sometimes fails at HEAD on my laptop:

bryan@bryan-macbook-pro:~/src/minject$ git status
On branch master
Your branch is up to date with 'origin/master'.

nothing to commit, working tree clean

bryan@bryan-macbook-pro:~/src/minject$ git rev-parse HEAD
1439db9c886275f02b971573192c515e0c39881a

bryan@bryan-macbook-pro:~/src/minject$ make test
.venv/bin/python3 -m hatch run python docs/examples/philosophy.py
.venv/bin/python3 -m hatch run python -m unittest discover -s tests
philosophy example tests passed!
.....F..................
======================================================================
FAIL: test_concurrent_lazy_init (test_registry.RegistryTestCase.test_concurrent_lazy_init)
Test lazy initialization of singletons in a concurrent environment always returns the same object
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/bryan/src/minject/tests/test_registry.py", line 481, in test_concurrent_lazy_init
    assert all(count == query_per_class for count in counter.values())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError

----------------------------------------------------------------------
Ran 24 tests in 0.034s

FAILED (failures=1)
make: *** [test-unit] Error 1

(CC @mmchenry-duolingo @biniona)
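
For reference, the failing assertion checks that every class was observed exactly `query_per_class` times, i.e. that every concurrent query for a given class returned the same lazily created instance. A rough sketch of that kind of check is below; only the names `counter` and `query_per_class` come from the traceback, while `LazySingleton` and the driver code are hypothetical stand-ins, not the project's actual test code.

```python
# Rough sketch of a concurrent lazy-init check in the spirit of the failing
# test. `counter` and `query_per_class` are names taken from the traceback;
# LazySingleton and the driver code are hypothetical stand-ins.
import collections
import concurrent.futures
import threading


def make_lazy_singleton():
    """Build a toy class whose single instance is created lazily under a lock."""

    class LazySingleton:
        _instance = None
        _lock = threading.Lock()

        @classmethod
        def get(cls):
            if cls._instance is None:
                with cls._lock:
                    if cls._instance is None:  # double-checked locking
                        cls._instance = cls()
            return cls._instance

    return LazySingleton


query_per_class = 100
classes = [make_lazy_singleton() for _ in range(4)]
counter = collections.Counter()

with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    futures = [
        pool.submit(cls.get) for cls in classes for _ in range(query_per_class)
    ]
    for future in concurrent.futures.as_completed(futures):
        counter[id(future.result())] += 1

# Race-free lazy init yields exactly one instance per class, each seen
# query_per_class times; a duplicated instance splits one of these counts.
assert all(count == query_per_class for count in counter.values())
```

When lazy initialization is race-free, `counter` ends up with one entry per class whose count equals `query_per_class`; a duplicated instance splits that count, which is the failure mode in the traceback above.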

bcmills added a commit that referenced this issue Oct 4, 2024
It turns out that functools.lru_cache is only mostly thread-safe. It
guarantees that the underlying data structure will remain coherent
during concurrent updates, but does NOT guarantee that the wrapped
function will be called at most once per key. That could lead to calls
to new_type unexpectedly returning different types for the same index.

Fixes #50.
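
To illustrate the race that commit message describes, here is a minimal, self-contained sketch; the name `new_type` mirrors the commit message, but its body and the driver code are invented for demonstration and are not the project's code.

```python
# Minimal sketch of the lru_cache race described above. lru_cache keeps its
# internal cache coherent, but it does not hold a lock while the wrapped
# function runs, so concurrent cache misses can each execute the body.
import functools
import threading
import time

created = []


@functools.lru_cache(maxsize=None)
def new_type(index):
    # Two threads that miss the cache for the same index can both reach this
    # point and build two distinct classes for that index.
    time.sleep(0.001)  # widen the race window for demonstration purposes
    cls = type(f"Generated{index}", (), {})
    created.append(cls)
    return cls


barrier = threading.Barrier(8)


def worker():
    barrier.wait()  # line all threads up on the same cache miss
    new_type(0)


threads = [threading.Thread(target=worker) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Depending on timing, this can print a number greater than 1: the same index
# produced more than one "singleton" type, which is the flake seen in the test.
print("distinct classes created for index 0:", len(set(created)))
```

A common remedy is to serialize the cache-miss path with an explicit lock so the wrapped function runs at most once per key; the sketch above intentionally omits that fix, and this note does not assume it is the approach taken in #51.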
@bcmills bcmills self-assigned this Oct 4, 2024