fix(vitest): decouple fake/real time in waitFor()'s check #6802

Closed

Conversation


@samavos samavos commented Oct 28, 2024

When running with fake timers, and when polling for promise to complete, make sure we do the poll in a short real time interval, independent of the fake time interval.

This makes it possible to use long time intervals in fake time without needing to wait for the corresponding amount of real time passing. A unit test is added to illustrate this behaviour: with the previous code, this unit test would have taken over a minute to complete, now it is near instantaneous.
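As a minimal sketch of the idea (this is not the actual patch in this PR, and the names waitForSketch, REAL_POLL_MS and sleepReal are made up for illustration): when fake timers are active, advance fake time by the configured interval on each check, but only sleep for a short real interval between checks. vi.isFakeTimers() and vi.advanceTimersByTimeAsync() are existing Vitest APIs; the sketch assumes the real timer and clock references are captured before fake timers are installed.

    import { vi } from 'vitest'

    // Capture real timer/clock references before any test installs fake timers,
    // so the polling loop itself always runs in real time.
    const realSetTimeout = globalThis.setTimeout
    const realNow = Date.now.bind(Date)

    const REAL_POLL_MS = 10 // hypothetical short real-time gap between checks

    function sleepReal(ms: number): Promise<void> {
      return new Promise(resolve => realSetTimeout(resolve, ms))
    }

    async function waitForSketch<T>(
      callback: () => T | Promise<T>,
      { interval = 50, timeout = 1000 }: { interval?: number, timeout?: number } = {},
    ): Promise<T> {
      const deadline = realNow() + timeout
      let pending: Promise<T> | undefined
      let settled = false
      let result: T | undefined
      let lastError: unknown

      while (realNow() < deadline) {
        if (vi.isFakeTimers()) {
          // Advance fake time by the configured interval so mocked
          // setTimeout/setInterval callbacks inside the awaited code can fire.
          await vi.advanceTimersByTimeAsync(interval)
        }
        if (!pending) {
          try {
            const value = callback()
            if (value instanceof Promise) {
              pending = value
              pending.then(
                (v) => { settled = true; result = v },
                (err) => { pending = undefined; lastError = err },
              )
            }
            else {
              return value
            }
          }
          catch (err) {
            lastError = err
          }
        }
        if (settled) {
          return result as T
        }
        // The real wait between checks is short and independent of `interval`
        // when fake timers are active -- this is the decoupling proposed here.
        await sleepReal(vi.isFakeTimers() ? REAL_POLL_MS : interval)
      }
      throw lastError ?? new Error('waitForSketch timed out')
    }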

@samavos samavos changed the title Decouple fake/real time in waitFor()'s check fix(vitest): Decouple fake/real time in waitFor()'s check Oct 28, 2024
@samavos samavos changed the title fix(vitest): Decouple fake/real time in waitFor()'s check fix(vitest): decouple fake/real time in waitFor()'s check Oct 28, 2024

netlify bot commented Oct 28, 2024

Deploy Preview for vitest-dev ready!

Built without sensitive environment variables

Name Link
🔨 Latest commit 77428e8
🔍 Latest deploy log https://app.netlify.com/sites/vitest-dev/deploys/671f8dba8830fb0008807b63
😎 Deploy Preview https://deploy-preview-6802--vitest-dev.netlify.app

@sheremet-va
Member

Please, don't drop the PR template:

Please don't delete this checklist! Before submitting the PR, please make sure you do the following:

  • It's really useful if your PR references an issue where it is discussed ahead of time. If the feature is substantial or introduces breaking changes without a discussion, PR might be closed.
  • Ideally, include a test that fails without this PR but passes with it.
  • Please, don't make changes to pnpm-lock.yaml unless you introduce a new test example.

The first point is especially important as we can't evaluate the usefulness of this change without examples and a proper discussion.

@samavos
Author

samavos commented Oct 28, 2024

Please, don't drop the PR template:

Apologies, I created the PR with gh from the command-line, which didn't include the template.

I was about to raise an issue, but thought the issue was very clear from a unit test, which I've included. If it's helpful to raise an issue and move the discussion there, I'm happy to do that. But as it stands, there is a unit test and description here which would be the information I'd put in the issue.

Would you like to continue the discussion here, or shall I move this to an issue?

@sheremet-va
Member

sheremet-va commented Oct 28, 2024

Would you like to continue the discussion here, or shall I move this to an issue?

You can start here.

I was about to raise an issue, but thought the issue was very clear from a unit test, which I've included.

The reasoning doesn't really make it clear to me. The test is very artificial. The interval intentionally uses the real interval here. It is assertion code, not test code, and shouldn't be affected by the fake timers. The fact that some timers are mocked doesn't imply that we can speed up the timer that was passed down by the user.

@samavos
Author

samavos commented Oct 28, 2024

I was about to raise an issue, but thought the issue was very clear from a unit test, which I've included.

The reasoning doesn't really make it clear to me. The test is very artificial. The interval intentionally uses the real interval here. It is assertion code, not test code, and shouldn't be affected by the fake timers. The fact that some timers are mocked doesn't imply that we can speed up the timer that was passed down by the user.

The test is artificial, but it's based on a real use-case in my project. I've just simplified it. The current behaviour causes real problems in a (closed) project I'm working on.

I believe the purpose of the interval parameter is to ensure that time moves forward by that interval for every check that waitFor runs. The problem with this definition is that it is ambiguous: does one mean real time or fake time? My personal definition was purely "whatever mode vitest timers are configured to". Given such a definition, when using fake timers, the real-time interval used is purely an implementation detail.

I'd be interested in knowing what your definition of the interval parameter is?

A way to change this to not introduce any change in behaviour is to add another configuration option for this case; but that of course increases the size of the API. I'd be equally happy with such an approach in my use-case.

@sheremet-va
Member

sheremet-va commented Oct 28, 2024

I'd be interested in knowing what your definition of the interval parameter is?

interval and timeout are real intervals and timeouts. interval is the time after which the callback should be called again if it fails.

A way to change this to not introduce any change in behaviour is to add another configuration option for this case; but that of course increases the size of the API. I'd be equally happy with such an approach in my use-case.

You can already provide your own interval and check for vi.isFakeTimers(). Note that it will return true if vi.useFakeTimers was called, but it's possible to provide your own toFake array without mocking interval or timeout.
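For example (an illustrative sketch of this point, not code from the PR): faking only Date leaves setTimeout and setInterval real, yet vi.isFakeTimers() still reports true.

    vi.useFakeTimers({ toFake: ['Date'] })
    vi.isFakeTimers() // true, even though setTimeout/setInterval are not mocked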

@samavos
Author

samavos commented Oct 28, 2024

I'd be interested in knowing what your definition of the interval parameter is?

interval and timeout are real intervals and timeouts. interval is the time after which the callback should be called again if it fails.

OK, fair enough.

A way to change this to not introduce any change in behaviour is to add another configuration option for this case; but that of course increases the size of the API. I'd be equally happy with such an approach in my use-case.

You can already provide your own interval and check for vi.isFakeTimers(). Note that it will return true if vi.useFakeTimers was called, but it's possible to provide your own toFake array without mocking interval or timeout.

I'm not entirely sure I follow that suggestion. In my case, I have the following scenario:

  • A module that uses timers (Date.now(), setTimeout at least), so we want to use fake timers
  • A module that can have fairly long wait times on setTimeout (e.g. 1 minute long) or setInterval
  • Unit tests for the above, where we want to provoke and test behaviours that result in quite a lot of (fake) time needing to pass

I believe the right answer here is: vi.useFakeTimers(), and in many cases calling advanceTimersByTime() manually does what I need.

In the few cases I'd like to use waitFor(), there isn't any reasonable interval to set: I'd need to set it to be very high (e.g. 30 seconds), but now my unit test actually needs to take >= 30 seconds, rather than the milliseconds it could/should take.

Are you suggesting there is a way to use the API to solve for this case today without any modifications? I haven't quite been able to figure out what that is based on your last comment, sorry.

@sheremet-va
Member

In the few cases I'd like to use waitFor(), there isn't any reasonable interval to set: I'd need to set it to be very high (e.g. 30 seconds), but now my unit test actually needs to take >= 30 seconds, rather than the milliseconds it could/should take.

Why do you even need such a big interval? It means the callback will be called once every 30 seconds. Don't you need a timeout?

Are you suggesting there is a way to use the API to solve for this case today without any modifications? I haven't quite been able to figure out what that is based on your last comment, sorry.

Yeah, what you did in this PR looks like this in user code:

vi.waitFor(() => {}, {
  interval: vi.isFakeTimers() ? 40 : 30000
})

@samavos
Author

samavos commented Oct 28, 2024

In the few cases I'd like to use waitFor(), there isn't any reasonable interval to set: I'd need to set it to be very high (e.g. 30 seconds), but now my unit test actually needs to take >= 30 seconds, rather than the milliseconds it could/should take.

Why do you even need such a big interval? It means the callback will be called once every 30 seconds. Don't you need a timeout?

Because we're testing code that really does, in the real world, use long intervals. One case that is hopefully easy to describe: (something using) an exponential backoff timer. Consider something with a backoff that quickly results in the backoff time being a long duration after some error conditions are hit. It is reasonable to check that something will succeed after some amount of backing off.
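To make the numbers concrete, here is an illustrative sketch (assuming a doubling backoff that starts at 500ms and is floored at 200ms): the per-retry delay passes half a minute after only a handful of consecutive errors.

    // Delay after the nth consecutive error: Math.max(200, 2 ** (n - 1) * 500)
    const delays = Array.from({ length: 8 }, (_, i) => Math.max(200, 2 ** i * 500))
    console.log(delays) // [500, 1000, 2000, 4000, 8000, 16000, 32000, 64000] (ms)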

Are you suggesting there is a way to use the API to solve for this case today without any modifications? I haven't quite been able to figure out what that is based on your last comment, sorry.

Yeah, what you did in this PR looks like this in user code:

vi.waitFor(() => {}, {
  interval: vi.isFakeTimers() ? 40 : 30000
})

Sorry, I don't agree. Are we perhaps talking at cross purposes? There are two separate things here: the amount of fake time advanced each time we check, and the amount of real time that needs to pass. Consider the unit test code here:

    vi.useFakeTimers()
    await vi.waitFor(() => {
      return new Promise<void>((resolve) => {
        setTimeout(() => {
          resolve()
        }, 60000)
      })
    }, { interval: 30000 })

Your suggestion is that we change the interval there to read:

// ... as above
    }, { interval: vi.isFakeTimers() ? 40 : 30000 })

In this trivialised case, this will obviously set the interval to 40.

  • What will then happen (prior to my change) is that the test will still need to wait a total of 60 seconds, checking the state of what is being waited for every 40ms. Configuring the interval only changes how often the condition is checked and how far the timers are advanced per check, not the amount of real time that passes.
  • What I am trying to achieve is having the test not wait the full 60 seconds. My change is one approach that achieves this.

I had a look at dom-testing-library and it doesn't have the same behaviour as vitest does. It's actually even more extreme than my version: it doesn't wait for any real time to advance at all; it just relies on an await to let other pending microtasks run. Their initial implementation for fake-timers is here and the current version here. I'm not sure if you would consider it desirable to have the same semantics as that?
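For comparison, here is a rough sketch of that style (an assumed shape, not dom-testing-library's actual code): never wait in real time at all; just keep advancing fake time and yielding to the microtask queue until the promise being waited on settles.

    import { vi } from 'vitest'

    // Keep advancing fake timers until `promise` settles or `maxFakeMs` of fake
    // time has passed; no real sleeping happens between checks.
    async function advanceUntilSettled<T>(promise: Promise<T>, step = 50, maxFakeMs = 60_000): Promise<T> {
      let settled = false
      // Observe settlement only; the result (or rejection) is propagated by the
      // final `return promise` below.
      promise.then(() => { settled = true }, () => { settled = true })
      for (let elapsed = 0; !settled && elapsed < maxFakeMs; elapsed += step) {
        await vi.advanceTimersByTimeAsync(step) // advances fake time, flushes microtasks
      }
      return promise
    }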

@sheremet-va sheremet-va added the p2-to-be-discussed Enhancement under consideration (priority) label Oct 31, 2024
@sheremet-va
Member

We talked within the team about this issue. We still don't understand the use case. Can you give an actual code example that helps understand this? And can you explain why you need to use vi.waitFor there?

@samavos
Author

samavos commented Nov 15, 2024

Sorry for the slow reply; it took a bit of concentration to pull something cohesive together. Here is a full test file showing one real-ish use-case, and I'll add some commentary after.

code

import { describe, it, expect, vi, beforeEach } from 'vitest'
import { waitFor as waitForHack } from './waitForHack'

class Backoff {
    constructor(private errors = 0, private lastError = 0) {}

    finish(ok: boolean) {
        if (ok) {
            this.errors = 0;
        } else {
            this.errors++;
            this.lastError = Date.now();
        }
    }

    permitted() {
        return Date.now() > this.getDelayUntil();
    }

    getDelayUntil() {
        if (this.errors === 0) return 0;
        return this.lastError + Math.max(200, Math.pow(2, this.errors - 1) * 500);
    }
}

async function delay(ms: number, signal?: AbortSignal): Promise<void> {
    return new Promise((resolve, reject) => {
        const timeout = setTimeout(resolve, ms);
        if (signal) {
            signal.onabort = () => {
                clearTimeout(timeout);
                reject(new Error("Aborted"));
            }
        }
    });
}

async function retryingFetch(
    ...args: Parameters<typeof fetch>
): Promise<any> {
    const backoffManager = new Backoff();
    let haveFailed = false;
    const startTime = Date.now();

    while (true) {
        try {
            const response = await fetch(...args);

            if (response.ok) {
                return response;
            }

            haveFailed = true;

            if (response.status >= 400 && response.status < 500) {
                return response;
            }
        }
        catch (e) {
            if (e instanceof Error && e.name === "AbortError") {
                throw e;
            }
        }

        backoffManager.finish(false);

        if (!backoffManager.permitted()) {
            const next = backoffManager.getDelayUntil();
            // console.log(`${Date.now()}: Waiting ${next - Date.now()}ms before retrying`);
            await delay(next - Date.now(), args[1]?.signal || undefined);
        }
    }
}


describe("waitFor experiments", () => {
    beforeEach(() => {
        vi.useFakeTimers();
        return () => {
            vi.useRealTimers();
        }
    });

    it.each([
        ["original waitFor", vi.waitFor],
        ["hacky waitFor", waitForHack],
    ])("should wait for one error: %s", async (_name, waitFor) => {
        vi.spyOn(globalThis, 'fetch')
            .mockImplementationOnce(() => { throw new Error('Network error'); })
            .mockImplementationOnce(() => { return Promise.resolve(new Response(null, { status: 200 })); });

        const response = await waitFor(async () => {
            return await retryingFetch('https://example.com');
        })

        expect(response.ok).toBe(true);
    });

    it.each([
        ["original waitFor", vi.waitFor],
        ["hacky waitFor", waitForHack],
    ])("should wait for 4 errors: %s", async (_name, waitFor) => {
        const networkError = () => { throw new Error('Network error'); };
        let requests = 0;

        vi.spyOn(globalThis, 'fetch')
            .mockImplementation(() => {
                requests++;
                if (requests < 4) networkError();
                return Promise.resolve(new Response(null, { status: 200 }));
            });

        const response = await waitFor(async () => {
            return await retryingFetch('https://example.com');
        }, {
            interval: 1000,
        })

        expect(response.ok).toBe(true);
    });

    it("should wait for 4 errors: manual version", async () => {
        const networkError = () => { throw new Error('Network error'); };
        let requests = 0;

        vi.spyOn(globalThis, 'fetch')
            .mockImplementation(() => {
                requests++;
                if (requests < 4) networkError();
                return Promise.resolve(new Response(null, { status: 200 }));
            });

        let finished = false;
        let response: Response | undefined;

        retryingFetch('https://example.com')
            .then(resp => {
                response = resp;
                finished = true;
            });

        while (!finished) {
            await vi.advanceTimersByTime(1000);
        }

        expect(response!.ok).toBe(true);
    });
})

The file waitForHack.ts is purely the waitFor implementation in this repository with the change on this PR applied.

running the code

$ vitest run

 ❯ main.test.ts (5) 1826ms
   ❯ waitFor experiments (5) 1826ms
     ✓ should wait for one error: original waitFor 531ms
     ✓ should wait for one error: hacky waitFor
     × should wait for 4 errors: original waitFor 1009ms
     ✓ should wait for 4 errors: hacky waitFor
     ✓ should wait for 4 errors: manual version

scenario description

Here I've thrown together a hopefully illustrative example of a wrapper around fetch that applies an exponential backoff. It doesn't cover all the cases a production version might need to, but hopefully shows a real-life scenario well enough.

The exact thing being fetched in this example is trivial, but in a more real-life example it might be API requests buried between various layers of code. That is what we have in one of our private repos.

Tests want to assert that API requests eventually succeed, even when errors occur. This allows us to assert something like "yes, this action really is configured in the right way such that it handles network errors reasonably" when thinking about how some module is invoked in a test scenario.

commentary

intro

I think the waitFor API is a bit confusing because it really supports two different use-cases:

  1. Retrying something that fails; often in testing it is an expect(...) call
  2. Waiting for an asynchronous operation to complete, ticking fake timers as appropriate

In this scenario we're only looking at (2) above.

main

My view of the output and code is that we can see the following things:

  1. The tests using the current version of waitFor take a long time to execute (or time out), even though I've opted into fake timers, which I find confusing given that opt-in
  2. With the proposed version of waitFor, the time to execute is ~instantaneous
  3. Not using waitFor, and implementing our own wrapper, is possible but awkward

For (3), I tried to show this in "should wait for 4 errors: manual version". This version isn't complete -- it doesn't handle promise rejections of retryingFetch, so a little more complexity is required to get a generally correct version.

In essence, my argument comes down to the following:

  1. I've opted into fake timers, so I expect my tests to run off virtualised time, not real-time
  2. I've called a test API that guarantees to advance the (fake, in my case) clock
  3. I'm surprised by an API that works with fake timers having any dependency on the real clock (I'll concede overall timeout is a caveat where I'd expect the real clock to be used)

addendum

I completely understand if this isn't a scenario you want to support; if so let's close down this PR. Apologies again for not raising it as a discussion topic rather than a PR to start with.

At our workplace, we've ended up with our own waitFor implementation that works like this. We'd much rather use a general version and contribute it upstream, if possible!

@sheremet-va
Member

sheremet-va commented Nov 21, 2024

Thank you for the detailed explanation of your use case.

Waiting for an asynchronous operation to complete, ticking fake timers as appropriate

This doesn't seem correct. If you need to await an asynchronous operation, just use await. "Fake timers" are ticking because we use the real timers there; the waitFor method is not meant to be used because timers are ticking, that is just a consequence of how it works. It wouldn't make sense not to tick time when the actual time passes.

From your example, it seems like retryingFetch actually never fails unless you abort it, so using waitFor in this context doesn't make any sense to me - it's only meant to be used for flaky callbacks.

Overall, the team doesn't think this is a valid use case for waitFor.

@samavos
Author

samavos commented Nov 21, 2024

OK, thanks for taking the time to consider this and for explaining the rationale, @sheremet-va.

For what it's worth, the guidance "just use await" does not apply here, because that would just deadlock/time out -- the point of my example was explicitly a case of an asynchronous process that needs time to pass to succeed (fake or otherwise). But I understand completely not wanting to overload the waitFor API with this separate use-case/behaviour.
