
Question: Does undici.fetch support RFC 7234? #1146

Closed
loynoir opened this issue Dec 14, 2021 · 13 comments · Fixed by #3562
Labels
enhancement New feature or request

Comments

@loynoir

loynoir commented Dec 14, 2021

This would solve...

The implementation should look like...

https://www.npmjs.com/package/http-cache-semantics

RFC 7234
RFC 5861

I have also considered...

Additional context

@loynoir loynoir added the enhancement New feature or request label Dec 14, 2021
@loynoir loynoir changed the title Question: Does undici support RFC 7234? Question: Does undici.fetch support RFC 7234? Dec 14, 2021
@mcollina
Member

Not at this point. I think it would be great to support it - I would keep it optional as it'd be unclear where to store the cache.

@loynoir
Author

loynoir commented Dec 14, 2021

https://github.com/sindresorhus/got/blob/HEAD/documentation/cache.md

const storageAdapter = new Map();
await got('https://sindresorhus.com', {cache: storageAdapter});

Maybe something like this

const fetch2 = undici.sudoCreateFetch({
  // all super user options, outside whatwg fetch
  cache: storageAdapter
})

@MoLow
Member

MoLow commented Feb 18, 2022

> https://github.com/sindresorhus/got/blob/HEAD/documentation/cache.md
>
> const storageAdapter = new Map();
> await got('https://sindresorhus.com', {cache: storageAdapter});
>
> Maybe something like this
>
> const fetch2 = undici.sudoCreateFetch({
>   // all super user options, outside whatwg fetch
>   cache: storageAdapter
> })

I think if any caching layer is implemented, it should support an asynchronous API - it makes more sense to me

@mcollina
Member

cc @jasnell

@arontsang
Contributor

It would make sense for any caching layer to be implemented as a plugin system.
I noticed that the RedirectHandler is the closest thing to a "plugin" in undici, and in fact caching could be done as a dispatch interceptor.

The interceptor receives the request opts and the DispatcherHandler, and simply calls the events on the DispatcherHandler directly to fulfill the request.
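A minimal sketch of that interceptor shape. The type names here loosely mirror undici's dispatch API but are declared locally for illustration (they are assumptions, not undici's exact signatures), so the snippet stands alone:

```typescript
// Locally-declared stand-ins for undici's dispatch types (illustrative only).
type DispatchOptions = { origin: string; path: string; method: string };

interface DispatchHandler {
  onHeaders(statusCode: number, headers: string[]): boolean;
  onData(chunk: Uint8Array): boolean;
  onComplete(trailers: string[] | null): void;
}

type Dispatch = (opts: DispatchOptions, handler: DispatchHandler) => boolean;

// An interceptor is a function from a dispatch function to a new dispatch
// function; it can observe the opts, wrap the handler, or short-circuit
// entirely by firing the handler's events itself.
function makeLoggingInterceptor(log: (line: string) => void) {
  return (dispatch: Dispatch): Dispatch =>
    (opts, handler) => {
      log(`${opts.method} ${opts.origin}${opts.path}`);
      return dispatch(opts, handler); // delegate to the wrapped dispatch
    };
}
```

A caching interceptor would follow the same shape, but instead of logging it would either call `dispatch` (cache miss) or invoke the handler's events itself (cache hit).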

@ci010

ci010 commented Sep 18, 2022

After #1338 was checked in, I tried to integrate the npm package with a custom dispatcher/handler to implement caching.

Currently, I think there is still a gap when it comes to implementing caching gracefully.

I want to hook the cache-checking operation in right before the dispatch function (using DispatchInterceptor[]):

(pseudo code)

function createCacheInterceptor(cache: CacheStorage) {
  return (dispatch) => (opts, handler) => {
    // check cache
    const cachedRequest = cache.get(serializeOptions(opts)) // we cannot await a promise here... but normally a cache get is an async op
    if (!cachedRequest || cachedRequest.isExpire()) {
      // no cache or expired cache, we should dispatch request
      return dispatch(opts, new CacheHandler(handler, cache))
    }
    // otherwise, we should mock the request result to the handler
    handler.onHeaders(cachedRequest.getHeader())
    handler.onData(cachedRequest.getData())
    handler.onComplete(cachedRequest.getTrailer())
    return false
  }
}

In the current design, Dispatcher.dispatch must be synchronous, returning a boolean, but retrieving from a cache is normally an async operation:

interface CacheStorage {
  get(key: string): Promise<any>
  set(key: string, val: any): Promise<void>
}

I wonder if there is any chance to make dispatch async?
Or is there any other design to implement this?
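As an aside, the CacheStorage interface above is straightforward to satisfy with an in-memory adapter (a sketch; `MemoryCacheStorage` is a hypothetical name, and a real adapter might sit on Redis or the filesystem instead):

```typescript
// In-memory implementation of the CacheStorage interface above.
// Promise-based so that it is interchangeable with a genuinely
// asynchronous backend such as Redis.
class MemoryCacheStorage {
  private store = new Map<string, unknown>();

  async get(key: string): Promise<unknown> {
    return this.store.get(key);
  }

  async set(key: string, val: unknown): Promise<void> {
    this.store.set(key, val);
  }
}
```

The friction discussed in this thread is not implementing the storage itself but bridging its async `get` to the synchronous return value of `dispatch`.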

@mcollina
Member

The response type of dispatch is correct: no error or promises should be propagated.

@ci010

ci010 commented Sep 18, 2022

> The response type of dispatch is correct: no error or promises should be propagated.

Yes, that's what I mean. Currently, we can only have a boolean return type for dispatch.

It's not possible to have async cache read ops in the dispatch function unless we support Promise<boolean> as a return type for dispatch.

@arontsang
Contributor

arontsang commented Sep 18, 2022 via email

That isn't true. dispatch is async. It just isn't async over Promise. dispatch uses old school callbacks.

@ci010

ci010 commented Sep 18, 2022

> That isn't true. dispatch is async. It just isn't async over Promise. dispatch uses old school callbacks.

I agree that "dispatch is async with old school callbacks", but the return value of dispatch (the boolean) is used to determine whether the Pool or Client needs draining (needDrain), which means I cannot just return a value without checking the cache first:

function createCacheInterceptor(cache: CacheStorage) {
  return (dispatch) => (opts, handler) => {
    cache.getCache(serializeOptions(opts)).then((cachedRequest) => {
      if (!cachedRequest || cachedRequest.isExpire()) {
        this.dispatcher.dispatch(opts, handler) // ignore drain status...
      } else {
        handler.onHeaders(cachedRequest.getHeader())
        handler.onData(cachedRequest.getData())
        handler.onComplete(cachedRequest.getTrailer())
      }
    }, (error) => {
      // handle error
    })
    
    // return false anyway... this will break the Pool and Agent, which rely on the return value
    return false
  }
}

Suppose we can wrap a customized Dispatcher, we can have the best effort like:

class CustomDispatcher extends Dispatcher {
  private cache: CacheStorage
  private dispatcher: Dispatcher // wrap another dispatcher

  dispatch(opts, handler) {
    this.cache.getCache(serializeOptions(opts)).then((cachedRequest) => {
      if (!cachedRequest || cachedRequest.isExpire()) {
        this.dispatcher.dispatch(opts, handler) // ignore drain status...
      } else {
        handler.onHeaders(cachedRequest.getHeader())
        handler.onData(cachedRequest.getData())
        handler.onComplete(cachedRequest.getTrailer())
      }
    }, (error) => {
      // handle error
    })

    const needDrain = this.dispatcher[kNeedDrainClient]
    if (needDrain !== undefined) {
      // see Client implementation
      return needDrain < 2
    }

    const needDrainP = this.dispatcher[kNeedDrainPool] // the pool needDrain symbol is different from the Client one (core/symbols.js)
    if (needDrainP !== undefined) {
      // see PoolBase implementation
      return needDrainP
    }

    // return false anyway... this will break the Pool and Agent, which rely on the return value
    return false
  }
}

Still, this solution is not ideal... it relies on Client and Pool implementation details.

@mcollina
Member

I don't think you'll need to return anything but false in the custom dispatcher:

class CustomDispatcher extends Dispatcher {
  private cache: CacheStorage
  private dispatcher: Dispatcher // wrap another dispatcher

  dispatch(opts, handler) {
    this.cache.getCache(serializeOptions(opts)).then((cachedRequest) => {
      if (!cachedRequest || cachedRequest.isExpire()) {
        this.dispatcher.dispatch(opts, handler) // ignore drain status...
      } else {
        handler.onHeaders(cachedRequest.getHeader())
        handler.onData(cachedRequest.getData())
        handler.onComplete(cachedRequest.getTrailer())
      }
    }, (error) => {
      // handle error
    })

    
    return false
  }
}

In order for this to work properly, the wrapped dispatcher must be a full Agent (as it handles the drain events correctly inside).


FWIW I think this shows a significant issue in the interceptor design cc @ronag @kibertoad.
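To make the shape of that pattern concrete, here is a self-contained sketch (all names are illustrative; the inner dispatcher is stubbed out so the snippet runs without undici): the cache lookup is asynchronous, `dispatch` still returns `false` synchronously, and the wrapped dispatcher is assumed to handle drain internally, as a full Agent would.

```typescript
// Illustrative stand-ins; a real implementation would wrap an undici Agent.
interface Handler {
  onHeaders(headers: Record<string, string>): void;
  onData(chunk: string): void;
  onComplete(trailers: Record<string, string>): void;
}

interface Inner {
  dispatch(opts: { key: string }, handler: Handler): boolean;
}

class CachingDispatcher {
  constructor(
    private cache: Map<string, { headers: Record<string, string>; body: string }>,
    private inner: Inner
  ) {}

  dispatch(opts: { key: string }, handler: Handler): boolean {
    // The async lookup runs in the background; the sync return value
    // below does not depend on its result.
    Promise.resolve(this.cache.get(opts.key)).then((hit) => {
      if (hit) {
        // Cache hit: replay the stored response through the handler.
        handler.onHeaders(hit.headers);
        handler.onData(hit.body);
        handler.onComplete({});
      } else {
        // Cache miss: delegate; the inner dispatcher owns drain handling.
        this.inner.dispatch(opts, handler);
      }
    });
    return false; // safe only because the inner dispatcher handles drain itself
  }
}
```

The key design point is the comment on the last line: unconditionally returning `false` discards backpressure information, which is acceptable only when the wrapped dispatcher manages its own drain state.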

@kibertoad
Contributor

kibertoad commented Sep 18, 2022

@mcollina Should we mark custom dispatchers as experimental in the docs prior to release?
Also, I can spend some time on interceptors if you have ideas on how to improve the situation.

@mcollina
Member

I think we should mark custom interceptors as experimental prior to release. Custom dispatchers have been implemented for a while, and I don't think we can break them in any form.

flakey5 added a commit to flakey5/undici that referenced this issue Sep 7, 2024
Implements bare-bones http caching as per rfc9111

Closes nodejs#3231
Closes nodejs#2760
Closes nodejs#2256
Closes nodejs#1146

Signed-off-by: flakey5 <73616808+flakey5@users.noreply.github.com>
@flakey5 flakey5 mentioned this issue Sep 7, 2024
flakey5 added a commit to flakey5/undici that referenced this issue Sep 14, 2024
Implements bare-bones http caching as per rfc9111

Closes nodejs#3231
Closes nodejs#2760
Closes nodejs#2256
Closes nodejs#1146

Co-authored-by: Carlos Fuentes <me@metcoder.dev>

Co-authored-by: Robert Nagy <ronagy@icloud.com>

Co-authored-by: Isak Törnros <isak.tornros@hotmail.com>

Signed-off-by: flakey5 <73616808+flakey5@users.noreply.github.com>
6 participants