From 61d18d647322349b65e6bae676c816f640d14c80 Mon Sep 17 00:00:00 2001 From: "Node.js GitHub Bot" Date: Sat, 23 Sep 2023 12:40:45 +0100 Subject: [PATCH] deps: update undici to 5.24.0 PR-URL: https://github.com/nodejs/node/pull/49559 Reviewed-By: Matteo Collina Reviewed-By: Matthew Aitken --- deps/undici/src/README.md | 33 +- deps/undici/src/docs/api/Client.md | 14 +- deps/undici/src/docs/api/Connector.md | 4 +- deps/undici/src/docs/api/Dispatcher.md | 3 +- deps/undici/src/docs/api/MockPool.md | 3 +- deps/undici/src/index-fetch.js | 5 +- deps/undici/src/index.d.ts | 58 +- deps/undici/src/index.js | 5 +- deps/undici/src/lib/api/api-connect.js | 10 +- deps/undici/src/lib/api/api-request.js | 1 - deps/undici/src/lib/cache/cache.js | 6 +- deps/undici/src/lib/client.js | 571 +++++- deps/undici/src/lib/core/connect.js | 6 +- deps/undici/src/lib/core/request.js | 78 +- deps/undici/src/lib/core/symbols.js | 8 +- deps/undici/src/lib/core/util.js | 15 +- deps/undici/src/lib/fetch/body.js | 3 +- deps/undici/src/lib/fetch/index.js | 40 +- deps/undici/src/lib/fetch/response.js | 8 +- deps/undici/src/lib/fetch/util.js | 59 +- deps/undici/src/lib/pool.js | 4 +- deps/undici/src/lib/websocket/connection.js | 13 +- deps/undici/src/lib/websocket/frame.js | 11 +- deps/undici/src/lib/websocket/websocket.js | 37 +- deps/undici/src/package.json | 38 +- deps/undici/src/types/README.md | 6 + deps/undici/src/types/client.d.ts | 23 +- deps/undici/src/types/dispatcher.d.ts | 4 +- deps/undici/src/types/index.d.ts | 57 + deps/undici/src/types/package.json | 55 + deps/undici/undici.js | 1804 ++++++++++++++++- .../maintaining/maintaining-dependencies.md | 6 +- src/undici_version.h | 2 +- 33 files changed, 2679 insertions(+), 311 deletions(-) create mode 100644 deps/undici/src/types/README.md create mode 100644 deps/undici/src/types/index.d.ts create mode 100644 deps/undici/src/types/package.json diff --git a/deps/undici/src/README.md b/deps/undici/src/README.md index 05a5d21ed1195c..3ba89890df6f69 100644 --- a/deps/undici/src/README.md +++ b/deps/undici/src/README.md @@ -18,30 +18,34 @@ npm i undici ## Benchmarks The benchmark is a simple `hello world` [example](benchmarks/benchmark.js) using a -number of unix sockets (connections) with a pipelining depth of 10 running on Node 16. -The benchmarks below have the [simd](https://github.com/WebAssembly/simd) feature enabled. +number of unix sockets (connections) with a pipelining depth of 10 running on Node 20.6.0. 
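For reference, a minimal sketch of the kind of client the benchmark exercises — the origin and unix socket path below are illustrative placeholders, not the values used by `benchmarks/benchmark.js`:

```js
import { Client } from 'undici'

// One connection with a pipelining depth of 10, mirroring the setup described above.
const client = new Client('http://localhost', {
  socketPath: '/var/run/benchmark.sock', // hypothetical socket path
  pipelining: 10
})

const { statusCode, body } = await client.request({ path: '/', method: 'GET' })
console.log(statusCode, await body.text()) // e.g. 200 'hello world'

await client.close()
```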
### Connections 1 + | Tests | Samples | Result | Tolerance | Difference with slowest | |---------------------|---------|---------------|-----------|-------------------------| -| http - no keepalive | 15 | 4.63 req/sec | ± 2.77 % | - | -| http - keepalive | 10 | 4.81 req/sec | ± 2.16 % | + 3.94 % | -| undici - stream | 25 | 62.22 req/sec | ± 2.67 % | + 1244.58 % | -| undici - dispatch | 15 | 64.33 req/sec | ± 2.47 % | + 1290.24 % | -| undici - request | 15 | 66.08 req/sec | ± 2.48 % | + 1327.88 % | -| undici - pipeline | 10 | 66.13 req/sec | ± 1.39 % | + 1329.08 % | +| http - no keepalive | 15 | 5.32 req/sec | ± 2.61 % | - | +| http - keepalive | 10 | 5.35 req/sec | ± 2.47 % | + 0.44 % | +| undici - fetch | 15 | 41.85 req/sec | ± 2.49 % | + 686.04 % | +| undici - pipeline | 40 | 50.36 req/sec | ± 2.77 % | + 845.92 % | +| undici - stream | 15 | 60.58 req/sec | ± 2.75 % | + 1037.72 % | +| undici - request | 10 | 61.19 req/sec | ± 2.60 % | + 1049.24 % | +| undici - dispatch | 20 | 64.84 req/sec | ± 2.81 % | + 1117.81 % | + ### Connections 50 | Tests | Samples | Result | Tolerance | Difference with slowest | |---------------------|---------|------------------|-----------|-------------------------| -| http - no keepalive | 50 | 3546.49 req/sec | ± 2.90 % | - | -| http - keepalive | 15 | 5692.67 req/sec | ± 2.48 % | + 60.52 % | -| undici - pipeline | 25 | 8478.71 req/sec | ± 2.62 % | + 139.07 % | -| undici - request | 20 | 9766.66 req/sec | ± 2.79 % | + 175.39 % | -| undici - stream | 15 | 10109.74 req/sec | ± 2.94 % | + 185.06 % | -| undici - dispatch | 25 | 10949.73 req/sec | ± 2.54 % | + 208.75 % | +| undici - fetch | 30 | 2107.19 req/sec | ± 2.69 % | - | +| http - no keepalive | 10 | 2698.90 req/sec | ± 2.68 % | + 28.08 % | +| http - keepalive | 10 | 4639.49 req/sec | ± 2.55 % | + 120.17 % | +| undici - pipeline | 40 | 6123.33 req/sec | ± 2.97 % | + 190.59 % | +| undici - stream | 50 | 9426.51 req/sec | ± 2.92 % | + 347.35 % | +| undici - request | 10 | 10162.88 req/sec | ± 2.13 % | + 382.29 % | +| undici - dispatch | 50 | 11191.11 req/sec | ± 2.98 % | + 431.09 % | + ## Quick Start @@ -432,6 +436,7 @@ and `undici.Agent`) which will enable the family autoselection algorithm when es * [__Ethan Arrowood__](https://github.com/ethan-arrowood), * [__Matteo Collina__](https://github.com/mcollina), * [__Robert Nagy__](https://github.com/ronag), +* [__Matthew Aitken__](https://github.com/KhafraDev), ## License diff --git a/deps/undici/src/docs/api/Client.md b/deps/undici/src/docs/api/Client.md index fc7c5d26e8f799..c0987713a328c5 100644 --- a/deps/undici/src/docs/api/Client.md +++ b/deps/undici/src/docs/api/Client.md @@ -17,11 +17,13 @@ Returns: `Client` ### Parameter: `ClientOptions` +> ⚠️ Warning: The `H2` support is experimental. + * **bodyTimeout** `number | null` (optional) - Default: `300e3` - The timeout after which a request will time out, in milliseconds. Monitors time between receiving body data. Use `0` to disable it entirely. Defaults to 300 seconds. -* **headersTimeout** `number | null` (optional) - Default: `300e3` - The amount of time the parser will wait to receive the complete HTTP headers while not sending the request. Defaults to 300 seconds. -* **keepAliveMaxTimeout** `number | null` (optional) - Default: `600e3` - The maximum allowed `keepAliveTimeout` when overridden by *keep-alive* hints from the server. Defaults to 10 minutes. -* **keepAliveTimeout** `number | null` (optional) - Default: `4e3` - The timeout after which a socket without active requests will time out. 
Monitors time between activity on a connected socket. This value may be overridden by *keep-alive* hints from the server. See [MDN: HTTP - Headers - Keep-Alive directives](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Keep-Alive#directives) for more details. Defaults to 4 seconds. -* **keepAliveTimeoutThreshold** `number | null` (optional) - Default: `1e3` - A number subtracted from server *keep-alive* hints when overriding `keepAliveTimeout` to account for timing inaccuracies caused by e.g. transport latency. Defaults to 1 second. +* **headersTimeout** `number | null` (optional) - Default: `300e3` - The amount of time, in milliseconds, the parser will wait to receive the complete HTTP headers while not sending the request. Defaults to 300 seconds. +* **keepAliveMaxTimeout** `number | null` (optional) - Default: `600e3` - The maximum allowed `keepAliveTimeout`, in milliseconds, when overridden by *keep-alive* hints from the server. Defaults to 10 minutes. +* **keepAliveTimeout** `number | null` (optional) - Default: `4e3` - The timeout, in milliseconds, after which a socket without active requests will time out. Monitors time between activity on a connected socket. This value may be overridden by *keep-alive* hints from the server. See [MDN: HTTP - Headers - Keep-Alive directives](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Keep-Alive#directives) for more details. Defaults to 4 seconds. +* **keepAliveTimeoutThreshold** `number | null` (optional) - Default: `1e3` - A number of milliseconds subtracted from server *keep-alive* hints when overriding `keepAliveTimeout` to account for timing inaccuracies caused by e.g. transport latency. Defaults to 1 second. * **maxHeaderSize** `number | null` (optional) - Default: `16384` - The maximum length of request headers in bytes. Defaults to 16KiB. * **maxResponseSize** `number | null` (optional) - Default: `-1` - The maximum length of response body in bytes. Set to `-1` to disable. * **pipelining** `number | null` (optional) - Default: `1` - The amount of concurrent requests to be sent over the single TCP/TLS connection according to [RFC7230](https://tools.ietf.org/html/rfc7230#section-6.3.2). Carefully consider your workload and environment before enabling concurrent requests as pipelining may reduce performance if used incorrectly. Pipelining is sensitive to network stack settings as well as head of line blocking caused by e.g. long running requests. Set to `0` to disable keep-alive connections. @@ -30,6 +32,8 @@ Returns: `Client` * **interceptors** `{ Client: DispatchInterceptor[] }` - Default: `[RedirectInterceptor]` - A list of interceptors that are applied to the dispatch method. Additional logic can be applied (such as, but not limited to: 302 status code handling, authentication, cookies, compression and caching). Note that the behavior of interceptors is Experimental and might change at any given time. * **autoSelectFamily**: `boolean` (optional) - Default: depends on local Node version, on Node 18.13.0 and above is `false`. Enables a family autodetection algorithm that loosely implements section 5 of [RFC 8305](https://tools.ietf.org/html/rfc8305#section-5). See [here](https://nodejs.org/api/net.html#socketconnectoptions-connectlistener) for more details. This option is ignored if not supported by the current Node version. * **autoSelectFamilyAttemptTimeout**: `number` - Default: depends on local Node version, on Node 18.13.0 and above is `250`. 
The amount of time in milliseconds to wait for a connection attempt to finish before trying the next address when using the `autoSelectFamily` option. See [here](https://nodejs.org/api/net.html#socketconnectoptions-connectlistener) for more details. +* **allowH2**: `boolean` - Default: `false`. Enables support for H2 if the server has assigned bigger priority to it through ALPN negotiation. +* **maxConcurrentStreams**: `number` - Default: `100`. Dictates the maximum number of concurrent streams for a single H2 session. It can be overriden by a SETTINGS remote frame. #### Parameter: `ConnectOptions` @@ -38,7 +42,7 @@ Furthermore, the following options can be passed: * **socketPath** `string | null` (optional) - Default: `null` - An IPC endpoint, either Unix domain socket or Windows named pipe. * **maxCachedSessions** `number | null` (optional) - Default: `100` - Maximum number of TLS cached sessions. Use 0 to disable TLS session caching. Default: 100. -* **timeout** `number | null` (optional) - Default `10e3` +* **timeout** `number | null` (optional) - In milliseconds, Default `10e3`. * **servername** `string | null` (optional) * **keepAlive** `boolean | null` (optional) - Default: `true` - TCP keep-alive enabled * **keepAliveInitialDelay** `number | null` (optional) - Default: `60000` - TCP keep-alive interval for the socket in milliseconds diff --git a/deps/undici/src/docs/api/Connector.md b/deps/undici/src/docs/api/Connector.md index 7c966507e5fceb..56821bd6430279 100644 --- a/deps/undici/src/docs/api/Connector.md +++ b/deps/undici/src/docs/api/Connector.md @@ -13,8 +13,8 @@ Every Tls option, see [here](https://nodejs.org/api/tls.html#tls_tls_connect_opt Furthermore, the following options can be passed: * **socketPath** `string | null` (optional) - Default: `null` - An IPC endpoint, either Unix domain socket or Windows named pipe. -* **maxCachedSessions** `number | null` (optional) - Default: `100` - Maximum number of TLS cached sessions. Use 0 to disable TLS session caching. Default: 100. -* **timeout** `number | null` (optional) - Default `10e3` +* **maxCachedSessions** `number | null` (optional) - Default: `100` - Maximum number of TLS cached sessions. Use 0 to disable TLS session caching. Default: `100`. +* **timeout** `number | null` (optional) - In milliseconds. Default `10e3`. * **servername** `string | null` (optional) Once you call `buildConnector`, it will return a connector function, which takes the following parameters. diff --git a/deps/undici/src/docs/api/Dispatcher.md b/deps/undici/src/docs/api/Dispatcher.md index a50642948aaca1..fd463bfea16737 100644 --- a/deps/undici/src/docs/api/Dispatcher.md +++ b/deps/undici/src/docs/api/Dispatcher.md @@ -200,8 +200,9 @@ Returns: `Boolean` - `false` if dispatcher is busy and further dispatch calls wo * **blocking** `boolean` (optional) - Default: `false` - Whether the response is expected to take a long time and would end up blocking the pipeline. When this is set to `true` further pipelining will be avoided on the same connection until headers have been received. * **upgrade** `string | null` (optional) - Default: `null` - Upgrade the request. Should be used to specify the kind of upgrade i.e. `'Websocket'`. * **bodyTimeout** `number | null` (optional) - The timeout after which a request will time out, in milliseconds. Monitors time between receiving body data. Use `0` to disable it entirely. Defaults to 300 seconds. 
-* **headersTimeout** `number | null` (optional) - The amount of time the parser will wait to receive the complete HTTP headers while not sending the request. Defaults to 300 seconds. +* **headersTimeout** `number | null` (optional) - The amount of time, in milliseconds, the parser will wait to receive the complete HTTP headers while not sending the request. Defaults to 300 seconds. * **throwOnError** `boolean` (optional) - Default: `false` - Whether Undici should throw an error upon receiving a 4xx or 5xx response from the server. +* **expectContinue** `boolean` (optional) - Default: `false` - For H2, it appends the expect: 100-continue header, and halts the request body until a 100-continue is received from the remote server #### Parameter: `DispatchHandler` diff --git a/deps/undici/src/docs/api/MockPool.md b/deps/undici/src/docs/api/MockPool.md index 923c157aa64657..de53914002eca3 100644 --- a/deps/undici/src/docs/api/MockPool.md +++ b/deps/undici/src/docs/api/MockPool.md @@ -35,7 +35,8 @@ const mockPool = mockAgent.get('http://localhost:3000') ### `MockPool.intercept(options)` -This method defines the interception rules for matching against requests for a MockPool or MockPool. We can intercept multiple times on a single instance. +This method defines the interception rules for matching against requests for a MockPool or MockPool. We can intercept multiple times on a single instance, but each intercept is only used once. +For example if you expect to make 2 requests inside a test, you need to call `intercept()` twice. Assuming you use `disableNetConnect()` you will get `MockNotMatchedError` on the second request when you only call `intercept()` once. When defining interception rules, all the rules must pass for a request to be intercepted. If a request is not intercepted, a real request will be attempted. 
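A minimal sketch of the rule above — two matching requests need two `intercept()` calls (the origin and path are illustrative):

```js
import { MockAgent, setGlobalDispatcher, request } from 'undici'

const mockAgent = new MockAgent()
mockAgent.disableNetConnect()
setGlobalDispatcher(mockAgent)

const mockPool = mockAgent.get('http://localhost:3000')

// Each intercept is consumed by exactly one matching request...
mockPool.intercept({ path: '/foo', method: 'GET' }).reply(200, 'ok')
// ...so the second request needs its own intercept.
mockPool.intercept({ path: '/foo', method: 'GET' }).reply(200, 'ok')

await request('http://localhost:3000/foo')
await request('http://localhost:3000/foo') // MockNotMatchedError without the second intercept
```

If the same intercept should serve several requests, `.times(n)` or `.persist()` on the returned scope keeps it active instead of repeating the call.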
diff --git a/deps/undici/src/index-fetch.js b/deps/undici/src/index-fetch.js index 0d59d254f7d548..23ac530600769e 100644 --- a/deps/undici/src/index-fetch.js +++ b/deps/undici/src/index-fetch.js @@ -2,9 +2,9 @@ const fetchImpl = require('./lib/fetch').fetch -module.exports.fetch = async function fetch (resource) { +module.exports.fetch = async function fetch (resource, init = undefined) { try { - return await fetchImpl(...arguments) + return await fetchImpl(resource, init) } catch (err) { Error.captureStackTrace(err, this) throw err @@ -14,3 +14,4 @@ module.exports.FormData = require('./lib/fetch/formdata').FormData module.exports.Headers = require('./lib/fetch/headers').Headers module.exports.Response = require('./lib/fetch/response').Response module.exports.Request = require('./lib/fetch/request').Request +module.exports.WebSocket = require('./lib/websocket/websocket').WebSocket diff --git a/deps/undici/src/index.d.ts b/deps/undici/src/index.d.ts index 0730677b29e419..83a786d6a03131 100644 --- a/deps/undici/src/index.d.ts +++ b/deps/undici/src/index.d.ts @@ -1,57 +1,3 @@ -import Dispatcher from'./types/dispatcher' -import { setGlobalDispatcher, getGlobalDispatcher } from './types/global-dispatcher' -import { setGlobalOrigin, getGlobalOrigin } from './types/global-origin' -import Pool from'./types/pool' -import { RedirectHandler, DecoratorHandler } from './types/handlers' - -import BalancedPool from './types/balanced-pool' -import Client from'./types/client' -import buildConnector from'./types/connector' -import errors from'./types/errors' -import Agent from'./types/agent' -import MockClient from'./types/mock-client' -import MockPool from'./types/mock-pool' -import MockAgent from'./types/mock-agent' -import mockErrors from'./types/mock-errors' -import ProxyAgent from'./types/proxy-agent' -import { request, pipeline, stream, connect, upgrade } from './types/api' - -export * from './types/cookies' -export * from './types/fetch' -export * from './types/file' -export * from './types/filereader' -export * from './types/formdata' -export * from './types/diagnostics-channel' -export * from './types/websocket' -export * from './types/content-type' -export * from './types/cache' -export { Interceptable } from './types/mock-interceptor' - -export { Dispatcher, BalancedPool, Pool, Client, buildConnector, errors, Agent, request, stream, pipeline, connect, upgrade, setGlobalDispatcher, getGlobalDispatcher, setGlobalOrigin, getGlobalOrigin, MockClient, MockPool, MockAgent, mockErrors, ProxyAgent, RedirectHandler, DecoratorHandler } +export * from './types/index' +import Undici from './types/index' export default Undici - -declare namespace Undici { - var Dispatcher: typeof import('./types/dispatcher').default - var Pool: typeof import('./types/pool').default; - var RedirectHandler: typeof import ('./types/handlers').RedirectHandler - var DecoratorHandler: typeof import ('./types/handlers').DecoratorHandler - var createRedirectInterceptor: typeof import ('./types/interceptors').createRedirectInterceptor - var BalancedPool: typeof import('./types/balanced-pool').default; - var Client: typeof import('./types/client').default; - var buildConnector: typeof import('./types/connector').default; - var errors: typeof import('./types/errors').default; - var Agent: typeof import('./types/agent').default; - var setGlobalDispatcher: typeof import('./types/global-dispatcher').setGlobalDispatcher; - var getGlobalDispatcher: typeof import('./types/global-dispatcher').getGlobalDispatcher; - var request: typeof 
import('./types/api').request; - var stream: typeof import('./types/api').stream; - var pipeline: typeof import('./types/api').pipeline; - var connect: typeof import('./types/api').connect; - var upgrade: typeof import('./types/api').upgrade; - var MockClient: typeof import('./types/mock-client').default; - var MockPool: typeof import('./types/mock-pool').default; - var MockAgent: typeof import('./types/mock-agent').default; - var mockErrors: typeof import('./types/mock-errors').default; - var fetch: typeof import('./types/fetch').fetch; - var caches: typeof import('./types/cache').caches; -} diff --git a/deps/undici/src/index.js b/deps/undici/src/index.js index 7e8831ceeea3ea..7c0c8adcd6c809 100644 --- a/deps/undici/src/index.js +++ b/deps/undici/src/index.js @@ -106,7 +106,10 @@ if (util.nodeMajor > 16 || (util.nodeMajor === 16 && util.nodeMinor >= 8)) { try { return await fetchImpl(...arguments) } catch (err) { - Error.captureStackTrace(err, this) + if (typeof err === 'object') { + Error.captureStackTrace(err, this) + } + throw err } } diff --git a/deps/undici/src/lib/api/api-connect.js b/deps/undici/src/lib/api/api-connect.js index 0503b1a2f0eb10..fd2b6ad97a52d5 100644 --- a/deps/undici/src/lib/api/api-connect.js +++ b/deps/undici/src/lib/api/api-connect.js @@ -1,7 +1,7 @@ 'use strict' -const { InvalidArgumentError, RequestAbortedError, SocketError } = require('../core/errors') const { AsyncResource } = require('async_hooks') +const { InvalidArgumentError, RequestAbortedError, SocketError } = require('../core/errors') const util = require('../core/util') const { addSignal, removeSignal } = require('./abort-signal') @@ -50,7 +50,13 @@ class ConnectHandler extends AsyncResource { removeSignal(this) this.callback = null - const headers = this.responseHeaders === 'raw' ? util.parseRawHeaders(rawHeaders) : util.parseHeaders(rawHeaders) + + let headers = rawHeaders + // Indicates is an HTTP2Session + if (headers != null) { + headers = this.responseHeaders === 'raw' ? 
util.parseRawHeaders(rawHeaders) : util.parseHeaders(rawHeaders) + } + this.runInAsyncScope(callback, null, null, { statusCode, headers, diff --git a/deps/undici/src/lib/api/api-request.js b/deps/undici/src/lib/api/api-request.js index 71d7e926b4c395..f130ecc9867a88 100644 --- a/deps/undici/src/lib/api/api-request.js +++ b/deps/undici/src/lib/api/api-request.js @@ -95,7 +95,6 @@ class RequestHandler extends AsyncResource { this.callback = null this.res = body - if (callback !== null) { if (this.throwOnError && statusCode >= 400) { this.runInAsyncScope(getResolveErrorBodyCallback, null, diff --git a/deps/undici/src/lib/cache/cache.js b/deps/undici/src/lib/cache/cache.js index 18f06a348a0a88..9b3110860cd6b8 100644 --- a/deps/undici/src/lib/cache/cache.js +++ b/deps/undici/src/lib/cache/cache.js @@ -379,11 +379,7 @@ class Cache { const reader = stream.getReader() // 11.3 - readAllBytes( - reader, - (bytes) => bodyReadPromise.resolve(bytes), - (error) => bodyReadPromise.reject(error) - ) + readAllBytes(reader).then(bodyReadPromise.resolve, bodyReadPromise.reject) } else { bodyReadPromise.resolve(undefined) } diff --git a/deps/undici/src/lib/client.js b/deps/undici/src/lib/client.js index 7d9ec8d7c272b3..b5170d4f88da9b 100644 --- a/deps/undici/src/lib/client.js +++ b/deps/undici/src/lib/client.js @@ -6,6 +6,7 @@ const assert = require('assert') const net = require('net') +const { pipeline } = require('stream') const util = require('./core/util') const timers = require('./timers') const Request = require('./core/request') @@ -67,8 +68,40 @@ const { kDispatch, kInterceptors, kLocalAddress, - kMaxResponseSize + kMaxResponseSize, + kHTTPConnVersion, + // HTTP2 + kHost, + kHTTP2Session, + kHTTP2SessionState, + kHTTP2BuildRequest, + kHTTP2CopyHeaders, + kHTTP1BuildRequest } = require('./core/symbols') + +/** @type {import('http2')} */ +let http2 +try { + http2 = require('http2') +} catch { + // @ts-ignore + http2 = { constants: {} } +} + +const { + constants: { + HTTP2_HEADER_AUTHORITY, + HTTP2_HEADER_METHOD, + HTTP2_HEADER_PATH, + HTTP2_HEADER_CONTENT_LENGTH, + HTTP2_HEADER_EXPECT, + HTTP2_HEADER_STATUS + } +} = http2 + +// Experimental +let h2ExperimentalWarned = false + const FastBuffer = Buffer[Symbol.species] const kClosedResolve = Symbol('kClosedResolve') @@ -122,7 +155,10 @@ class Client extends DispatcherBase { localAddress, maxResponseSize, autoSelectFamily, - autoSelectFamilyAttemptTimeout + autoSelectFamilyAttemptTimeout, + // h2 + allowH2, + maxConcurrentStreams } = {}) { super() @@ -205,10 +241,20 @@ class Client extends DispatcherBase { throw new InvalidArgumentError('autoSelectFamilyAttemptTimeout must be a positive number') } + // h2 + if (allowH2 != null && typeof allowH2 !== 'boolean') { + throw new InvalidArgumentError('allowH2 must be a valid boolean value') + } + + if (maxConcurrentStreams != null && (typeof maxConcurrentStreams !== 'number' || maxConcurrentStreams < 1)) { + throw new InvalidArgumentError('maxConcurrentStreams must be a possitive integer, greater than 0') + } + if (typeof connect !== 'function') { connect = buildConnector({ ...tls, maxCachedSessions, + allowH2, socketPath, timeout: connectTimeout, ...(util.nodeHasAutoSelectFamily && autoSelectFamily ? { autoSelectFamily, autoSelectFamilyAttemptTimeout } : undefined), @@ -240,6 +286,18 @@ class Client extends DispatcherBase { this[kMaxRequests] = maxRequestsPerClient this[kClosedResolve] = null this[kMaxResponseSize] = maxResponseSize > -1 ? 
maxResponseSize : -1 + this[kHTTPConnVersion] = 'h1' + + // HTTP/2 + this[kHTTP2Session] = null + this[kHTTP2SessionState] = !allowH2 + ? null + : { + // streams: null, // Fixed queue of streams - For future support of `push` + openStreams: 0, // Keep track of them to decide wether or not unref the session + maxConcurrentStreams: maxConcurrentStreams != null ? maxConcurrentStreams : 100 // Max peerConcurrentStreams for a Node h2 server + } + this[kHost] = `${this[kUrl].hostname}${this[kUrl].port ? `:${this[kUrl].port}` : ''}` // kQueue is built up of 3 sections separated by // the kRunningIdx and kPendingIdx indices. @@ -298,7 +356,9 @@ class Client extends DispatcherBase { [kDispatch] (opts, handler) { const origin = opts.origin || this[kUrl].origin - const request = new Request(origin, opts, handler) + const request = this[kHTTPConnVersion] === 'h2' + ? Request[kHTTP2BuildRequest](origin, opts, handler) + : Request[kHTTP1BuildRequest](origin, opts, handler) this[kQueue].push(request) if (this[kResuming]) { @@ -319,6 +379,8 @@ class Client extends DispatcherBase { } async [kClose] () { + // TODO: for H2 we need to gracefully flush the remaining enqueued + // request and close each stream. return new Promise((resolve) => { if (!this[kSize]) { resolve(null) @@ -345,6 +407,12 @@ class Client extends DispatcherBase { resolve() } + if (this[kHTTP2Session] != null) { + util.destroy(this[kHTTP2Session], err) + this[kHTTP2Session] = null + this[kHTTP2SessionState] = null + } + if (!this[kSocket]) { queueMicrotask(callback) } else { @@ -356,6 +424,64 @@ class Client extends DispatcherBase { } } +function onHttp2SessionError (err) { + assert(err.code !== 'ERR_TLS_CERT_ALTNAME_INVALID') + + this[kSocket][kError] = err + + onError(this[kClient], err) +} + +function onHttp2FrameError (type, code, id) { + const err = new InformationalError(`HTTP/2: "frameError" received - type ${type}, code ${code}`) + + if (id === 0) { + this[kSocket][kError] = err + onError(this[kClient], err) + } +} + +function onHttp2SessionEnd () { + util.destroy(this, new SocketError('other side closed')) + util.destroy(this[kSocket], new SocketError('other side closed')) +} + +function onHTTP2GoAway (code) { + const client = this[kClient] + const err = new InformationalError(`HTTP/2: "GOAWAY" frame received with code ${code}`) + client[kSocket] = null + client[kHTTP2Session] = null + + if (client.destroyed) { + assert(this[kPending] === 0) + + // Fail entire queue. + const requests = client[kQueue].splice(client[kRunningIdx]) + for (let i = 0; i < requests.length; i++) { + const request = requests[i] + errorRequest(this, request, err) + } + } else if (client[kRunning] > 0) { + // Fail head of pipeline. + const request = client[kQueue][client[kRunningIdx]] + client[kQueue][client[kRunningIdx]++] = null + + errorRequest(client, request, err) + } + + client[kPendingIdx] = client[kRunningIdx] + + assert(client[kRunning] === 0) + + client.emit('disconnect', + client[kUrl], + [client], + err + ) + + resume(client) +} + const constants = require('./llhttp/constants') const createRedirectInterceptor = require('./interceptor/redirectInterceptor') const EMPTY_BUF = Buffer.alloc(0) @@ -946,16 +1072,18 @@ function onSocketReadable () { } function onSocketError (err) { - const { [kParser]: parser } = this + const { [kClient]: client, [kParser]: parser } = this assert(err.code !== 'ERR_TLS_CERT_ALTNAME_INVALID') - // On Mac OS, we get an ECONNRESET even if there is a full body to be forwarded - // to the user. 
- if (err.code === 'ECONNRESET' && parser.statusCode && !parser.shouldKeepAlive) { - // We treat all incoming data so for as a valid response. - parser.onMessageComplete() - return + if (client[kHTTPConnVersion] !== 'h2') { + // On Mac OS, we get an ECONNRESET even if there is a full body to be forwarded + // to the user. + if (err.code === 'ECONNRESET' && parser.statusCode && !parser.shouldKeepAlive) { + // We treat all incoming data so for as a valid response. + parser.onMessageComplete() + return + } } this[kError] = err @@ -984,27 +1112,31 @@ function onError (client, err) { } function onSocketEnd () { - const { [kParser]: parser } = this + const { [kParser]: parser, [kClient]: client } = this - if (parser.statusCode && !parser.shouldKeepAlive) { - // We treat all incoming data so far as a valid response. - parser.onMessageComplete() - return + if (client[kHTTPConnVersion] !== 'h2') { + if (parser.statusCode && !parser.shouldKeepAlive) { + // We treat all incoming data so far as a valid response. + parser.onMessageComplete() + return + } } util.destroy(this, new SocketError('other side closed', util.getSocketInfo(this))) } function onSocketClose () { - const { [kClient]: client } = this + const { [kClient]: client, [kParser]: parser } = this - if (!this[kError] && this[kParser].statusCode && !this[kParser].shouldKeepAlive) { - // We treat all incoming data so far as a valid response. - this[kParser].onMessageComplete() - } + if (client[kHTTPConnVersion] === 'h1' && parser) { + if (!this[kError] && parser.statusCode && !parser.shouldKeepAlive) { + // We treat all incoming data so far as a valid response. + parser.onMessageComplete() + } - this[kParser].destroy() - this[kParser] = null + this[kParser].destroy() + this[kParser] = null + } const err = this[kError] || new SocketError('closed', util.getSocketInfo(this)) @@ -1092,24 +1224,54 @@ async function connect (client) { return } - if (!llhttpInstance) { - llhttpInstance = await llhttpPromise - llhttpPromise = null - } - client[kConnecting] = false assert(socket) - socket[kNoRef] = false - socket[kWriting] = false - socket[kReset] = false - socket[kBlocking] = false - socket[kError] = null - socket[kParser] = new Parser(client, socket, llhttpInstance) - socket[kClient] = client + const isH2 = socket.alpnProtocol === 'h2' + if (isH2) { + if (!h2ExperimentalWarned) { + h2ExperimentalWarned = true + process.emitWarning('H2 support is experimental, expect them to change at any time.', { + code: 'UNDICI-H2' + }) + } + + const session = http2.connect(client[kUrl], { + createConnection: () => socket, + peerMaxConcurrentStreams: client[kHTTP2SessionState].maxConcurrentStreams + }) + + client[kHTTPConnVersion] = 'h2' + session[kClient] = client + session[kSocket] = socket + session.on('error', onHttp2SessionError) + session.on('frameError', onHttp2FrameError) + session.on('end', onHttp2SessionEnd) + session.on('goaway', onHTTP2GoAway) + session.on('close', onSocketClose) + session.unref() + + client[kHTTP2Session] = session + socket[kHTTP2Session] = session + } else { + if (!llhttpInstance) { + llhttpInstance = await llhttpPromise + llhttpPromise = null + } + + socket[kNoRef] = false + socket[kWriting] = false + socket[kReset] = false + socket[kBlocking] = false + socket[kParser] = new Parser(client, socket, llhttpInstance) + } + socket[kCounter] = 0 socket[kMaxRequests] = client[kMaxRequests] + socket[kClient] = client + socket[kError] = null + socket .on('error', onSocketError) .on('readable', onSocketReadable) @@ -1208,7 +1370,7 @@ function 
_resume (client, sync) { const socket = client[kSocket] - if (socket && !socket.destroyed) { + if (socket && !socket.destroyed && socket.alpnProtocol !== 'h2') { if (client[kSize] === 0) { if (!socket[kNoRef] && socket.unref) { socket.unref() @@ -1273,7 +1435,7 @@ function _resume (client, sync) { return } - if (!socket) { + if (!socket && !client[kHTTP2Session]) { connect(client) return } @@ -1334,6 +1496,11 @@ function _resume (client, sync) { } function write (client, request) { + if (client[kHTTPConnVersion] === 'h2') { + writeH2(client, client[kHTTP2Session], request) + return + } + const { body, method, path, host, upgrade, headers, blocking, reset } = request // https://tools.ietf.org/html/rfc7231#section-4.3.1 @@ -1489,9 +1656,291 @@ function write (client, request) { return true } -function writeStream ({ body, client, request, socket, contentLength, header, expectsPayload }) { +function writeH2 (client, session, request) { + const { body, method, path, host, upgrade, expectContinue, signal, headers: reqHeaders } = request + + let headers + if (typeof reqHeaders === 'string') headers = Request[kHTTP2CopyHeaders](reqHeaders.trim()) + else headers = reqHeaders + + if (upgrade) { + errorRequest(client, request, new Error('Upgrade not supported for H2')) + return false + } + + try { + // TODO(HTTP/2): Should we call onConnect immediately or on stream ready event? + request.onConnect((err) => { + if (request.aborted || request.completed) { + return + } + + errorRequest(client, request, err || new RequestAbortedError()) + }) + } catch (err) { + errorRequest(client, request, err) + } + + if (request.aborted) { + return false + } + + let stream + const h2State = client[kHTTP2SessionState] + + headers[HTTP2_HEADER_AUTHORITY] = host || client[kHost] + headers[HTTP2_HEADER_PATH] = path + + if (method === 'CONNECT') { + session.ref() + // we are already connected, streams are pending, first request + // will create a new stream. We trigger a request to create the stream and wait until + // `ready` event is triggered + // We disabled endStream to allow the user to write to the stream + stream = session.request(headers, { endStream: false, signal }) + + if (stream.id && !stream.pending) { + request.onUpgrade(null, null, stream) + ++h2State.openStreams + } else { + stream.once('ready', () => { + request.onUpgrade(null, null, stream) + ++h2State.openStreams + }) + } + + stream.once('close', () => { + h2State.openStreams -= 1 + // TODO(HTTP/2): unref only if current streams count is 0 + if (h2State.openStreams === 0) session.unref() + }) + + return true + } else { + headers[HTTP2_HEADER_METHOD] = method + } + + // https://tools.ietf.org/html/rfc7231#section-4.3.1 + // https://tools.ietf.org/html/rfc7231#section-4.3.2 + // https://tools.ietf.org/html/rfc7231#section-4.3.5 + + // Sending a payload body on a request that does not + // expect it can cause undefined behavior on some + // servers and corrupt connection state. Do not + // re-use the connection for further requests. + + const expectsPayload = ( + method === 'PUT' || + method === 'POST' || + method === 'PATCH' + ) + + if (body && typeof body.read === 'function') { + // Try to read EOF in order to get length. 
+ body.read(0) + } + + let contentLength = util.bodyLength(body) + + if (contentLength == null) { + contentLength = request.contentLength + } + + if (contentLength === 0 || !expectsPayload) { + // https://tools.ietf.org/html/rfc7230#section-3.3.2 + // A user agent SHOULD NOT send a Content-Length header field when + // the request message does not contain a payload body and the method + // semantics do not anticipate such a body. + + contentLength = null + } + + if (request.contentLength != null && request.contentLength !== contentLength) { + if (client[kStrictContentLength]) { + errorRequest(client, request, new RequestContentLengthMismatchError()) + return false + } + + process.emitWarning(new RequestContentLengthMismatchError()) + } + + if (contentLength != null) { + assert(body, 'no body must not have content length') + headers[HTTP2_HEADER_CONTENT_LENGTH] = `${contentLength}` + } + + session.ref() + + const shouldEndStream = method === 'GET' || method === 'HEAD' + if (expectContinue) { + headers[HTTP2_HEADER_EXPECT] = '100-continue' + /** + * @type {import('node:http2').ClientHttp2Stream} + */ + stream = session.request(headers, { endStream: shouldEndStream, signal }) + + stream.once('continue', writeBodyH2) + } else { + /** @type {import('node:http2').ClientHttp2Stream} */ + stream = session.request(headers, { + endStream: shouldEndStream, + signal + }) + writeBodyH2() + } + + // Increment counter as we have new several streams open + ++h2State.openStreams + + stream.once('response', headers => { + if (request.onHeaders(Number(headers[HTTP2_HEADER_STATUS]), headers, stream.resume.bind(stream), '') === false) { + stream.pause() + } + }) + + stream.once('end', () => { + request.onComplete([]) + }) + + stream.on('data', (chunk) => { + if (request.onData(chunk) === false) stream.pause() + }) + + stream.once('close', () => { + h2State.openStreams -= 1 + // TODO(HTTP/2): unref only if current streams count is 0 + if (h2State.openStreams === 0) session.unref() + }) + + stream.once('error', function (err) { + if (client[kHTTP2Session] && !client[kHTTP2Session].destroyed && !this.closed && !this.destroyed) { + h2State.streams -= 1 + util.destroy(stream, err) + } + }) + + stream.once('frameError', (type, code) => { + const err = new InformationalError(`HTTP/2: "frameError" received - type ${type}, code ${code}`) + errorRequest(client, request, err) + + if (client[kHTTP2Session] && !client[kHTTP2Session].destroyed && !this.closed && !this.destroyed) { + h2State.streams -= 1 + util.destroy(stream, err) + } + }) + + // stream.on('aborted', () => { + // // TODO(HTTP/2): Support aborted + // }) + + // stream.on('timeout', () => { + // // TODO(HTTP/2): Support timeout + // }) + + // stream.on('push', headers => { + // // TODO(HTTP/2): Suppor push + // }) + + // stream.on('trailers', headers => { + // // TODO(HTTP/2): Support trailers + // }) + + return true + + function writeBodyH2 () { + /* istanbul ignore else: assertion */ + if (!body) { + request.onRequestSent() + } else if (util.isBuffer(body)) { + assert(contentLength === body.byteLength, 'buffer body must have content length') + stream.cork() + stream.write(body) + stream.uncork() + request.onBodySent(body) + request.onRequestSent() + } else if (util.isBlobLike(body)) { + if (typeof body.stream === 'function') { + writeIterable({ + client, + request, + contentLength, + h2stream: stream, + expectsPayload, + body: body.stream(), + socket: client[kSocket], + header: '' + }) + } else { + writeBlob({ + body, + client, + request, + contentLength, + 
expectsPayload, + h2stream: stream, + header: '', + socket: client[kSocket] + }) + } + } else if (util.isStream(body)) { + writeStream({ + body, + client, + request, + contentLength, + expectsPayload, + socket: client[kSocket], + h2stream: stream, + header: '' + }) + } else if (util.isIterable(body)) { + writeIterable({ + body, + client, + request, + contentLength, + expectsPayload, + header: '', + h2stream: stream, + socket: client[kSocket] + }) + } else { + assert(false) + } + } +} + +function writeStream ({ h2stream, body, client, request, socket, contentLength, header, expectsPayload }) { assert(contentLength !== 0 || client[kRunning] === 0, 'stream body cannot be pipelined') + if (client[kHTTPConnVersion] === 'h2') { + // For HTTP/2, is enough to pipe the stream + const pipe = pipeline( + body, + h2stream, + (err) => { + if (err) { + util.destroy(body, err) + util.destroy(h2stream, err) + } else { + request.onRequestSent() + } + } + ) + + pipe.on('data', onPipeData) + pipe.once('end', () => { + pipe.removeListener('data', onPipeData) + util.destroy(pipe) + }) + + function onPipeData (chunk) { + request.onBodySent(chunk) + } + + return + } + let finished = false const writer = new AsyncWriter({ socket, request, contentLength, client, expectsPayload, header }) @@ -1572,9 +2021,10 @@ function writeStream ({ body, client, request, socket, contentLength, header, ex .on('error', onFinished) } -async function writeBlob ({ body, client, request, socket, contentLength, header, expectsPayload }) { +async function writeBlob ({ h2stream, body, client, request, socket, contentLength, header, expectsPayload }) { assert(contentLength === body.size, 'blob body must have content length') + const isH2 = client[kHTTPConnVersion] === 'h2' try { if (contentLength != null && contentLength !== body.size) { throw new RequestContentLengthMismatchError() @@ -1582,10 +2032,16 @@ async function writeBlob ({ body, client, request, socket, contentLength, header const buffer = Buffer.from(await body.arrayBuffer()) - socket.cork() - socket.write(`${header}content-length: ${contentLength}\r\n\r\n`, 'latin1') - socket.write(buffer) - socket.uncork() + if (isH2) { + h2stream.cork() + h2stream.write(buffer) + h2stream.uncork() + } else { + socket.cork() + socket.write(`${header}content-length: ${contentLength}\r\n\r\n`, 'latin1') + socket.write(buffer) + socket.uncork() + } request.onBodySent(buffer) request.onRequestSent() @@ -1596,11 +2052,11 @@ async function writeBlob ({ body, client, request, socket, contentLength, header resume(client) } catch (err) { - util.destroy(socket, err) + util.destroy(isH2 ? h2stream : socket, err) } } -async function writeIterable ({ body, client, request, socket, contentLength, header, expectsPayload }) { +async function writeIterable ({ h2stream, body, client, request, socket, contentLength, header, expectsPayload }) { assert(contentLength !== 0 || client[kRunning] === 0, 'iterator body cannot be pipelined') let callback = null @@ -1622,6 +2078,33 @@ async function writeIterable ({ body, client, request, socket, contentLength, he } }) + if (client[kHTTPConnVersion] === 'h2') { + h2stream + .on('close', onDrain) + .on('drain', onDrain) + + try { + // It's up to the user to somehow abort the async iterable. 
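+          // Backpressure: h2stream.write() returns false once the HTTP/2
+          // stream's internal buffer is full, so the loop below suspends on
+          // waitForDrain() until a 'drain' (or 'close') event fires.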
+ for await (const chunk of body) { + if (socket[kError]) { + throw socket[kError] + } + + if (!h2stream.write(chunk)) { + await waitForDrain() + } + } + } catch (err) { + h2stream.destroy(err) + } finally { + h2stream + .off('close', onDrain) + .off('drain', onDrain) + } + + return + } + socket .on('close', onDrain) .on('drain', onDrain) diff --git a/deps/undici/src/lib/core/connect.js b/deps/undici/src/lib/core/connect.js index f3b5cc33edd6cf..bb71085a1565fc 100644 --- a/deps/undici/src/lib/core/connect.js +++ b/deps/undici/src/lib/core/connect.js @@ -71,7 +71,7 @@ if (global.FinalizationRegistry) { } } -function buildConnector ({ maxCachedSessions, socketPath, timeout, ...opts }) { +function buildConnector ({ allowH2, maxCachedSessions, socketPath, timeout, ...opts }) { if (maxCachedSessions != null && (!Number.isInteger(maxCachedSessions) || maxCachedSessions < 0)) { throw new InvalidArgumentError('maxCachedSessions must be a positive integer or zero') } @@ -79,7 +79,7 @@ function buildConnector ({ maxCachedSessions, socketPath, timeout, ...opts }) { const options = { path: socketPath, ...opts } const sessionCache = new SessionCache(maxCachedSessions == null ? 100 : maxCachedSessions) timeout = timeout == null ? 10e3 : timeout - + allowH2 = allowH2 != null ? allowH2 : false return function connect ({ hostname, host, protocol, port, servername, localAddress, httpSocket }, callback) { let socket if (protocol === 'https:') { @@ -99,6 +99,8 @@ function buildConnector ({ maxCachedSessions, socketPath, timeout, ...opts }) { servername, session, localAddress, + // TODO(HTTP/2): Add support for h2c + ALPNProtocols: allowH2 ? ['http/1.1', 'h2'] : ['http/1.1'], socket: httpSocket, // upgrade socket connection port: port || 443, host: hostname diff --git a/deps/undici/src/lib/core/request.js b/deps/undici/src/lib/core/request.js index 6c9a24d5d590d7..e3b0c7b9dbf06c 100644 --- a/deps/undici/src/lib/core/request.js +++ b/deps/undici/src/lib/core/request.js @@ -5,6 +5,7 @@ const { NotSupportedError } = require('./errors') const assert = require('assert') +const { kHTTP2BuildRequest, kHTTP2CopyHeaders, kHTTP1BuildRequest } = require('./symbols') const util = require('./util') // tokenRegExp and headerCharRegex have been lifted from @@ -62,7 +63,8 @@ class Request { headersTimeout, bodyTimeout, reset, - throwOnError + throwOnError, + expectContinue }, handler) { if (typeof path !== 'string') { throw new InvalidArgumentError('path must be a string') @@ -98,6 +100,10 @@ class Request { throw new InvalidArgumentError('invalid reset') } + if (expectContinue != null && typeof expectContinue !== 'boolean') { + throw new InvalidArgumentError('invalid expectContinue') + } + this.headersTimeout = headersTimeout this.bodyTimeout = bodyTimeout @@ -150,6 +156,9 @@ class Request { this.headers = '' + // Only for H2 + this.expectContinue = expectContinue != null ? 
expectContinue : false + if (Array.isArray(headers)) { if (headers.length % 2 !== 0) { throw new InvalidArgumentError('headers array must be even') @@ -269,13 +278,64 @@ class Request { return this[kHandler].onError(error) } + // TODO: adjust to support H2 addHeader (key, value) { processHeader(this, key, value) return this } + + static [kHTTP1BuildRequest] (origin, opts, handler) { + // TODO: Migrate header parsing here, to make Requests + // HTTP agnostic + return new Request(origin, opts, handler) + } + + static [kHTTP2BuildRequest] (origin, opts, handler) { + const headers = opts.headers + opts = { ...opts, headers: null } + + const request = new Request(origin, opts, handler) + + request.headers = {} + + if (Array.isArray(headers)) { + if (headers.length % 2 !== 0) { + throw new InvalidArgumentError('headers array must be even') + } + for (let i = 0; i < headers.length; i += 2) { + processHeader(request, headers[i], headers[i + 1], true) + } + } else if (headers && typeof headers === 'object') { + const keys = Object.keys(headers) + for (let i = 0; i < keys.length; i++) { + const key = keys[i] + processHeader(request, key, headers[key], true) + } + } else if (headers != null) { + throw new InvalidArgumentError('headers must be an object or an array') + } + + return request + } + + static [kHTTP2CopyHeaders] (raw) { + const rawHeaders = raw.split('\r\n') + const headers = {} + + for (const header of rawHeaders) { + const [key, value] = header.split(': ') + + if (value == null || value.length === 0) continue + + if (headers[key]) headers[key] += `,${value}` + else headers[key] = value + } + + return headers + } } -function processHeaderValue (key, val) { +function processHeaderValue (key, val, skipAppend) { if (val && typeof val === 'object') { throw new InvalidArgumentError(`invalid ${key} header`) } @@ -286,10 +346,10 @@ function processHeaderValue (key, val) { throw new InvalidArgumentError(`invalid ${key} header`) } - return `${key}: ${val}\r\n` + return skipAppend ? 
val : `${key}: ${val}\r\n` } -function processHeader (request, key, val) { +function processHeader (request, key, val, skipAppend = false) { if (val && (typeof val === 'object' && !Array.isArray(val))) { throw new InvalidArgumentError(`invalid ${key} header`) } else if (val === undefined) { @@ -357,10 +417,16 @@ function processHeader (request, key, val) { } else { if (Array.isArray(val)) { for (let i = 0; i < val.length; i++) { - request.headers += processHeaderValue(key, val[i]) + if (skipAppend) { + if (request.headers[key]) request.headers[key] += `,${processHeaderValue(key, val[i], skipAppend)}` + else request.headers[key] = processHeaderValue(key, val[i], skipAppend) + } else { + request.headers += processHeaderValue(key, val[i]) + } } } else { - request.headers += processHeaderValue(key, val) + if (skipAppend) request.headers[key] = processHeaderValue(key, val, skipAppend) + else request.headers += processHeaderValue(key, val) } } } diff --git a/deps/undici/src/lib/core/symbols.js b/deps/undici/src/lib/core/symbols.js index c852107a72af26..c2492f4355fd2a 100644 --- a/deps/undici/src/lib/core/symbols.js +++ b/deps/undici/src/lib/core/symbols.js @@ -51,5 +51,11 @@ module.exports = { kProxy: Symbol('proxy agent options'), kCounter: Symbol('socket request counter'), kInterceptors: Symbol('dispatch interceptors'), - kMaxResponseSize: Symbol('max response size') + kMaxResponseSize: Symbol('max response size'), + kHTTP2Session: Symbol('http2Session'), + kHTTP2SessionState: Symbol('http2Session state'), + kHTTP2BuildRequest: Symbol('http2 build request'), + kHTTP1BuildRequest: Symbol('http1 build request'), + kHTTP2CopyHeaders: Symbol('http2 copy headers'), + kHTTPConnVersion: Symbol('http connection version') } diff --git a/deps/undici/src/lib/core/util.js b/deps/undici/src/lib/core/util.js index 4f8c1f8f1a1a4a..259ba7b38a64e9 100644 --- a/deps/undici/src/lib/core/util.js +++ b/deps/undici/src/lib/core/util.js @@ -168,7 +168,7 @@ function bodyLength (body) { return 0 } else if (isStream(body)) { const state = body._readableState - return state && state.ended === true && Number.isFinite(state.length) + return state && state.objectMode === false && state.ended === true && Number.isFinite(state.length) ? state.length : null } else if (isBlobLike(body)) { @@ -199,6 +199,7 @@ function destroy (stream, err) { // See: https://github.com/nodejs/node/pull/38505/files stream.socket = null } + stream.destroy(err) } else if (err) { process.nextTick((stream, err) => { @@ -218,6 +219,9 @@ function parseKeepAliveTimeout (val) { } function parseHeaders (headers, obj = {}) { + // For H2 support + if (!Array.isArray(headers)) return headers + for (let i = 0; i < headers.length; i += 2) { const key = headers[i].toString().toLowerCase() let val = obj[key] @@ -355,6 +359,12 @@ function getSocketInfo (socket) { } } +async function * convertIterableToBuffer (iterable) { + for await (const chunk of iterable) { + yield Buffer.isBuffer(chunk) ? 
chunk : Buffer.from(chunk) + } +} + let ReadableStream function ReadableStreamFrom (iterable) { if (!ReadableStream) { @@ -362,8 +372,7 @@ function ReadableStreamFrom (iterable) { } if (ReadableStream.from) { - // https://github.com/whatwg/streams/pull/1083 - return ReadableStream.from(iterable) + return ReadableStream.from(convertIterableToBuffer(iterable)) } let iterator diff --git a/deps/undici/src/lib/fetch/body.js b/deps/undici/src/lib/fetch/body.js index 0c7b8b65363f43..105eb553157b06 100644 --- a/deps/undici/src/lib/fetch/body.js +++ b/deps/undici/src/lib/fetch/body.js @@ -387,6 +387,7 @@ function bodyMixinMethods (instance) { try { busboy = Busboy({ headers, + preservePath: true, defParamCharset: 'utf8' }) } catch (err) { @@ -532,7 +533,7 @@ async function specConsumeBody (object, convertBytesToJSValue, instance) { // 6. Otherwise, fully read object’s body given successSteps, // errorSteps, and object’s relevant global object. - fullyReadBody(object[kState].body, successSteps, errorSteps) + await fullyReadBody(object[kState].body, successSteps, errorSteps) // 7. Return promise. return promise.promise diff --git a/deps/undici/src/lib/fetch/index.js b/deps/undici/src/lib/fetch/index.js index d615f07ea272d1..50f1b9f3fcdcc1 100644 --- a/deps/undici/src/lib/fetch/index.js +++ b/deps/undici/src/lib/fetch/index.js @@ -1760,7 +1760,7 @@ async function httpNetworkFetch ( fetchParams.controller.connection.destroy() // 2. Return the appropriate network error for fetchParams. - return makeAppropriateNetworkError(fetchParams) + return makeAppropriateNetworkError(fetchParams, err) } return makeNetworkError(err) @@ -1979,19 +1979,37 @@ async function httpNetworkFetch ( let location = '' const headers = new Headers() - for (let n = 0; n < headersList.length; n += 2) { - const key = headersList[n + 0].toString('latin1') - const val = headersList[n + 1].toString('latin1') - if (key.toLowerCase() === 'content-encoding') { - // https://www.rfc-editor.org/rfc/rfc7231#section-3.1.2.1 - // "All content-coding values are case-insensitive..." - codings = val.toLowerCase().split(',').map((x) => x.trim()).reverse() - } else if (key.toLowerCase() === 'location') { - location = val + // For H2, the headers are a plain JS object + // We distinguish between them and iterate accordingly + if (Array.isArray(headersList)) { + for (let n = 0; n < headersList.length; n += 2) { + const key = headersList[n + 0].toString('latin1') + const val = headersList[n + 1].toString('latin1') + if (key.toLowerCase() === 'content-encoding') { + // https://www.rfc-editor.org/rfc/rfc7231#section-3.1.2.1 + // "All content-coding values are case-insensitive..." + codings = val.toLowerCase().split(',').map((x) => x.trim()) + } else if (key.toLowerCase() === 'location') { + location = val + } + + headers.append(key, val) } + } else { + const keys = Object.keys(headersList) + for (const key of keys) { + const val = headersList[key] + if (key.toLowerCase() === 'content-encoding') { + // https://www.rfc-editor.org/rfc/rfc7231#section-3.1.2.1 + // "All content-coding values are case-insensitive..." 
+ codings = val.toLowerCase().split(',').map((x) => x.trim()).reverse() + } else if (key.toLowerCase() === 'location') { + location = val + } - headers.append(key, val) + headers.append(key, val) + } } this.body = new Readable({ read: resume }) diff --git a/deps/undici/src/lib/fetch/response.js b/deps/undici/src/lib/fetch/response.js index 1029dbef53371f..88deb71a06285e 100644 --- a/deps/undici/src/lib/fetch/response.js +++ b/deps/undici/src/lib/fetch/response.js @@ -49,7 +49,7 @@ class Response { } // https://fetch.spec.whatwg.org/#dom-response-json - static json (data = undefined, init = {}) { + static json (data, init = {}) { webidl.argumentLengthCheck(arguments, 1, { header: 'Response.json' }) if (init !== null) { @@ -426,15 +426,15 @@ function filterResponse (response, type) { } // https://fetch.spec.whatwg.org/#appropriate-network-error -function makeAppropriateNetworkError (fetchParams) { +function makeAppropriateNetworkError (fetchParams, err = null) { // 1. Assert: fetchParams is canceled. assert(isCancelled(fetchParams)) // 2. Return an aborted network error if fetchParams is aborted; // otherwise return a network error. return isAborted(fetchParams) - ? makeNetworkError(new DOMException('The operation was aborted.', 'AbortError')) - : makeNetworkError('Request was cancelled.') + ? makeNetworkError(Object.assign(new DOMException('The operation was aborted.', 'AbortError'), { cause: err })) + : makeNetworkError(Object.assign(new DOMException('Request was cancelled.'), { cause: err })) } // https://whatpr.org/fetch/1392.html#initialize-a-response diff --git a/deps/undici/src/lib/fetch/util.js b/deps/undici/src/lib/fetch/util.js index 400687ba2e7d23..fcbba84bc9a8b0 100644 --- a/deps/undici/src/lib/fetch/util.js +++ b/deps/undici/src/lib/fetch/util.js @@ -556,16 +556,37 @@ function bytesMatch (bytes, metadataList) { const algorithm = item.algo // 2. Let expectedValue be the val component of item. - const expectedValue = item.hash + let expectedValue = item.hash + + // See https://github.com/web-platform-tests/wpt/commit/e4c5cc7a5e48093220528dfdd1c4012dc3837a0e + // "be liberal with padding". This is annoying, and it's not even in the spec. + + if (expectedValue.endsWith('==')) { + expectedValue = expectedValue.slice(0, -2) + } // 3. Let actualValue be the result of applying algorithm to bytes. - const actualValue = crypto.createHash(algorithm).update(bytes).digest('base64') + let actualValue = crypto.createHash(algorithm).update(bytes).digest('base64') + + if (actualValue.endsWith('==')) { + actualValue = actualValue.slice(0, -2) + } // 4. If actualValue is a case-sensitive match for expectedValue, // return true. if (actualValue === expectedValue) { return true } + + let actualBase64URL = crypto.createHash(algorithm).update(bytes).digest('base64url') + + if (actualBase64URL.endsWith('==')) { + actualBase64URL = actualBase64URL.slice(0, -2) + } + + if (actualBase64URL === expectedValue) { + return true + } } // 6. Return false. @@ -812,17 +833,17 @@ function iteratorResult (pair, kind) { /** * @see https://fetch.spec.whatwg.org/#body-fully-read */ -function fullyReadBody (body, processBody, processBodyError) { +async function fullyReadBody (body, processBody, processBodyError) { // 1. If taskDestination is null, then set taskDestination to // the result of starting a new parallel queue. // 2. Let successSteps given a byte sequence bytes be to queue a // fetch task to run processBody given bytes, with taskDestination. 
- const successSteps = (bytes) => queueMicrotask(() => processBody(bytes)) + const successSteps = processBody // 3. Let errorSteps be to queue a fetch task to run processBodyError, // with taskDestination. - const errorSteps = (error) => queueMicrotask(() => processBodyError(error)) + const errorSteps = processBodyError // 4. Let reader be the result of getting a reader for body’s stream. // If that threw an exception, then run errorSteps with that @@ -837,7 +858,12 @@ function fullyReadBody (body, processBody, processBodyError) { } // 5. Read all bytes from reader, given successSteps and errorSteps. - readAllBytes(reader, successSteps, errorSteps) + try { + const result = await readAllBytes(reader) + successSteps(result) + } catch (e) { + errorSteps(e) + } } /** @type {ReadableStream} */ @@ -906,36 +932,23 @@ function isomorphicEncode (input) { * @see https://streams.spec.whatwg.org/#readablestreamdefaultreader-read-all-bytes * @see https://streams.spec.whatwg.org/#read-loop * @param {ReadableStreamDefaultReader} reader - * @param {(bytes: Uint8Array) => void} successSteps - * @param {(error: Error) => void} failureSteps */ -async function readAllBytes (reader, successSteps, failureSteps) { +async function readAllBytes (reader) { const bytes = [] let byteLength = 0 while (true) { - let done - let chunk - - try { - ({ done, value: chunk } = await reader.read()) - } catch (e) { - // 1. Call failureSteps with e. - failureSteps(e) - return - } + const { done, value: chunk } = await reader.read() if (done) { // 1. Call successSteps with bytes. - successSteps(Buffer.concat(bytes, byteLength)) - return + return Buffer.concat(bytes, byteLength) } // 1. If chunk is not a Uint8Array object, call failureSteps // with a TypeError and abort these steps. if (!isUint8Array(chunk)) { - failureSteps(new TypeError('Received non-Uint8Array chunk')) - return + throw new TypeError('Received non-Uint8Array chunk') } // 2. Append the bytes represented by chunk to bytes. diff --git a/deps/undici/src/lib/pool.js b/deps/undici/src/lib/pool.js index 93b3158f21a131..08509958069a4f 100644 --- a/deps/undici/src/lib/pool.js +++ b/deps/undici/src/lib/pool.js @@ -34,6 +34,7 @@ class Pool extends PoolBase { socketPath, autoSelectFamily, autoSelectFamilyAttemptTimeout, + allowH2, ...options } = {}) { super() @@ -54,6 +55,7 @@ class Pool extends PoolBase { connect = buildConnector({ ...tls, maxCachedSessions, + allowH2, socketPath, timeout: connectTimeout == null ? 10e3 : connectTimeout, ...(util.nodeHasAutoSelectFamily && autoSelectFamily ? { autoSelectFamily, autoSelectFamilyAttemptTimeout } : undefined), @@ -66,7 +68,7 @@ class Pool extends PoolBase { : [] this[kConnections] = connections || null this[kUrl] = util.parseOrigin(origin) - this[kOptions] = { ...util.deepClone(options), connect } + this[kOptions] = { ...util.deepClone(options), connect, allowH2 } this[kOptions].interceptors = options.interceptors ? 
{ ...options.interceptors } : undefined diff --git a/deps/undici/src/lib/websocket/connection.js b/deps/undici/src/lib/websocket/connection.js index 8c821899f6553e..e0fa69726b4054 100644 --- a/deps/undici/src/lib/websocket/connection.js +++ b/deps/undici/src/lib/websocket/connection.js @@ -1,6 +1,5 @@ 'use strict' -const { randomBytes, createHash } = require('crypto') const diagnosticsChannel = require('diagnostics_channel') const { uid, states } = require('./constants') const { @@ -22,6 +21,14 @@ channels.open = diagnosticsChannel.channel('undici:websocket:open') channels.close = diagnosticsChannel.channel('undici:websocket:close') channels.socketError = diagnosticsChannel.channel('undici:websocket:socket_error') +/** @type {import('crypto')} */ +let crypto +try { + crypto = require('crypto') +} catch { + +} + /** * @see https://websockets.spec.whatwg.org/#concept-websocket-establish * @param {URL} url @@ -66,7 +73,7 @@ function establishWebSocketConnection (url, protocols, ws, onEstablish, options) // 5. Let keyValue be a nonce consisting of a randomly selected // 16-byte value that has been forgiving-base64-encoded and // isomorphic encoded. - const keyValue = randomBytes(16).toString('base64') + const keyValue = crypto.randomBytes(16).toString('base64') // 6. Append (`Sec-WebSocket-Key`, keyValue) to request’s // header list. @@ -148,7 +155,7 @@ function establishWebSocketConnection (url, protocols, ws, onEstablish, options) // trailing whitespace, the client MUST _Fail the WebSocket // Connection_. const secWSAccept = response.headersList.get('Sec-WebSocket-Accept') - const digest = createHash('sha1').update(keyValue + uid).digest('base64') + const digest = crypto.createHash('sha1').update(keyValue + uid).digest('base64') if (secWSAccept !== digest) { failWebsocketConnection(ws, 'Incorrect hash received in Sec-WebSocket-Accept header.') return diff --git a/deps/undici/src/lib/websocket/frame.js b/deps/undici/src/lib/websocket/frame.js index 61bfd3915cecc5..d867ad118b29b8 100644 --- a/deps/undici/src/lib/websocket/frame.js +++ b/deps/undici/src/lib/websocket/frame.js @@ -1,15 +1,22 @@ 'use strict' -const { randomBytes } = require('crypto') const { maxUnsigned16Bit } = require('./constants') +/** @type {import('crypto')} */ +let crypto +try { + crypto = require('crypto') +} catch { + +} + class WebsocketFrameSend { /** * @param {Buffer|undefined} data */ constructor (data) { this.frameData = data - this.maskKey = randomBytes(4) + this.maskKey = crypto.randomBytes(4) } createFrame (opcode) { diff --git a/deps/undici/src/lib/websocket/websocket.js b/deps/undici/src/lib/websocket/websocket.js index 22ad2fb11a1910..e4aa58f52fc589 100644 --- a/deps/undici/src/lib/websocket/websocket.js +++ b/deps/undici/src/lib/websocket/websocket.js @@ -3,6 +3,7 @@ const { webidl } = require('../fetch/webidl') const { DOMException } = require('../fetch/constants') const { URLSerializer } = require('../fetch/dataURL') +const { getGlobalOrigin } = require('../fetch/global') const { staticPropertyDescriptors, states, opcodes, emptyBuffer } = require('./constants') const { kWebSocketURL, @@ -57,18 +58,28 @@ class WebSocket extends EventTarget { url = webidl.converters.USVString(url) protocols = options.protocols - // 1. Let urlRecord be the result of applying the URL parser to url. + // 1. Let baseURL be this's relevant settings object's API base URL. + const baseURL = getGlobalOrigin() + + // 1. Let urlRecord be the result of applying the URL parser to url with baseURL. 
let urlRecord try { - urlRecord = new URL(url) + urlRecord = new URL(url, baseURL) } catch (e) { - // 2. If urlRecord is failure, then throw a "SyntaxError" DOMException. + // 3. If urlRecord is failure, then throw a "SyntaxError" DOMException. throw new DOMException(e, 'SyntaxError') } - // 3. If urlRecord’s scheme is not "ws" or "wss", then throw a - // "SyntaxError" DOMException. + // 4. If urlRecord’s scheme is "http", then set urlRecord’s scheme to "ws". + if (urlRecord.protocol === 'http:') { + urlRecord.protocol = 'ws:' + } else if (urlRecord.protocol === 'https:') { + // 5. Otherwise, if urlRecord’s scheme is "https", set urlRecord’s scheme to "wss". + urlRecord.protocol = 'wss:' + } + + // 6. If urlRecord’s scheme is not "ws" or "wss", then throw a "SyntaxError" DOMException. if (urlRecord.protocol !== 'ws:' && urlRecord.protocol !== 'wss:') { throw new DOMException( `Expected a ws: or wss: protocol, got ${urlRecord.protocol}`, @@ -76,19 +87,19 @@ class WebSocket extends EventTarget { ) } - // 4. If urlRecord’s fragment is non-null, then throw a "SyntaxError" + // 7. If urlRecord’s fragment is non-null, then throw a "SyntaxError" // DOMException. - if (urlRecord.hash) { + if (urlRecord.hash || urlRecord.href.endsWith('#')) { throw new DOMException('Got fragment', 'SyntaxError') } - // 5. If protocols is a string, set protocols to a sequence consisting + // 8. If protocols is a string, set protocols to a sequence consisting // of just that string. if (typeof protocols === 'string') { protocols = [protocols] } - // 6. If any of the values in protocols occur more than once or otherwise + // 9. If any of the values in protocols occur more than once or otherwise // fail to match the requirements for elements that comprise the value // of `Sec-WebSocket-Protocol` fields as defined by The WebSocket // protocol, then throw a "SyntaxError" DOMException. @@ -100,12 +111,12 @@ class WebSocket extends EventTarget { throw new DOMException('Invalid Sec-WebSocket-Protocol value', 'SyntaxError') } - // 7. Set this's url to urlRecord. - this[kWebSocketURL] = urlRecord + // 10. Set this's url to urlRecord. + this[kWebSocketURL] = new URL(urlRecord.href) - // 8. Let client be this's relevant settings object. + // 11. Let client be this's relevant settings object. - // 9. Run this step in parallel: + // 12. Run this step in parallel: // 1. Establish a WebSocket connection given urlRecord, protocols, // and client. 
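
For context on the URL-resolution steps changed in the WebSocket constructor above, here is a minimal sketch of the resulting behaviour. It assumes Node.js 18+ with undici's experimental `WebSocket`; the origin, path, and message are illustrative only and are not part of this patch.

```js
'use strict'

// Sketch only: demonstrates relative-URL resolution against the global
// origin and the http(s) -> ws(s) scheme rewrite introduced above.
const { WebSocket, setGlobalOrigin } = require('undici')

// Hypothetical origin; in practice this is whatever the embedder sets.
setGlobalOrigin('https://example.com')

// '/chat' resolves against the global origin, and the https scheme is
// rewritten to wss, so this targets wss://example.com/chat.
const ws = new WebSocket('/chat')

ws.addEventListener('open', () => {
  ws.send('hello')
  ws.close()
})

// A URL with a fragment still throws a SyntaxError DOMException:
// new WebSocket('wss://example.com/chat#nope')
```
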
diff --git a/deps/undici/src/package.json b/deps/undici/src/package.json index 598a78654a9845..3846b9dc3988c5 100644 --- a/deps/undici/src/package.json +++ b/deps/undici/src/package.json @@ -1,6 +1,6 @@ { "name": "undici", - "version": "5.23.0", + "version": "5.25.2", "description": "An HTTP/1.1 client, written from scratch for Node.js", "homepage": "https://undici.nodejs.org", "bugs": { @@ -11,12 +11,41 @@ "url": "git+https://github.com/nodejs/undici.git" }, "license": "MIT", - "author": "Matteo Collina ", "contributors": [ + { + "name": "Daniele Belardi", + "url": "https://github.com/dnlup", + "author": true + }, + { + "name": "Ethan Arrowood", + "url": "https://github.com/ethan-arrowood", + "author": true + }, + { + "name": "Matteo Collina", + "url": "https://github.com/mcollina", + "author": true + }, + { + "name": "Matthew Aitken", + "url": "https://github.com/KhafraDev", + "author": true + }, { "name": "Robert Nagy", "url": "https://github.com/ronag", "author": true + }, + { + "name": "Szymon Marczak", + "url": "https://github.com/szmarczak", + "author": true + }, + { + "name": "Tomas Della Vedova", + "url": "https://github.com/delvedor", + "author": true } ], "keywords": [ @@ -64,10 +93,11 @@ "bench:run": "CONNECTIONS=1 node benchmarks/benchmark.js; CONNECTIONS=50 node benchmarks/benchmark.js", "serve:website": "docsify serve .", "prepare": "husky install", + "postpublish": "node scripts/update-undici-types-version.js && cd types && npm publish", "fuzz": "jsfuzz test/fuzzing/fuzz.js corpus" }, "devDependencies": { - "@sinonjs/fake-timers": "^10.0.2", + "@sinonjs/fake-timers": "^11.1.0", "@types/node": "^18.0.3", "abort-controller": "^3.0.0", "atomic-sleep": "^1.0.0", @@ -98,7 +128,7 @@ "standard": "^17.0.0", "table": "^6.8.0", "tap": "^16.1.0", - "tsd": "^0.28.1", + "tsd": "^0.29.0", "typescript": "^5.0.2", "wait-on": "^7.0.1", "ws": "^8.11.0" diff --git a/deps/undici/src/types/README.md b/deps/undici/src/types/README.md new file mode 100644 index 00000000000000..20a721c445a21b --- /dev/null +++ b/deps/undici/src/types/README.md @@ -0,0 +1,6 @@ +# undici-types + +This package is a dual-publish of the [undici](https://www.npmjs.com/package/undici) library types. The `undici` package **still contains types**. This package is for users who _only_ need undici types (such as for `@types/node`). It is published alongside every release of `undici`, so you can always use the same version. + +- [GitHub nodejs/undici](https://github.com/nodejs/undici) +- [Undici Documentation](https://undici.nodejs.org/#/) diff --git a/deps/undici/src/types/client.d.ts b/deps/undici/src/types/client.d.ts index 56074a15ae7a13..ac1779721f6a2c 100644 --- a/deps/undici/src/types/client.d.ts +++ b/deps/undici/src/types/client.d.ts @@ -1,7 +1,6 @@ import { URL } from 'url' import { TlsOptions } from 'tls' import Dispatcher from './dispatcher' -import DispatchInterceptor from './dispatcher' import buildConnector from "./connector"; /** @@ -19,14 +18,14 @@ export class Client extends Dispatcher { export declare namespace Client { export interface OptionsInterceptors { - Client: readonly DispatchInterceptor[]; + Client: readonly Dispatcher.DispatchInterceptor[]; } export interface Options { /** TODO */ interceptors?: OptionsInterceptors; /** The maximum length of request headers in bytes. Default: `16384` (16KiB). */ maxHeaderSize?: number; - /** The amount of time the parser will wait to receive the complete HTTP headers (Node 14 and above only). Default: `300e3` milliseconds (300s). 
*/ + /** The amount of time, in milliseconds, the parser will wait to receive the complete HTTP headers (Node 14 and above only). Default: `300e3` milliseconds (300s). */ headersTimeout?: number; /** @deprecated unsupported socketTimeout, use headersTimeout & bodyTimeout instead */ socketTimeout?: never; @@ -40,13 +39,13 @@ export declare namespace Client { idleTimeout?: never; /** @deprecated unsupported keepAlive, use pipelining=0 instead */ keepAlive?: never; - /** the timeout after which a socket without active requests will time out. Monitors time between activity on a connected socket. This value may be overridden by *keep-alive* hints from the server. Default: `4e3` milliseconds (4s). */ + /** the timeout, in milliseconds, after which a socket without active requests will time out. Monitors time between activity on a connected socket. This value may be overridden by *keep-alive* hints from the server. Default: `4e3` milliseconds (4s). */ keepAliveTimeout?: number; /** @deprecated unsupported maxKeepAliveTimeout, use keepAliveMaxTimeout instead */ maxKeepAliveTimeout?: never; - /** the maximum allowed `idleTimeout` when overridden by *keep-alive* hints from the server. Default: `600e3` milliseconds (10min). */ + /** the maximum allowed `idleTimeout`, in milliseconds, when overridden by *keep-alive* hints from the server. Default: `600e3` milliseconds (10min). */ keepAliveMaxTimeout?: number; - /** A number subtracted from server *keep-alive* hints when overriding `idleTimeout` to account for timing inaccuracies caused by e.g. transport latency. Default: `1e3` milliseconds (1s). */ + /** A number of milliseconds subtracted from server *keep-alive* hints when overriding `idleTimeout` to account for timing inaccuracies caused by e.g. transport latency. Default: `1e3` milliseconds (1s). */ keepAliveTimeoutThreshold?: number; /** TODO */ socketPath?: string; @@ -71,7 +70,17 @@ export declare namespace Client { /** Enables a family autodetection algorithm that loosely implements section 5 of RFC 8305. */ autoSelectFamily?: boolean; /** The amount of time in milliseconds to wait for a connection attempt to finish before trying the next address when using the `autoSelectFamily` option. */ - autoSelectFamilyAttemptTimeout?: number; + autoSelectFamilyAttemptTimeout?: number; + /** + * @description Enables support for H2 if the server has assigned bigger priority to it through ALPN negotiation. + * @default false + */ + allowH2?: boolean; + /** + * @description Dictates the maximum number of concurrent streams for a single H2 session. It can be overriden by a SETTINGS remote frame. + * @default 100 + */ + maxConcurrentStreams?: number } export interface SocketInfo { localAddress?: string diff --git a/deps/undici/src/types/dispatcher.d.ts b/deps/undici/src/types/dispatcher.d.ts index 7f621371f86ec1..816db19d20d878 100644 --- a/deps/undici/src/types/dispatcher.d.ts +++ b/deps/undici/src/types/dispatcher.d.ts @@ -109,7 +109,7 @@ declare namespace Dispatcher { blocking?: boolean; /** Upgrade the request. Should be used to specify the kind of upgrade i.e. `'Websocket'`. Default: `method === 'CONNECT' || null`. */ upgrade?: boolean | string | null; - /** The amount of time the parser will wait to receive the complete HTTP headers. Defaults to 300 seconds. */ + /** The amount of time, in milliseconds, the parser will wait to receive the complete HTTP headers. Defaults to 300 seconds. */ headersTimeout?: number | null; /** The timeout after which a request will time out, in milliseconds. 
Monitors time between receiving body data. Use 0 to disable it entirely. Defaults to 300 seconds. */ bodyTimeout?: number | null; @@ -117,6 +117,8 @@ declare namespace Dispatcher { reset?: boolean; /** Whether Undici should throw an error upon receiving a 4xx or 5xx response from the server. Defaults to false */ throwOnError?: boolean; + /** For H2, it appends the expect: 100-continue header, and halts the request body until a 100-continue is received from the remote server*/ + expectContinue?: boolean; } export interface ConnectOptions { path: string; diff --git a/deps/undici/src/types/index.d.ts b/deps/undici/src/types/index.d.ts new file mode 100644 index 00000000000000..c7532d69a073cc --- /dev/null +++ b/deps/undici/src/types/index.d.ts @@ -0,0 +1,57 @@ +import Dispatcher from'./dispatcher' +import { setGlobalDispatcher, getGlobalDispatcher } from './global-dispatcher' +import { setGlobalOrigin, getGlobalOrigin } from './global-origin' +import Pool from'./pool' +import { RedirectHandler, DecoratorHandler } from './handlers' + +import BalancedPool from './balanced-pool' +import Client from'./client' +import buildConnector from'./connector' +import errors from'./errors' +import Agent from'./agent' +import MockClient from'./mock-client' +import MockPool from'./mock-pool' +import MockAgent from'./mock-agent' +import mockErrors from'./mock-errors' +import ProxyAgent from'./proxy-agent' +import { request, pipeline, stream, connect, upgrade } from './api' + +export * from './cookies' +export * from './fetch' +export * from './file' +export * from './filereader' +export * from './formdata' +export * from './diagnostics-channel' +export * from './websocket' +export * from './content-type' +export * from './cache' +export { Interceptable } from './mock-interceptor' + +export { Dispatcher, BalancedPool, Pool, Client, buildConnector, errors, Agent, request, stream, pipeline, connect, upgrade, setGlobalDispatcher, getGlobalDispatcher, setGlobalOrigin, getGlobalOrigin, MockClient, MockPool, MockAgent, mockErrors, ProxyAgent, RedirectHandler, DecoratorHandler } +export default Undici + +declare namespace Undici { + var Dispatcher: typeof import('./dispatcher').default + var Pool: typeof import('./pool').default; + var RedirectHandler: typeof import ('./handlers').RedirectHandler + var DecoratorHandler: typeof import ('./handlers').DecoratorHandler + var createRedirectInterceptor: typeof import ('./interceptors').createRedirectInterceptor + var BalancedPool: typeof import('./balanced-pool').default; + var Client: typeof import('./client').default; + var buildConnector: typeof import('./connector').default; + var errors: typeof import('./errors').default; + var Agent: typeof import('./agent').default; + var setGlobalDispatcher: typeof import('./global-dispatcher').setGlobalDispatcher; + var getGlobalDispatcher: typeof import('./global-dispatcher').getGlobalDispatcher; + var request: typeof import('./api').request; + var stream: typeof import('./api').stream; + var pipeline: typeof import('./api').pipeline; + var connect: typeof import('./api').connect; + var upgrade: typeof import('./api').upgrade; + var MockClient: typeof import('./mock-client').default; + var MockPool: typeof import('./mock-pool').default; + var MockAgent: typeof import('./mock-agent').default; + var mockErrors: typeof import('./mock-errors').default; + var fetch: typeof import('./fetch').fetch; + var caches: typeof import('./cache').caches; +} diff --git a/deps/undici/src/types/package.json b/deps/undici/src/types/package.json new 
file mode 100644 index 00000000000000..16bf97c4ddf83c --- /dev/null +++ b/deps/undici/src/types/package.json @@ -0,0 +1,55 @@ +{ + "name": "undici-types", + "version": "5.25.1", + "description": "A stand-alone types package for Undici", + "homepage": "https://undici.nodejs.org", + "bugs": { + "url": "https://github.com/nodejs/undici/issues" + }, + "repository": { + "type": "git", + "url": "git+https://github.com/nodejs/undici.git" + }, + "license": "MIT", + "types": "index.d.ts", + "files": [ + "*.d.ts" + ], + "contributors": [ + { + "name": "Daniele Belardi", + "url": "https://github.com/dnlup", + "author": true + }, + { + "name": "Ethan Arrowood", + "url": "https://github.com/ethan-arrowood", + "author": true + }, + { + "name": "Matteo Collina", + "url": "https://github.com/mcollina", + "author": true + }, + { + "name": "Matthew Aitken", + "url": "https://github.com/KhafraDev", + "author": true + }, + { + "name": "Robert Nagy", + "url": "https://github.com/ronag", + "author": true + }, + { + "name": "Szymon Marczak", + "url": "https://github.com/szmarczak", + "author": true + }, + { + "name": "Tomas Della Vedova", + "url": "https://github.com/delvedor", + "author": true + } + ] +} \ No newline at end of file diff --git a/deps/undici/undici.js b/deps/undici/undici.js index bea59e571c8772..cd6308f9f3cc2d 100644 --- a/deps/undici/undici.js +++ b/deps/undici/undici.js @@ -60,7 +60,13 @@ var require_symbols = __commonJS({ kProxy: Symbol("proxy agent options"), kCounter: Symbol("socket request counter"), kInterceptors: Symbol("dispatch interceptors"), - kMaxResponseSize: Symbol("max response size") + kMaxResponseSize: Symbol("max response size"), + kHTTP2Session: Symbol("http2Session"), + kHTTP2SessionState: Symbol("http2Session state"), + kHTTP2BuildRequest: Symbol("http2 build request"), + kHTTP1BuildRequest: Symbol("http1 build request"), + kHTTP2CopyHeaders: Symbol("http2 copy headers"), + kHTTPConnVersion: Symbol("http connection version") }; } }); @@ -292,7 +298,7 @@ var require_util = __commonJS({ var stream = require("stream"); var net = require("net"); var { InvalidArgumentError } = require_errors(); - var { Blob } = require("buffer"); + var { Blob: Blob2 } = require("buffer"); var nodeUtil = require("util"); var { stringify } = require("querystring"); var [nodeMajor, nodeMinor] = process.versions.node.split(".").map((v) => Number(v)); @@ -302,7 +308,7 @@ var require_util = __commonJS({ return obj && typeof obj === "object" && typeof obj.pipe === "function" && typeof obj.on === "function"; } function isBlobLike(object) { - return Blob && object instanceof Blob || object && typeof object === "object" && (typeof object.stream === "function" || typeof object.arrayBuffer === "function") && /^(Blob|File)$/.test(object[Symbol.toStringTag]); + return Blob2 && object instanceof Blob2 || object && typeof object === "object" && (typeof object.stream === "function" || typeof object.arrayBuffer === "function") && /^(Blob|File)$/.test(object[Symbol.toStringTag]); } function buildURL(url, queryParams) { if (url.includes("?") || url.includes("#")) { @@ -400,7 +406,7 @@ var require_util = __commonJS({ return 0; } else if (isStream(body)) { const state = body._readableState; - return state && state.ended === true && Number.isFinite(state.length) ? state.length : null; + return state && state.objectMode === false && state.ended === true && Number.isFinite(state.length) ? state.length : null; } else if (isBlobLike(body)) { return body.size != null ? 
body.size : null; } else if (isBuffer(body)) { @@ -439,6 +445,8 @@ var require_util = __commonJS({ return m ? parseInt(m[1], 10) * 1e3 : null; } function parseHeaders(headers, obj = {}) { + if (!Array.isArray(headers)) + return headers; for (let i = 0; i < headers.length; i += 2) { const key = headers[i].toString().toLowerCase(); let val = obj[key]; @@ -535,13 +543,18 @@ var require_util = __commonJS({ bytesRead: socket.bytesRead }; } + async function* convertIterableToBuffer(iterable) { + for await (const chunk of iterable) { + yield Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk); + } + } var ReadableStream; function ReadableStreamFrom(iterable) { if (!ReadableStream) { ReadableStream = require("stream/web").ReadableStream; } if (ReadableStream.from) { - return ReadableStream.from(iterable); + return ReadableStream.from(convertIterableToBuffer(iterable)); } let iterator; return new ReadableStream({ @@ -1139,11 +1152,24 @@ var require_util2 = __commonJS({ const metadata = list.filter((item) => item.algo === strongest); for (const item of metadata) { const algorithm = item.algo; - const expectedValue = item.hash; - const actualValue = crypto.createHash(algorithm).update(bytes).digest("base64"); + let expectedValue = item.hash; + if (expectedValue.endsWith("==")) { + expectedValue = expectedValue.slice(0, -2); + } + let actualValue = crypto.createHash(algorithm).update(bytes).digest("base64"); + if (actualValue.endsWith("==")) { + actualValue = actualValue.slice(0, -2); + } if (actualValue === expectedValue) { return true; } + let actualBase64URL = crypto.createHash(algorithm).update(bytes).digest("base64url"); + if (actualBase64URL.endsWith("==")) { + actualBase64URL = actualBase64URL.slice(0, -2); + } + if (actualBase64URL === expectedValue) { + return true; + } } return false; } @@ -1250,9 +1276,9 @@ var require_util2 = __commonJS({ } return { value: result, done: false }; } - function fullyReadBody(body, processBody, processBodyError) { - const successSteps = (bytes) => queueMicrotask(() => processBody(bytes)); - const errorSteps = (error) => queueMicrotask(() => processBodyError(error)); + async function fullyReadBody(body, processBody, processBodyError) { + const successSteps = processBody; + const errorSteps = processBodyError; let reader; try { reader = body.stream.getReader(); @@ -1260,7 +1286,12 @@ var require_util2 = __commonJS({ errorSteps(e); return; } - readAllBytes(reader, successSteps, errorSteps); + try { + const result = await readAllBytes(reader); + successSteps(result); + } catch (e) { + errorSteps(e); + } } var ReadableStream = globalThis.ReadableStream; function isReadableStreamLike(stream) { @@ -1291,25 +1322,16 @@ var require_util2 = __commonJS({ } return input; } - async function readAllBytes(reader, successSteps, failureSteps) { + async function readAllBytes(reader) { const bytes = []; let byteLength = 0; while (true) { - let done; - let chunk; - try { - ({ done, value: chunk } = await reader.read()); - } catch (e) { - failureSteps(e); - return; - } + const { done, value: chunk } = await reader.read(); if (done) { - successSteps(Buffer.concat(bytes, byteLength)); - return; + return Buffer.concat(bytes, byteLength); } if (!isUint8Array(chunk)) { - failureSteps(new TypeError("Received non-Uint8Array chunk")); - return; + throw new TypeError("Received non-Uint8Array chunk"); } bytes.push(chunk); byteLength += chunk.length; @@ -6015,14 +6037,14 @@ var require_dataURL = __commonJS({ var require_file = __commonJS({ "lib/fetch/file.js"(exports2, module2) { "use 
strict"; - var { Blob, File: NativeFile } = require("buffer"); + var { Blob: Blob2, File: NativeFile } = require("buffer"); var { types } = require("util"); var { kState } = require_symbols2(); var { isBlobLike } = require_util2(); var { webidl } = require_webidl(); var { parseMIMEType, serializeAMimeType } = require_dataURL(); var { kEnumerableProperty } = require_util(); - var File = class extends Blob { + var File = class extends Blob2 { constructor(fileBits, fileName, options = {}) { webidl.argumentLengthCheck(arguments, 2, { header: "File constructor" }); fileBits = webidl.converters["sequence"](fileBits); @@ -6118,7 +6140,7 @@ var require_file = __commonJS({ name: kEnumerableProperty, lastModified: kEnumerableProperty }); - webidl.converters.Blob = webidl.interfaceConverter(Blob); + webidl.converters.Blob = webidl.interfaceConverter(Blob2); webidl.converters.BlobPart = function(V, opts) { if (webidl.util.Type(V) === "Object") { if (isBlobLike(V)) { @@ -6200,7 +6222,7 @@ var require_formdata = __commonJS({ var { kState } = require_symbols2(); var { File: UndiciFile, FileLike, isFileLike } = require_file(); var { webidl } = require_webidl(); - var { Blob, File: NativeFile } = require("buffer"); + var { Blob: Blob2, File: NativeFile } = require("buffer"); var File = NativeFile ?? UndiciFile; var FormData = class { constructor(form) { @@ -6310,7 +6332,7 @@ var require_formdata = __commonJS({ value = Buffer.from(value).toString("utf8"); } else { if (!isFileLike(value)) { - value = value instanceof Blob ? new File([value], "blob", { type: value.type }) : new FileLike(value, "blob", { type: value.type }); + value = value instanceof Blob2 ? new File([value], "blob", { type: value.type }) : new FileLike(value, "blob", { type: value.type }); } if (filename !== void 0) { const options = { @@ -6344,7 +6366,7 @@ var require_body = __commonJS({ var { kState } = require_symbols2(); var { webidl } = require_webidl(); var { DOMException, structuredClone } = require_constants(); - var { Blob, File: NativeFile } = require("buffer"); + var { Blob: Blob2, File: NativeFile } = require("buffer"); var { kBodyUsed } = require_symbols(); var assert = require("assert"); var { isErrored } = require_util(); @@ -6536,7 +6558,7 @@ Content-Type: ${value.type || "application/octet-stream"}\r } else if (mimeType) { mimeType = serializeAMimeType(mimeType); } - return new Blob([bytes], { type: mimeType }); + return new Blob2([bytes], { type: mimeType }); }, instance); }, arrayBuffer() { @@ -6563,6 +6585,7 @@ Content-Type: ${value.type || "application/octet-stream"}\r try { busboy = Busboy({ headers, + preservePath: true, defParamCharset: "utf8" }); } catch (err) { @@ -6660,7 +6683,7 @@ Content-Type: ${value.type || "application/octet-stream"}\r successSteps(new Uint8Array()); return promise.promise; } - fullyReadBody(object[kState].body, successSteps, errorSteps); + await fullyReadBody(object[kState].body, successSteps, errorSteps); return promise.promise; } function bodyUnusable(body) { @@ -6738,7 +6761,7 @@ var require_response = __commonJS({ responseObject[kHeaders][kRealm] = relevantRealm; return responseObject; } - static json(data = void 0, init = {}) { + static json(data, init = {}) { webidl.argumentLengthCheck(arguments, 1, { header: "Response.json" }); if (init !== null) { init = webidl.converters.ResponseInit(init); @@ -6959,9 +6982,9 @@ var require_response = __commonJS({ assert(false); } } - function makeAppropriateNetworkError(fetchParams) { + function makeAppropriateNetworkError(fetchParams, err = null) { 
assert(isCancelled(fetchParams)); - return isAborted(fetchParams) ? makeNetworkError(new DOMException("The operation was aborted.", "AbortError")) : makeNetworkError("Request was cancelled."); + return isAborted(fetchParams) ? makeNetworkError(Object.assign(new DOMException("The operation was aborted.", "AbortError"), { cause: err })) : makeNetworkError(Object.assign(new DOMException("Request was cancelled."), { cause: err })); } function initializeResponse(response, init, body) { if (init.status !== null && (init.status < 200 || init.status > 599)) { @@ -8140,6 +8163,7 @@ var require_request2 = __commonJS({ NotSupportedError } = require_errors(); var assert = require("assert"); + var { kHTTP2BuildRequest, kHTTP2CopyHeaders, kHTTP1BuildRequest } = require_symbols(); var util = require_util(); var tokenRegExp = /^[\^_`a-zA-Z\-0-9!#$%&'*+.|~]+$/; var headerCharRegex = /[^\t\x20-\x7e\x80-\xff]/; @@ -8174,7 +8198,8 @@ var require_request2 = __commonJS({ headersTimeout, bodyTimeout, reset, - throwOnError + throwOnError, + expectContinue }, handler) { if (typeof path !== "string") { throw new InvalidArgumentError("path must be a string"); @@ -8200,6 +8225,9 @@ var require_request2 = __commonJS({ if (reset != null && typeof reset !== "boolean") { throw new InvalidArgumentError("invalid reset"); } + if (expectContinue != null && typeof expectContinue !== "boolean") { + throw new InvalidArgumentError("invalid expectContinue"); + } this.headersTimeout = headersTimeout; this.bodyTimeout = bodyTimeout; this.throwOnError = throwOnError === true; @@ -8233,6 +8261,7 @@ var require_request2 = __commonJS({ this.contentLength = null; this.contentType = null; this.headers = ""; + this.expectContinue = expectContinue != null ? expectContinue : false; if (Array.isArray(headers)) { if (headers.length % 2 !== 0) { throw new InvalidArgumentError("headers array must be even"); @@ -8335,8 +8364,48 @@ var require_request2 = __commonJS({ processHeader(this, key, value); return this; } + static [kHTTP1BuildRequest](origin, opts, handler) { + return new Request(origin, opts, handler); + } + static [kHTTP2BuildRequest](origin, opts, handler) { + const headers = opts.headers; + opts = { ...opts, headers: null }; + const request = new Request(origin, opts, handler); + request.headers = {}; + if (Array.isArray(headers)) { + if (headers.length % 2 !== 0) { + throw new InvalidArgumentError("headers array must be even"); + } + for (let i = 0; i < headers.length; i += 2) { + processHeader(request, headers[i], headers[i + 1], true); + } + } else if (headers && typeof headers === "object") { + const keys = Object.keys(headers); + for (let i = 0; i < keys.length; i++) { + const key = keys[i]; + processHeader(request, key, headers[key], true); + } + } else if (headers != null) { + throw new InvalidArgumentError("headers must be an object or an array"); + } + return request; + } + static [kHTTP2CopyHeaders](raw) { + const rawHeaders = raw.split("\r\n"); + const headers = {}; + for (const header of rawHeaders) { + const [key, value] = header.split(": "); + if (value == null || value.length === 0) + continue; + if (headers[key]) + headers[key] += `,${value}`; + else + headers[key] = value; + } + return headers; + } }; - function processHeaderValue(key, val) { + function processHeaderValue(key, val, skipAppend) { if (val && typeof val === "object") { throw new InvalidArgumentError(`invalid ${key} header`); } @@ -8344,10 +8413,10 @@ var require_request2 = __commonJS({ if (headerCharRegex.exec(val) !== null) { throw new 
InvalidArgumentError(`invalid ${key} header`); } - return `${key}: ${val}\r + return skipAppend ? val : `${key}: ${val}\r `; } - function processHeader(request, key, val) { + function processHeader(request, key, val, skipAppend = false) { if (val && (typeof val === "object" && !Array.isArray(val))) { throw new InvalidArgumentError(`invalid ${key} header`); } else if (val === void 0) { @@ -8386,10 +8455,20 @@ var require_request2 = __commonJS({ } else { if (Array.isArray(val)) { for (let i = 0; i < val.length; i++) { - request.headers += processHeaderValue(key, val[i]); + if (skipAppend) { + if (request.headers[key]) + request.headers[key] += `,${processHeaderValue(key, val[i], skipAppend)}`; + else + request.headers[key] = processHeaderValue(key, val[i], skipAppend); + } else { + request.headers += processHeaderValue(key, val[i]); + } } } else { - request.headers += processHeaderValue(key, val); + if (skipAppend) + request.headers[key] = processHeaderValue(key, val, skipAppend); + else + request.headers += processHeaderValue(key, val); } } } @@ -8455,13 +8534,14 @@ var require_connect = __commonJS({ } }; } - function buildConnector({ maxCachedSessions, socketPath, timeout, ...opts }) { + function buildConnector({ allowH2, maxCachedSessions, socketPath, timeout, ...opts }) { if (maxCachedSessions != null && (!Number.isInteger(maxCachedSessions) || maxCachedSessions < 0)) { throw new InvalidArgumentError("maxCachedSessions must be a positive integer or zero"); } const options = { path: socketPath, ...opts }; const sessionCache = new SessionCache(maxCachedSessions == null ? 100 : maxCachedSessions); timeout = timeout == null ? 1e4 : timeout; + allowH2 = allowH2 != null ? allowH2 : false; return function connect({ hostname, host, protocol, port, servername, localAddress, httpSocket }, callback) { let socket; if (protocol === "https:") { @@ -8478,6 +8558,7 @@ var require_connect = __commonJS({ servername, session, localAddress, + ALPNProtocols: allowH2 ? 
["http/1.1", "h2"] : ["http/1.1"], socket: httpSocket, port: port || 443, host: hostname @@ -9068,6 +9149,7 @@ var require_client = __commonJS({ "use strict"; var assert = require("assert"); var net = require("net"); + var { pipeline } = require("stream"); var util = require_util(); var timers = require_timers(); var Request = require_request2(); @@ -9129,8 +9211,32 @@ var require_client = __commonJS({ kDispatch, kInterceptors, kLocalAddress, - kMaxResponseSize + kMaxResponseSize, + kHTTPConnVersion, + kHost, + kHTTP2Session, + kHTTP2SessionState, + kHTTP2BuildRequest, + kHTTP2CopyHeaders, + kHTTP1BuildRequest } = require_symbols(); + var http2; + try { + http2 = require("http2"); + } catch { + http2 = { constants: {} }; + } + var { + constants: { + HTTP2_HEADER_AUTHORITY, + HTTP2_HEADER_METHOD, + HTTP2_HEADER_PATH, + HTTP2_HEADER_CONTENT_LENGTH, + HTTP2_HEADER_EXPECT, + HTTP2_HEADER_STATUS + } + } = http2; + var h2ExperimentalWarned = false; var FastBuffer = Buffer[Symbol.species]; var kClosedResolve = Symbol("kClosedResolve"); var channels = {}; @@ -9172,7 +9278,9 @@ var require_client = __commonJS({ localAddress, maxResponseSize, autoSelectFamily, - autoSelectFamilyAttemptTimeout + autoSelectFamilyAttemptTimeout, + allowH2, + maxConcurrentStreams } = {}) { super(); if (keepAlive !== void 0) { @@ -9232,10 +9340,17 @@ var require_client = __commonJS({ if (autoSelectFamilyAttemptTimeout != null && (!Number.isInteger(autoSelectFamilyAttemptTimeout) || autoSelectFamilyAttemptTimeout < -1)) { throw new InvalidArgumentError("autoSelectFamilyAttemptTimeout must be a positive number"); } + if (allowH2 != null && typeof allowH2 !== "boolean") { + throw new InvalidArgumentError("allowH2 must be a valid boolean value"); + } + if (maxConcurrentStreams != null && (typeof maxConcurrentStreams !== "number" || maxConcurrentStreams < 1)) { + throw new InvalidArgumentError("maxConcurrentStreams must be a possitive integer, greater than 0"); + } if (typeof connect2 !== "function") { connect2 = buildConnector({ ...tls, maxCachedSessions, + allowH2, socketPath, timeout: connectTimeout, ...util.nodeHasAutoSelectFamily && autoSelectFamily ? { autoSelectFamily, autoSelectFamilyAttemptTimeout } : void 0, @@ -9265,6 +9380,13 @@ var require_client = __commonJS({ this[kMaxRequests] = maxRequestsPerClient; this[kClosedResolve] = null; this[kMaxResponseSize] = maxResponseSize > -1 ? maxResponseSize : -1; + this[kHTTPConnVersion] = "h1"; + this[kHTTP2Session] = null; + this[kHTTP2SessionState] = !allowH2 ? null : { + openStreams: 0, + maxConcurrentStreams: maxConcurrentStreams != null ? maxConcurrentStreams : 100 + }; + this[kHost] = `${this[kUrl].hostname}${this[kUrl].port ? `:${this[kUrl].port}` : ""}`; this[kQueue] = []; this[kRunningIdx] = 0; this[kPendingIdx] = 0; @@ -9298,7 +9420,7 @@ var require_client = __commonJS({ } [kDispatch](opts, handler) { const origin = opts.origin || this[kUrl].origin; - const request = new Request(origin, opts, handler); + const request = this[kHTTPConnVersion] === "h2" ? 
Request[kHTTP2BuildRequest](origin, opts, handler) : Request[kHTTP1BuildRequest](origin, opts, handler); this[kQueue].push(request); if (this[kResuming]) { } else if (util.bodyLength(request.body) == null && util.isIterable(request.body)) { @@ -9335,6 +9457,11 @@ var require_client = __commonJS({ } resolve(); }; + if (this[kHTTP2Session] != null) { + util.destroy(this[kHTTP2Session], err); + this[kHTTP2Session] = null; + this[kHTTP2SessionState] = null; + } if (!this[kSocket]) { queueMicrotask(callback); } else { @@ -9344,6 +9471,44 @@ var require_client = __commonJS({ }); } }; + function onHttp2SessionError(err) { + assert(err.code !== "ERR_TLS_CERT_ALTNAME_INVALID"); + this[kSocket][kError] = err; + onError(this[kClient], err); + } + function onHttp2FrameError(type, code, id) { + const err = new InformationalError(`HTTP/2: "frameError" received - type ${type}, code ${code}`); + if (id === 0) { + this[kSocket][kError] = err; + onError(this[kClient], err); + } + } + function onHttp2SessionEnd() { + util.destroy(this, new SocketError("other side closed")); + util.destroy(this[kSocket], new SocketError("other side closed")); + } + function onHTTP2GoAway(code) { + const client = this[kClient]; + const err = new InformationalError(`HTTP/2: "GOAWAY" frame received with code ${code}`); + client[kSocket] = null; + client[kHTTP2Session] = null; + if (client.destroyed) { + assert(this[kPending] === 0); + const requests = client[kQueue].splice(client[kRunningIdx]); + for (let i = 0; i < requests.length; i++) { + const request = requests[i]; + errorRequest(this, request, err); + } + } else if (client[kRunning] > 0) { + const request = client[kQueue][client[kRunningIdx]]; + client[kQueue][client[kRunningIdx]++] = null; + errorRequest(client, request, err); + } + client[kPendingIdx] = client[kRunningIdx]; + assert(client[kRunning] === 0); + client.emit("disconnect", client[kUrl], [client], err); + resume(client); + } var constants = require_constants2(); var createRedirectInterceptor = require_redirectInterceptor(); var EMPTY_BUF = Buffer.alloc(0); @@ -9783,11 +9948,13 @@ var require_client = __commonJS({ parser.readMore(); } function onSocketError(err) { - const { [kParser]: parser } = this; + const { [kClient]: client, [kParser]: parser } = this; assert(err.code !== "ERR_TLS_CERT_ALTNAME_INVALID"); - if (err.code === "ECONNRESET" && parser.statusCode && !parser.shouldKeepAlive) { - parser.onMessageComplete(); - return; + if (client[kHTTPConnVersion] !== "h2") { + if (err.code === "ECONNRESET" && parser.statusCode && !parser.shouldKeepAlive) { + parser.onMessageComplete(); + return; + } } this[kError] = err; onError(this[kClient], err); @@ -9804,20 +9971,24 @@ var require_client = __commonJS({ } } function onSocketEnd() { - const { [kParser]: parser } = this; - if (parser.statusCode && !parser.shouldKeepAlive) { - parser.onMessageComplete(); - return; + const { [kParser]: parser, [kClient]: client } = this; + if (client[kHTTPConnVersion] !== "h2") { + if (parser.statusCode && !parser.shouldKeepAlive) { + parser.onMessageComplete(); + return; + } } util.destroy(this, new SocketError("other side closed", util.getSocketInfo(this))); } function onSocketClose() { - const { [kClient]: client } = this; - if (!this[kError] && this[kParser].statusCode && !this[kParser].shouldKeepAlive) { - this[kParser].onMessageComplete(); + const { [kClient]: client, [kParser]: parser } = this; + if (client[kHTTPConnVersion] === "h1" && parser) { + if (!this[kError] && parser.statusCode && !parser.shouldKeepAlive) { + 
parser.onMessageComplete(); + } + this[kParser].destroy(); + this[kParser] = null; } - this[kParser].destroy(); - this[kParser] = null; const err = this[kError] || new SocketError("closed", util.getSocketInfo(this)); client[kSocket] = null; if (client.destroyed) { @@ -9884,21 +10055,46 @@ var require_client = __commonJS({ }), new ClientDestroyedError()); return; } - if (!llhttpInstance) { - llhttpInstance = await llhttpPromise; - llhttpPromise = null; - } client[kConnecting] = false; assert(socket); - socket[kNoRef] = false; - socket[kWriting] = false; - socket[kReset] = false; - socket[kBlocking] = false; - socket[kError] = null; - socket[kParser] = new Parser(client, socket, llhttpInstance); - socket[kClient] = client; + const isH2 = socket.alpnProtocol === "h2"; + if (isH2) { + if (!h2ExperimentalWarned) { + h2ExperimentalWarned = true; + process.emitWarning("H2 support is experimental, expect them to change at any time.", { + code: "UNDICI-H2" + }); + } + const session = http2.connect(client[kUrl], { + createConnection: () => socket, + peerMaxConcurrentStreams: client[kHTTP2SessionState].maxConcurrentStreams + }); + client[kHTTPConnVersion] = "h2"; + session[kClient] = client; + session[kSocket] = socket; + session.on("error", onHttp2SessionError); + session.on("frameError", onHttp2FrameError); + session.on("end", onHttp2SessionEnd); + session.on("goaway", onHTTP2GoAway); + session.on("close", onSocketClose); + session.unref(); + client[kHTTP2Session] = session; + socket[kHTTP2Session] = session; + } else { + if (!llhttpInstance) { + llhttpInstance = await llhttpPromise; + llhttpPromise = null; + } + socket[kNoRef] = false; + socket[kWriting] = false; + socket[kReset] = false; + socket[kBlocking] = false; + socket[kParser] = new Parser(client, socket, llhttpInstance); + } socket[kCounter] = 0; socket[kMaxRequests] = client[kMaxRequests]; + socket[kClient] = client; + socket[kError] = null; socket.on("error", onSocketError).on("readable", onSocketReadable).on("end", onSocketEnd).on("close", onSocketClose); client[kSocket] = socket; if (channels.connected.hasSubscribers) { @@ -9977,7 +10173,7 @@ var require_client = __commonJS({ return; } const socket = client[kSocket]; - if (socket && !socket.destroyed) { + if (socket && !socket.destroyed && socket.alpnProtocol !== "h2") { if (client[kSize] === 0) { if (!socket[kNoRef] && socket.unref) { socket.unref(); @@ -10030,7 +10226,7 @@ var require_client = __commonJS({ if (client[kConnecting]) { return; } - if (!socket) { + if (!socket && !client[kHTTP2Session]) { connect(client); return; } @@ -10064,6 +10260,10 @@ var require_client = __commonJS({ } } function write(client, request) { + if (client[kHTTPConnVersion] === "h2") { + writeH2(client, client[kHTTP2Session], request); + return; + } const { body, method, path, host, upgrade, headers, blocking, reset } = request; const expectsPayload = method === "PUT" || method === "POST" || method === "PATCH"; if (body && typeof body.read === "function") { @@ -10175,8 +10375,205 @@ upgrade: ${upgrade}\r } return true; } - function writeStream({ body, client, request, socket, contentLength, header, expectsPayload }) { + function writeH2(client, session, request) { + const { body, method, path, host, upgrade, expectContinue, signal, headers: reqHeaders } = request; + let headers; + if (typeof reqHeaders === "string") + headers = Request[kHTTP2CopyHeaders](reqHeaders.trim()); + else + headers = reqHeaders; + if (upgrade) { + errorRequest(client, request, new Error("Upgrade not supported for H2")); + 
return false; + } + try { + request.onConnect((err) => { + if (request.aborted || request.completed) { + return; + } + errorRequest(client, request, err || new RequestAbortedError()); + }); + } catch (err) { + errorRequest(client, request, err); + } + if (request.aborted) { + return false; + } + let stream; + const h2State = client[kHTTP2SessionState]; + headers[HTTP2_HEADER_AUTHORITY] = host || client[kHost]; + headers[HTTP2_HEADER_PATH] = path; + if (method === "CONNECT") { + session.ref(); + stream = session.request(headers, { endStream: false, signal }); + if (stream.id && !stream.pending) { + request.onUpgrade(null, null, stream); + ++h2State.openStreams; + } else { + stream.once("ready", () => { + request.onUpgrade(null, null, stream); + ++h2State.openStreams; + }); + } + stream.once("close", () => { + h2State.openStreams -= 1; + if (h2State.openStreams === 0) + session.unref(); + }); + return true; + } else { + headers[HTTP2_HEADER_METHOD] = method; + } + const expectsPayload = method === "PUT" || method === "POST" || method === "PATCH"; + if (body && typeof body.read === "function") { + body.read(0); + } + let contentLength = util.bodyLength(body); + if (contentLength == null) { + contentLength = request.contentLength; + } + if (contentLength === 0 || !expectsPayload) { + contentLength = null; + } + if (request.contentLength != null && request.contentLength !== contentLength) { + if (client[kStrictContentLength]) { + errorRequest(client, request, new RequestContentLengthMismatchError()); + return false; + } + process.emitWarning(new RequestContentLengthMismatchError()); + } + if (contentLength != null) { + assert(body, "no body must not have content length"); + headers[HTTP2_HEADER_CONTENT_LENGTH] = `${contentLength}`; + } + session.ref(); + const shouldEndStream = method === "GET" || method === "HEAD"; + if (expectContinue) { + headers[HTTP2_HEADER_EXPECT] = "100-continue"; + stream = session.request(headers, { endStream: shouldEndStream, signal }); + stream.once("continue", writeBodyH2); + } else { + stream = session.request(headers, { + endStream: shouldEndStream, + signal + }); + writeBodyH2(); + } + ++h2State.openStreams; + stream.once("response", (headers2) => { + if (request.onHeaders(Number(headers2[HTTP2_HEADER_STATUS]), headers2, stream.resume.bind(stream), "") === false) { + stream.pause(); + } + }); + stream.once("end", () => { + request.onComplete([]); + }); + stream.on("data", (chunk) => { + if (request.onData(chunk) === false) + stream.pause(); + }); + stream.once("close", () => { + h2State.openStreams -= 1; + if (h2State.openStreams === 0) + session.unref(); + }); + stream.once("error", function(err) { + if (client[kHTTP2Session] && !client[kHTTP2Session].destroyed && !this.closed && !this.destroyed) { + h2State.streams -= 1; + util.destroy(stream, err); + } + }); + stream.once("frameError", (type, code) => { + const err = new InformationalError(`HTTP/2: "frameError" received - type ${type}, code ${code}`); + errorRequest(client, request, err); + if (client[kHTTP2Session] && !client[kHTTP2Session].destroyed && !this.closed && !this.destroyed) { + h2State.streams -= 1; + util.destroy(stream, err); + } + }); + return true; + function writeBodyH2() { + if (!body) { + request.onRequestSent(); + } else if (util.isBuffer(body)) { + assert(contentLength === body.byteLength, "buffer body must have content length"); + stream.cork(); + stream.write(body); + stream.uncork(); + request.onBodySent(body); + request.onRequestSent(); + } else if (util.isBlobLike(body)) { + if 
(typeof body.stream === "function") { + writeIterable({ + client, + request, + contentLength, + h2stream: stream, + expectsPayload, + body: body.stream(), + socket: client[kSocket], + header: "" + }); + } else { + writeBlob({ + body, + client, + request, + contentLength, + expectsPayload, + h2stream: stream, + header: "", + socket: client[kSocket] + }); + } + } else if (util.isStream(body)) { + writeStream({ + body, + client, + request, + contentLength, + expectsPayload, + socket: client[kSocket], + h2stream: stream, + header: "" + }); + } else if (util.isIterable(body)) { + writeIterable({ + body, + client, + request, + contentLength, + expectsPayload, + header: "", + h2stream: stream, + socket: client[kSocket] + }); + } else { + assert(false); + } + } + } + function writeStream({ h2stream, body, client, request, socket, contentLength, header, expectsPayload }) { assert(contentLength !== 0 || client[kRunning] === 0, "stream body cannot be pipelined"); + if (client[kHTTPConnVersion] === "h2") { + let onPipeData = function(chunk) { + request.onBodySent(chunk); + }; + const pipe = pipeline(body, h2stream, (err) => { + if (err) { + util.destroy(body, err); + util.destroy(h2stream, err); + } else { + request.onRequestSent(); + } + }); + pipe.on("data", onPipeData); + pipe.once("end", () => { + pipe.removeListener("data", onPipeData); + util.destroy(pipe); + }); + return; + } let finished = false; const writer = new AsyncWriter({ socket, request, contentLength, client, expectsPayload, header }); const onData = function(chunk) { @@ -10230,19 +10627,26 @@ upgrade: ${upgrade}\r } socket.on("drain", onDrain).on("error", onFinished); } - async function writeBlob({ body, client, request, socket, contentLength, header, expectsPayload }) { + async function writeBlob({ h2stream, body, client, request, socket, contentLength, header, expectsPayload }) { assert(contentLength === body.size, "blob body must have content length"); + const isH2 = client[kHTTPConnVersion] === "h2"; try { if (contentLength != null && contentLength !== body.size) { throw new RequestContentLengthMismatchError(); } const buffer = Buffer.from(await body.arrayBuffer()); - socket.cork(); - socket.write(`${header}content-length: ${contentLength}\r + if (isH2) { + h2stream.cork(); + h2stream.write(buffer); + h2stream.uncork(); + } else { + socket.cork(); + socket.write(`${header}content-length: ${contentLength}\r \r `, "latin1"); - socket.write(buffer); - socket.uncork(); + socket.write(buffer); + socket.uncork(); + } request.onBodySent(buffer); request.onRequestSent(); if (!expectsPayload) { @@ -10250,10 +10654,10 @@ upgrade: ${upgrade}\r } resume(client); } catch (err) { - util.destroy(socket, err); + util.destroy(isH2 ? 
h2stream : socket, err); } } - async function writeIterable({ body, client, request, socket, contentLength, header, expectsPayload }) { + async function writeIterable({ h2stream, body, client, request, socket, contentLength, header, expectsPayload }) { assert(contentLength !== 0 || client[kRunning] === 0, "iterator body cannot be pipelined"); let callback = null; function onDrain() { @@ -10271,6 +10675,24 @@ upgrade: ${upgrade}\r callback = resolve; } }); + if (client[kHTTPConnVersion] === "h2") { + h2stream.on("close", onDrain).on("drain", onDrain); + try { + for await (const chunk of body) { + if (socket[kError]) { + throw socket[kError]; + } + if (!h2stream.write(chunk)) { + await waitForDrain(); + } + } + } catch (err) { + h2stream.destroy(err); + } finally { + h2stream.off("close", onDrain).off("drain", onDrain); + } + return; + } socket.on("close", onDrain).on("drain", onDrain); const writer = new AsyncWriter({ socket, request, contentLength, client, expectsPayload, header }); try { @@ -10442,6 +10864,7 @@ var require_pool = __commonJS({ socketPath, autoSelectFamily, autoSelectFamilyAttemptTimeout, + allowH2, ...options } = {}) { super(); @@ -10458,6 +10881,7 @@ var require_pool = __commonJS({ connect = buildConnector({ ...tls, maxCachedSessions, + allowH2, socketPath, timeout: connectTimeout == null ? 1e4 : connectTimeout, ...util.nodeHasAutoSelectFamily && autoSelectFamily ? { autoSelectFamily, autoSelectFamilyAttemptTimeout } : void 0, @@ -10467,7 +10891,7 @@ var require_pool = __commonJS({ this[kInterceptors] = options.interceptors && options.interceptors.Pool && Array.isArray(options.interceptors.Pool) ? options.interceptors.Pool : []; this[kConnections] = connections || null; this[kUrl] = util.parseOrigin(origin); - this[kOptions] = { ...util.deepClone(options), connect }; + this[kOptions] = { ...util.deepClone(options), connect, allowH2 }; this[kOptions].interceptors = options.interceptors ? 
{ ...options.interceptors } : void 0; this[kFactory] = factory; } @@ -11383,7 +11807,7 @@ var require_fetch = __commonJS({ } catch (err) { if (err.name === "AbortError") { fetchParams.controller.connection.destroy(); - return makeAppropriateNetworkError(fetchParams); + return makeAppropriateNetworkError(fetchParams, err); } return makeNetworkError(err); } @@ -11498,15 +11922,28 @@ var require_fetch = __commonJS({ let codings = []; let location = ""; const headers = new Headers(); - for (let n = 0; n < headersList.length; n += 2) { - const key = headersList[n + 0].toString("latin1"); - const val = headersList[n + 1].toString("latin1"); - if (key.toLowerCase() === "content-encoding") { - codings = val.toLowerCase().split(",").map((x) => x.trim()).reverse(); - } else if (key.toLowerCase() === "location") { - location = val; + if (Array.isArray(headersList)) { + for (let n = 0; n < headersList.length; n += 2) { + const key = headersList[n + 0].toString("latin1"); + const val = headersList[n + 1].toString("latin1"); + if (key.toLowerCase() === "content-encoding") { + codings = val.toLowerCase().split(",").map((x) => x.trim()); + } else if (key.toLowerCase() === "location") { + location = val; + } + headers.append(key, val); + } + } else { + const keys = Object.keys(headersList); + for (const key of keys) { + const val = headersList[key]; + if (key.toLowerCase() === "content-encoding") { + codings = val.toLowerCase().split(",").map((x) => x.trim()).reverse(); + } else if (key.toLowerCase() === "location") { + location = val; + } + headers.append(key, val); } - headers.append(key, val); } this.body = new Readable({ read: resume }); const decoders = []; @@ -11591,11 +12028,1194 @@ var require_fetch = __commonJS({ } }); +// lib/websocket/constants.js +var require_constants3 = __commonJS({ + "lib/websocket/constants.js"(exports2, module2) { + "use strict"; + var uid = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"; + var staticPropertyDescriptors = { + enumerable: true, + writable: false, + configurable: false + }; + var states = { + CONNECTING: 0, + OPEN: 1, + CLOSING: 2, + CLOSED: 3 + }; + var opcodes = { + CONTINUATION: 0, + TEXT: 1, + BINARY: 2, + CLOSE: 8, + PING: 9, + PONG: 10 + }; + var maxUnsigned16Bit = 2 ** 16 - 1; + var parserStates = { + INFO: 0, + PAYLOADLENGTH_16: 2, + PAYLOADLENGTH_64: 3, + READ_DATA: 4 + }; + var emptyBuffer = Buffer.allocUnsafe(0); + module2.exports = { + uid, + staticPropertyDescriptors, + states, + opcodes, + maxUnsigned16Bit, + parserStates, + emptyBuffer + }; + } +}); + +// lib/websocket/symbols.js +var require_symbols3 = __commonJS({ + "lib/websocket/symbols.js"(exports2, module2) { + "use strict"; + module2.exports = { + kWebSocketURL: Symbol("url"), + kReadyState: Symbol("ready state"), + kController: Symbol("controller"), + kResponse: Symbol("response"), + kBinaryType: Symbol("binary type"), + kSentClose: Symbol("sent close"), + kReceivedClose: Symbol("received close"), + kByteParser: Symbol("byte parser") + }; + } +}); + +// lib/websocket/events.js +var require_events = __commonJS({ + "lib/websocket/events.js"(exports2, module2) { + "use strict"; + var { webidl } = require_webidl(); + var { kEnumerableProperty } = require_util(); + var { MessagePort } = require("worker_threads"); + var MessageEvent = class extends Event { + #eventInit; + constructor(type, eventInitDict = {}) { + webidl.argumentLengthCheck(arguments, 1, { header: "MessageEvent constructor" }); + type = webidl.converters.DOMString(type); + eventInitDict = 
webidl.converters.MessageEventInit(eventInitDict); + super(type, eventInitDict); + this.#eventInit = eventInitDict; + } + get data() { + webidl.brandCheck(this, MessageEvent); + return this.#eventInit.data; + } + get origin() { + webidl.brandCheck(this, MessageEvent); + return this.#eventInit.origin; + } + get lastEventId() { + webidl.brandCheck(this, MessageEvent); + return this.#eventInit.lastEventId; + } + get source() { + webidl.brandCheck(this, MessageEvent); + return this.#eventInit.source; + } + get ports() { + webidl.brandCheck(this, MessageEvent); + if (!Object.isFrozen(this.#eventInit.ports)) { + Object.freeze(this.#eventInit.ports); + } + return this.#eventInit.ports; + } + initMessageEvent(type, bubbles = false, cancelable = false, data = null, origin = "", lastEventId = "", source = null, ports = []) { + webidl.brandCheck(this, MessageEvent); + webidl.argumentLengthCheck(arguments, 1, { header: "MessageEvent.initMessageEvent" }); + return new MessageEvent(type, { + bubbles, + cancelable, + data, + origin, + lastEventId, + source, + ports + }); + } + }; + var CloseEvent = class extends Event { + #eventInit; + constructor(type, eventInitDict = {}) { + webidl.argumentLengthCheck(arguments, 1, { header: "CloseEvent constructor" }); + type = webidl.converters.DOMString(type); + eventInitDict = webidl.converters.CloseEventInit(eventInitDict); + super(type, eventInitDict); + this.#eventInit = eventInitDict; + } + get wasClean() { + webidl.brandCheck(this, CloseEvent); + return this.#eventInit.wasClean; + } + get code() { + webidl.brandCheck(this, CloseEvent); + return this.#eventInit.code; + } + get reason() { + webidl.brandCheck(this, CloseEvent); + return this.#eventInit.reason; + } + }; + var ErrorEvent = class extends Event { + #eventInit; + constructor(type, eventInitDict) { + webidl.argumentLengthCheck(arguments, 1, { header: "ErrorEvent constructor" }); + super(type, eventInitDict); + type = webidl.converters.DOMString(type); + eventInitDict = webidl.converters.ErrorEventInit(eventInitDict ?? 
{}); + this.#eventInit = eventInitDict; + } + get message() { + webidl.brandCheck(this, ErrorEvent); + return this.#eventInit.message; + } + get filename() { + webidl.brandCheck(this, ErrorEvent); + return this.#eventInit.filename; + } + get lineno() { + webidl.brandCheck(this, ErrorEvent); + return this.#eventInit.lineno; + } + get colno() { + webidl.brandCheck(this, ErrorEvent); + return this.#eventInit.colno; + } + get error() { + webidl.brandCheck(this, ErrorEvent); + return this.#eventInit.error; + } + }; + Object.defineProperties(MessageEvent.prototype, { + [Symbol.toStringTag]: { + value: "MessageEvent", + configurable: true + }, + data: kEnumerableProperty, + origin: kEnumerableProperty, + lastEventId: kEnumerableProperty, + source: kEnumerableProperty, + ports: kEnumerableProperty, + initMessageEvent: kEnumerableProperty + }); + Object.defineProperties(CloseEvent.prototype, { + [Symbol.toStringTag]: { + value: "CloseEvent", + configurable: true + }, + reason: kEnumerableProperty, + code: kEnumerableProperty, + wasClean: kEnumerableProperty + }); + Object.defineProperties(ErrorEvent.prototype, { + [Symbol.toStringTag]: { + value: "ErrorEvent", + configurable: true + }, + message: kEnumerableProperty, + filename: kEnumerableProperty, + lineno: kEnumerableProperty, + colno: kEnumerableProperty, + error: kEnumerableProperty + }); + webidl.converters.MessagePort = webidl.interfaceConverter(MessagePort); + webidl.converters["sequence"] = webidl.sequenceConverter(webidl.converters.MessagePort); + var eventInit = [ + { + key: "bubbles", + converter: webidl.converters.boolean, + defaultValue: false + }, + { + key: "cancelable", + converter: webidl.converters.boolean, + defaultValue: false + }, + { + key: "composed", + converter: webidl.converters.boolean, + defaultValue: false + } + ]; + webidl.converters.MessageEventInit = webidl.dictionaryConverter([ + ...eventInit, + { + key: "data", + converter: webidl.converters.any, + defaultValue: null + }, + { + key: "origin", + converter: webidl.converters.USVString, + defaultValue: "" + }, + { + key: "lastEventId", + converter: webidl.converters.DOMString, + defaultValue: "" + }, + { + key: "source", + converter: webidl.nullableConverter(webidl.converters.MessagePort), + defaultValue: null + }, + { + key: "ports", + converter: webidl.converters["sequence"], + get defaultValue() { + return []; + } + } + ]); + webidl.converters.CloseEventInit = webidl.dictionaryConverter([ + ...eventInit, + { + key: "wasClean", + converter: webidl.converters.boolean, + defaultValue: false + }, + { + key: "code", + converter: webidl.converters["unsigned short"], + defaultValue: 0 + }, + { + key: "reason", + converter: webidl.converters.USVString, + defaultValue: "" + } + ]); + webidl.converters.ErrorEventInit = webidl.dictionaryConverter([ + ...eventInit, + { + key: "message", + converter: webidl.converters.DOMString, + defaultValue: "" + }, + { + key: "filename", + converter: webidl.converters.USVString, + defaultValue: "" + }, + { + key: "lineno", + converter: webidl.converters["unsigned long"], + defaultValue: 0 + }, + { + key: "colno", + converter: webidl.converters["unsigned long"], + defaultValue: 0 + }, + { + key: "error", + converter: webidl.converters.any + } + ]); + module2.exports = { + MessageEvent, + CloseEvent, + ErrorEvent + }; + } +}); + +// lib/websocket/util.js +var require_util3 = __commonJS({ + "lib/websocket/util.js"(exports2, module2) { + "use strict"; + var { kReadyState, kController, kResponse, kBinaryType, kWebSocketURL } = 
require_symbols3(); + var { states, opcodes } = require_constants3(); + var { MessageEvent, ErrorEvent } = require_events(); + function isEstablished(ws) { + return ws[kReadyState] === states.OPEN; + } + function isClosing(ws) { + return ws[kReadyState] === states.CLOSING; + } + function isClosed(ws) { + return ws[kReadyState] === states.CLOSED; + } + function fireEvent(e, target, eventConstructor = Event, eventInitDict) { + const event = new eventConstructor(e, eventInitDict); + target.dispatchEvent(event); + } + function websocketMessageReceived(ws, type, data) { + if (ws[kReadyState] !== states.OPEN) { + return; + } + let dataForEvent; + if (type === opcodes.TEXT) { + try { + dataForEvent = new TextDecoder("utf-8", { fatal: true }).decode(data); + } catch { + failWebsocketConnection(ws, "Received invalid UTF-8 in text frame."); + return; + } + } else if (type === opcodes.BINARY) { + if (ws[kBinaryType] === "blob") { + dataForEvent = new Blob([data]); + } else { + dataForEvent = new Uint8Array(data).buffer; + } + } + fireEvent("message", ws, MessageEvent, { + origin: ws[kWebSocketURL].origin, + data: dataForEvent + }); + } + function isValidSubprotocol(protocol) { + if (protocol.length === 0) { + return false; + } + for (const char of protocol) { + const code = char.charCodeAt(0); + if (code < 33 || code > 126 || char === "(" || char === ")" || char === "<" || char === ">" || char === "@" || char === "," || char === ";" || char === ":" || char === "\\" || char === '"' || char === "/" || char === "[" || char === "]" || char === "?" || char === "=" || char === "{" || char === "}" || code === 32 || code === 9) { + return false; + } + } + return true; + } + function isValidStatusCode(code) { + if (code >= 1e3 && code < 1015) { + return code !== 1004 && code !== 1005 && code !== 1006; + } + return code >= 3e3 && code <= 4999; + } + function failWebsocketConnection(ws, reason) { + const { [kController]: controller, [kResponse]: response } = ws; + controller.abort(); + if (response?.socket && !response.socket.destroyed) { + response.socket.destroy(); + } + if (reason) { + fireEvent("error", ws, ErrorEvent, { + error: new Error(reason) + }); + } + } + module2.exports = { + isEstablished, + isClosing, + isClosed, + fireEvent, + isValidSubprotocol, + isValidStatusCode, + failWebsocketConnection, + websocketMessageReceived + }; + } +}); + +// lib/websocket/connection.js +var require_connection = __commonJS({ + "lib/websocket/connection.js"(exports2, module2) { + "use strict"; + var diagnosticsChannel = require("diagnostics_channel"); + var { uid, states } = require_constants3(); + var { + kReadyState, + kSentClose, + kByteParser, + kReceivedClose + } = require_symbols3(); + var { fireEvent, failWebsocketConnection } = require_util3(); + var { CloseEvent } = require_events(); + var { makeRequest } = require_request(); + var { fetching } = require_fetch(); + var { Headers } = require_headers(); + var { getGlobalDispatcher } = require_global2(); + var { kHeadersList } = require_symbols(); + var channels = {}; + channels.open = diagnosticsChannel.channel("undici:websocket:open"); + channels.close = diagnosticsChannel.channel("undici:websocket:close"); + channels.socketError = diagnosticsChannel.channel("undici:websocket:socket_error"); + var crypto; + try { + crypto = require("crypto"); + } catch { + } + function establishWebSocketConnection(url, protocols, ws, onEstablish, options) { + const requestURL = url; + requestURL.protocol = url.protocol === "ws:" ? 
"http:" : "https:"; + const request = makeRequest({ + urlList: [requestURL], + serviceWorkers: "none", + referrer: "no-referrer", + mode: "websocket", + credentials: "include", + cache: "no-store", + redirect: "error" + }); + if (options.headers) { + const headersList = new Headers(options.headers)[kHeadersList]; + request.headersList = headersList; + } + const keyValue = crypto.randomBytes(16).toString("base64"); + request.headersList.append("sec-websocket-key", keyValue); + request.headersList.append("sec-websocket-version", "13"); + for (const protocol of protocols) { + request.headersList.append("sec-websocket-protocol", protocol); + } + const permessageDeflate = ""; + const controller = fetching({ + request, + useParallelQueue: true, + dispatcher: options.dispatcher ?? getGlobalDispatcher(), + processResponse(response) { + if (response.type === "error" || response.status !== 101) { + failWebsocketConnection(ws, "Received network error or non-101 status code."); + return; + } + if (protocols.length !== 0 && !response.headersList.get("Sec-WebSocket-Protocol")) { + failWebsocketConnection(ws, "Server did not respond with sent protocols."); + return; + } + if (response.headersList.get("Upgrade")?.toLowerCase() !== "websocket") { + failWebsocketConnection(ws, 'Server did not set Upgrade header to "websocket".'); + return; + } + if (response.headersList.get("Connection")?.toLowerCase() !== "upgrade") { + failWebsocketConnection(ws, 'Server did not set Connection header to "upgrade".'); + return; + } + const secWSAccept = response.headersList.get("Sec-WebSocket-Accept"); + const digest = crypto.createHash("sha1").update(keyValue + uid).digest("base64"); + if (secWSAccept !== digest) { + failWebsocketConnection(ws, "Incorrect hash received in Sec-WebSocket-Accept header."); + return; + } + const secExtension = response.headersList.get("Sec-WebSocket-Extensions"); + if (secExtension !== null && secExtension !== permessageDeflate) { + failWebsocketConnection(ws, "Received different permessage-deflate than the one set."); + return; + } + const secProtocol = response.headersList.get("Sec-WebSocket-Protocol"); + if (secProtocol !== null && secProtocol !== request.headersList.get("Sec-WebSocket-Protocol")) { + failWebsocketConnection(ws, "Protocol was not set in the opening handshake."); + return; + } + response.socket.on("data", onSocketData); + response.socket.on("close", onSocketClose); + response.socket.on("error", onSocketError); + if (channels.open.hasSubscribers) { + channels.open.publish({ + address: response.socket.address(), + protocol: secProtocol, + extensions: secExtension + }); + } + onEstablish(response); + } + }); + return controller; + } + function onSocketData(chunk) { + if (!this.ws[kByteParser].write(chunk)) { + this.pause(); + } + } + function onSocketClose() { + const { ws } = this; + const wasClean = ws[kSentClose] && ws[kReceivedClose]; + let code = 1005; + let reason = ""; + const result = ws[kByteParser].closingInfo; + if (result) { + code = result.code ?? 
1005; + reason = result.reason; + } else if (!ws[kSentClose]) { + code = 1006; + } + ws[kReadyState] = states.CLOSED; + fireEvent("close", ws, CloseEvent, { + wasClean, + code, + reason + }); + if (channels.close.hasSubscribers) { + channels.close.publish({ + websocket: ws, + code, + reason + }); + } + } + function onSocketError(error) { + const { ws } = this; + ws[kReadyState] = states.CLOSING; + if (channels.socketError.hasSubscribers) { + channels.socketError.publish(error); + } + this.destroy(); + } + module2.exports = { + establishWebSocketConnection + }; + } +}); + +// lib/websocket/frame.js +var require_frame = __commonJS({ + "lib/websocket/frame.js"(exports2, module2) { + "use strict"; + var { maxUnsigned16Bit } = require_constants3(); + var crypto; + try { + crypto = require("crypto"); + } catch { + } + var WebsocketFrameSend = class { + constructor(data) { + this.frameData = data; + this.maskKey = crypto.randomBytes(4); + } + createFrame(opcode) { + const bodyLength = this.frameData?.byteLength ?? 0; + let payloadLength = bodyLength; + let offset = 6; + if (bodyLength > maxUnsigned16Bit) { + offset += 8; + payloadLength = 127; + } else if (bodyLength > 125) { + offset += 2; + payloadLength = 126; + } + const buffer = Buffer.allocUnsafe(bodyLength + offset); + buffer[0] = buffer[1] = 0; + buffer[0] |= 128; + buffer[0] = (buffer[0] & 240) + opcode; + buffer[offset - 4] = this.maskKey[0]; + buffer[offset - 3] = this.maskKey[1]; + buffer[offset - 2] = this.maskKey[2]; + buffer[offset - 1] = this.maskKey[3]; + buffer[1] = payloadLength; + if (payloadLength === 126) { + buffer.writeUInt16BE(bodyLength, 2); + } else if (payloadLength === 127) { + buffer[2] = buffer[3] = 0; + buffer.writeUIntBE(bodyLength, 4, 6); + } + buffer[1] |= 128; + for (let i = 0; i < bodyLength; i++) { + buffer[offset + i] = this.frameData[i] ^ this.maskKey[i % 4]; + } + return buffer; + } + }; + module2.exports = { + WebsocketFrameSend + }; + } +}); + +// lib/websocket/receiver.js +var require_receiver = __commonJS({ + "lib/websocket/receiver.js"(exports2, module2) { + "use strict"; + var { Writable } = require("stream"); + var diagnosticsChannel = require("diagnostics_channel"); + var { parserStates, opcodes, states, emptyBuffer } = require_constants3(); + var { kReadyState, kSentClose, kResponse, kReceivedClose } = require_symbols3(); + var { isValidStatusCode, failWebsocketConnection, websocketMessageReceived } = require_util3(); + var { WebsocketFrameSend } = require_frame(); + var channels = {}; + channels.ping = diagnosticsChannel.channel("undici:websocket:ping"); + channels.pong = diagnosticsChannel.channel("undici:websocket:pong"); + var ByteParser = class extends Writable { + #buffers = []; + #byteOffset = 0; + #state = parserStates.INFO; + #info = {}; + #fragments = []; + constructor(ws) { + super(); + this.ws = ws; + } + _write(chunk, _, callback) { + this.#buffers.push(chunk); + this.#byteOffset += chunk.length; + this.run(callback); + } + run(callback) { + while (true) { + if (this.#state === parserStates.INFO) { + if (this.#byteOffset < 2) { + return callback(); + } + const buffer = this.consume(2); + this.#info.fin = (buffer[0] & 128) !== 0; + this.#info.opcode = buffer[0] & 15; + this.#info.originalOpcode ??= this.#info.opcode; + this.#info.fragmented = !this.#info.fin && this.#info.opcode !== opcodes.CONTINUATION; + if (this.#info.fragmented && this.#info.opcode !== opcodes.BINARY && this.#info.opcode !== opcodes.TEXT) { + failWebsocketConnection(this.ws, "Invalid frame type was fragmented."); + 
return; + } + const payloadLength = buffer[1] & 127; + if (payloadLength <= 125) { + this.#info.payloadLength = payloadLength; + this.#state = parserStates.READ_DATA; + } else if (payloadLength === 126) { + this.#state = parserStates.PAYLOADLENGTH_16; + } else if (payloadLength === 127) { + this.#state = parserStates.PAYLOADLENGTH_64; + } + if (this.#info.fragmented && payloadLength > 125) { + failWebsocketConnection(this.ws, "Fragmented frame exceeded 125 bytes."); + return; + } else if ((this.#info.opcode === opcodes.PING || this.#info.opcode === opcodes.PONG || this.#info.opcode === opcodes.CLOSE) && payloadLength > 125) { + failWebsocketConnection(this.ws, "Payload length for control frame exceeded 125 bytes."); + return; + } else if (this.#info.opcode === opcodes.CLOSE) { + if (payloadLength === 1) { + failWebsocketConnection(this.ws, "Received close frame with a 1-byte body."); + return; + } + const body = this.consume(payloadLength); + this.#info.closeInfo = this.parseCloseBody(false, body); + if (!this.ws[kSentClose]) { + const body2 = Buffer.allocUnsafe(2); + body2.writeUInt16BE(this.#info.closeInfo.code, 0); + const closeFrame = new WebsocketFrameSend(body2); + this.ws[kResponse].socket.write(closeFrame.createFrame(opcodes.CLOSE), (err) => { + if (!err) { + this.ws[kSentClose] = true; + } + }); + } + this.ws[kReadyState] = states.CLOSING; + this.ws[kReceivedClose] = true; + this.end(); + return; + } else if (this.#info.opcode === opcodes.PING) { + const body = this.consume(payloadLength); + if (!this.ws[kReceivedClose]) { + const frame = new WebsocketFrameSend(body); + this.ws[kResponse].socket.write(frame.createFrame(opcodes.PONG)); + if (channels.ping.hasSubscribers) { + channels.ping.publish({ + payload: body + }); + } + } + this.#state = parserStates.INFO; + if (this.#byteOffset > 0) { + continue; + } else { + callback(); + return; + } + } else if (this.#info.opcode === opcodes.PONG) { + const body = this.consume(payloadLength); + if (channels.pong.hasSubscribers) { + channels.pong.publish({ + payload: body + }); + } + if (this.#byteOffset > 0) { + continue; + } else { + callback(); + return; + } + } + } else if (this.#state === parserStates.PAYLOADLENGTH_16) { + if (this.#byteOffset < 2) { + return callback(); + } + const buffer = this.consume(2); + this.#info.payloadLength = buffer.readUInt16BE(0); + this.#state = parserStates.READ_DATA; + } else if (this.#state === parserStates.PAYLOADLENGTH_64) { + if (this.#byteOffset < 8) { + return callback(); + } + const buffer = this.consume(8); + const upper = buffer.readUInt32BE(0); + if (upper > 2 ** 31 - 1) { + failWebsocketConnection(this.ws, "Received payload length > 2^31 bytes."); + return; + } + const lower = buffer.readUInt32BE(4); + this.#info.payloadLength = (upper << 8) + lower; + this.#state = parserStates.READ_DATA; + } else if (this.#state === parserStates.READ_DATA) { + if (this.#byteOffset < this.#info.payloadLength) { + return callback(); + } else if (this.#byteOffset >= this.#info.payloadLength) { + const body = this.consume(this.#info.payloadLength); + this.#fragments.push(body); + if (!this.#info.fragmented || this.#info.fin && this.#info.opcode === opcodes.CONTINUATION) { + const fullMessage = Buffer.concat(this.#fragments); + websocketMessageReceived(this.ws, this.#info.originalOpcode, fullMessage); + this.#info = {}; + this.#fragments.length = 0; + } + this.#state = parserStates.INFO; + } + } + if (this.#byteOffset > 0) { + continue; + } else { + callback(); + break; + } + } + } + consume(n) { + if (n > 
this.#byteOffset) { + return null; + } else if (n === 0) { + return emptyBuffer; + } + if (this.#buffers[0].length === n) { + this.#byteOffset -= this.#buffers[0].length; + return this.#buffers.shift(); + } + const buffer = Buffer.allocUnsafe(n); + let offset = 0; + while (offset !== n) { + const next = this.#buffers[0]; + const { length } = next; + if (length + offset === n) { + buffer.set(this.#buffers.shift(), offset); + break; + } else if (length + offset > n) { + buffer.set(next.subarray(0, n - offset), offset); + this.#buffers[0] = next.subarray(n - offset); + break; + } else { + buffer.set(this.#buffers.shift(), offset); + offset += next.length; + } + } + this.#byteOffset -= n; + return buffer; + } + parseCloseBody(onlyCode, data) { + let code; + if (data.length >= 2) { + code = data.readUInt16BE(0); + } + if (onlyCode) { + if (!isValidStatusCode(code)) { + return null; + } + return { code }; + } + let reason = data.subarray(2); + if (reason[0] === 239 && reason[1] === 187 && reason[2] === 191) { + reason = reason.subarray(3); + } + if (code !== void 0 && !isValidStatusCode(code)) { + return null; + } + try { + reason = new TextDecoder("utf-8", { fatal: true }).decode(reason); + } catch { + return null; + } + return { code, reason }; + } + get closingInfo() { + return this.#info.closeInfo; + } + }; + module2.exports = { + ByteParser + }; + } +}); + +// lib/websocket/websocket.js +var require_websocket = __commonJS({ + "lib/websocket/websocket.js"(exports2, module2) { + "use strict"; + var { webidl } = require_webidl(); + var { DOMException } = require_constants(); + var { URLSerializer } = require_dataURL(); + var { getGlobalOrigin } = require_global(); + var { staticPropertyDescriptors, states, opcodes, emptyBuffer } = require_constants3(); + var { + kWebSocketURL, + kReadyState, + kController, + kBinaryType, + kResponse, + kSentClose, + kByteParser + } = require_symbols3(); + var { isEstablished, isClosing, isValidSubprotocol, failWebsocketConnection, fireEvent } = require_util3(); + var { establishWebSocketConnection } = require_connection(); + var { WebsocketFrameSend } = require_frame(); + var { ByteParser } = require_receiver(); + var { kEnumerableProperty, isBlobLike } = require_util(); + var { getGlobalDispatcher } = require_global2(); + var { types } = require("util"); + var experimentalWarned = false; + var WebSocket = class extends EventTarget { + #events = { + open: null, + error: null, + close: null, + message: null + }; + #bufferedAmount = 0; + #protocol = ""; + #extensions = ""; + constructor(url, protocols = []) { + super(); + webidl.argumentLengthCheck(arguments, 1, { header: "WebSocket constructor" }); + if (!experimentalWarned) { + experimentalWarned = true; + process.emitWarning("WebSockets are experimental, expect them to change at any time.", { + code: "UNDICI-WS" + }); + } + const options = webidl.converters["DOMString or sequence or WebSocketInit"](protocols); + url = webidl.converters.USVString(url); + protocols = options.protocols; + const baseURL = getGlobalOrigin(); + let urlRecord; + try { + urlRecord = new URL(url, baseURL); + } catch (e) { + throw new DOMException(e, "SyntaxError"); + } + if (urlRecord.protocol === "http:") { + urlRecord.protocol = "ws:"; + } else if (urlRecord.protocol === "https:") { + urlRecord.protocol = "wss:"; + } + if (urlRecord.protocol !== "ws:" && urlRecord.protocol !== "wss:") { + throw new DOMException(`Expected a ws: or wss: protocol, got ${urlRecord.protocol}`, "SyntaxError"); + } + if (urlRecord.hash || 
urlRecord.href.endsWith("#")) { + throw new DOMException("Got fragment", "SyntaxError"); + } + if (typeof protocols === "string") { + protocols = [protocols]; + } + if (protocols.length !== new Set(protocols.map((p) => p.toLowerCase())).size) { + throw new DOMException("Invalid Sec-WebSocket-Protocol value", "SyntaxError"); + } + if (protocols.length > 0 && !protocols.every((p) => isValidSubprotocol(p))) { + throw new DOMException("Invalid Sec-WebSocket-Protocol value", "SyntaxError"); + } + this[kWebSocketURL] = new URL(urlRecord.href); + this[kController] = establishWebSocketConnection(urlRecord, protocols, this, (response) => this.#onConnectionEstablished(response), options); + this[kReadyState] = WebSocket.CONNECTING; + this[kBinaryType] = "blob"; + } + close(code = void 0, reason = void 0) { + webidl.brandCheck(this, WebSocket); + if (code !== void 0) { + code = webidl.converters["unsigned short"](code, { clamp: true }); + } + if (reason !== void 0) { + reason = webidl.converters.USVString(reason); + } + if (code !== void 0) { + if (code !== 1e3 && (code < 3e3 || code > 4999)) { + throw new DOMException("invalid code", "InvalidAccessError"); + } + } + let reasonByteLength = 0; + if (reason !== void 0) { + reasonByteLength = Buffer.byteLength(reason); + if (reasonByteLength > 123) { + throw new DOMException(`Reason must be less than 123 bytes; received ${reasonByteLength}`, "SyntaxError"); + } + } + if (this[kReadyState] === WebSocket.CLOSING || this[kReadyState] === WebSocket.CLOSED) { + } else if (!isEstablished(this)) { + failWebsocketConnection(this, "Connection was closed before it was established."); + this[kReadyState] = WebSocket.CLOSING; + } else if (!isClosing(this)) { + const frame = new WebsocketFrameSend(); + if (code !== void 0 && reason === void 0) { + frame.frameData = Buffer.allocUnsafe(2); + frame.frameData.writeUInt16BE(code, 0); + } else if (code !== void 0 && reason !== void 0) { + frame.frameData = Buffer.allocUnsafe(2 + reasonByteLength); + frame.frameData.writeUInt16BE(code, 0); + frame.frameData.write(reason, 2, "utf-8"); + } else { + frame.frameData = emptyBuffer; + } + const socket = this[kResponse].socket; + socket.write(frame.createFrame(opcodes.CLOSE), (err) => { + if (!err) { + this[kSentClose] = true; + } + }); + this[kReadyState] = states.CLOSING; + } else { + this[kReadyState] = WebSocket.CLOSING; + } + } + send(data) { + webidl.brandCheck(this, WebSocket); + webidl.argumentLengthCheck(arguments, 1, { header: "WebSocket.send" }); + data = webidl.converters.WebSocketSendData(data); + if (this[kReadyState] === WebSocket.CONNECTING) { + throw new DOMException("Sent before connected.", "InvalidStateError"); + } + if (!isEstablished(this) || isClosing(this)) { + return; + } + const socket = this[kResponse].socket; + if (typeof data === "string") { + const value = Buffer.from(data); + const frame = new WebsocketFrameSend(value); + const buffer = frame.createFrame(opcodes.TEXT); + this.#bufferedAmount += value.byteLength; + socket.write(buffer, () => { + this.#bufferedAmount -= value.byteLength; + }); + } else if (types.isArrayBuffer(data)) { + const value = Buffer.from(data); + const frame = new WebsocketFrameSend(value); + const buffer = frame.createFrame(opcodes.BINARY); + this.#bufferedAmount += value.byteLength; + socket.write(buffer, () => { + this.#bufferedAmount -= value.byteLength; + }); + } else if (ArrayBuffer.isView(data)) { + const ab = Buffer.from(data, data.byteOffset, data.byteLength); + const frame = new WebsocketFrameSend(ab); + const buffer 
= frame.createFrame(opcodes.BINARY); + this.#bufferedAmount += ab.byteLength; + socket.write(buffer, () => { + this.#bufferedAmount -= ab.byteLength; + }); + } else if (isBlobLike(data)) { + const frame = new WebsocketFrameSend(); + data.arrayBuffer().then((ab) => { + const value = Buffer.from(ab); + frame.frameData = value; + const buffer = frame.createFrame(opcodes.BINARY); + this.#bufferedAmount += value.byteLength; + socket.write(buffer, () => { + this.#bufferedAmount -= value.byteLength; + }); + }); + } + } + get readyState() { + webidl.brandCheck(this, WebSocket); + return this[kReadyState]; + } + get bufferedAmount() { + webidl.brandCheck(this, WebSocket); + return this.#bufferedAmount; + } + get url() { + webidl.brandCheck(this, WebSocket); + return URLSerializer(this[kWebSocketURL]); + } + get extensions() { + webidl.brandCheck(this, WebSocket); + return this.#extensions; + } + get protocol() { + webidl.brandCheck(this, WebSocket); + return this.#protocol; + } + get onopen() { + webidl.brandCheck(this, WebSocket); + return this.#events.open; + } + set onopen(fn) { + webidl.brandCheck(this, WebSocket); + if (this.#events.open) { + this.removeEventListener("open", this.#events.open); + } + if (typeof fn === "function") { + this.#events.open = fn; + this.addEventListener("open", fn); + } else { + this.#events.open = null; + } + } + get onerror() { + webidl.brandCheck(this, WebSocket); + return this.#events.error; + } + set onerror(fn) { + webidl.brandCheck(this, WebSocket); + if (this.#events.error) { + this.removeEventListener("error", this.#events.error); + } + if (typeof fn === "function") { + this.#events.error = fn; + this.addEventListener("error", fn); + } else { + this.#events.error = null; + } + } + get onclose() { + webidl.brandCheck(this, WebSocket); + return this.#events.close; + } + set onclose(fn) { + webidl.brandCheck(this, WebSocket); + if (this.#events.close) { + this.removeEventListener("close", this.#events.close); + } + if (typeof fn === "function") { + this.#events.close = fn; + this.addEventListener("close", fn); + } else { + this.#events.close = null; + } + } + get onmessage() { + webidl.brandCheck(this, WebSocket); + return this.#events.message; + } + set onmessage(fn) { + webidl.brandCheck(this, WebSocket); + if (this.#events.message) { + this.removeEventListener("message", this.#events.message); + } + if (typeof fn === "function") { + this.#events.message = fn; + this.addEventListener("message", fn); + } else { + this.#events.message = null; + } + } + get binaryType() { + webidl.brandCheck(this, WebSocket); + return this[kBinaryType]; + } + set binaryType(type) { + webidl.brandCheck(this, WebSocket); + if (type !== "blob" && type !== "arraybuffer") { + this[kBinaryType] = "blob"; + } else { + this[kBinaryType] = type; + } + } + #onConnectionEstablished(response) { + this[kResponse] = response; + const parser = new ByteParser(this); + parser.on("drain", function onParserDrain() { + this.ws[kResponse].socket.resume(); + }); + response.socket.ws = this; + this[kByteParser] = parser; + this[kReadyState] = states.OPEN; + const extensions = response.headersList.get("sec-websocket-extensions"); + if (extensions !== null) { + this.#extensions = extensions; + } + const protocol = response.headersList.get("sec-websocket-protocol"); + if (protocol !== null) { + this.#protocol = protocol; + } + fireEvent("open", this); + } + }; + WebSocket.CONNECTING = WebSocket.prototype.CONNECTING = states.CONNECTING; + WebSocket.OPEN = WebSocket.prototype.OPEN = states.OPEN; + 
WebSocket.CLOSING = WebSocket.prototype.CLOSING = states.CLOSING; + WebSocket.CLOSED = WebSocket.prototype.CLOSED = states.CLOSED; + Object.defineProperties(WebSocket.prototype, { + CONNECTING: staticPropertyDescriptors, + OPEN: staticPropertyDescriptors, + CLOSING: staticPropertyDescriptors, + CLOSED: staticPropertyDescriptors, + url: kEnumerableProperty, + readyState: kEnumerableProperty, + bufferedAmount: kEnumerableProperty, + onopen: kEnumerableProperty, + onerror: kEnumerableProperty, + onclose: kEnumerableProperty, + close: kEnumerableProperty, + onmessage: kEnumerableProperty, + binaryType: kEnumerableProperty, + send: kEnumerableProperty, + extensions: kEnumerableProperty, + protocol: kEnumerableProperty, + [Symbol.toStringTag]: { + value: "WebSocket", + writable: false, + enumerable: false, + configurable: true + } + }); + Object.defineProperties(WebSocket, { + CONNECTING: staticPropertyDescriptors, + OPEN: staticPropertyDescriptors, + CLOSING: staticPropertyDescriptors, + CLOSED: staticPropertyDescriptors + }); + webidl.converters["sequence"] = webidl.sequenceConverter(webidl.converters.DOMString); + webidl.converters["DOMString or sequence"] = function(V) { + if (webidl.util.Type(V) === "Object" && Symbol.iterator in V) { + return webidl.converters["sequence"](V); + } + return webidl.converters.DOMString(V); + }; + webidl.converters.WebSocketInit = webidl.dictionaryConverter([ + { + key: "protocols", + converter: webidl.converters["DOMString or sequence"], + get defaultValue() { + return []; + } + }, + { + key: "dispatcher", + converter: (V) => V, + get defaultValue() { + return getGlobalDispatcher(); + } + }, + { + key: "headers", + converter: webidl.nullableConverter(webidl.converters.HeadersInit) + } + ]); + webidl.converters["DOMString or sequence or WebSocketInit"] = function(V) { + if (webidl.util.Type(V) === "Object" && !(Symbol.iterator in V)) { + return webidl.converters.WebSocketInit(V); + } + return { protocols: webidl.converters["DOMString or sequence"](V) }; + }; + webidl.converters.WebSocketSendData = function(V) { + if (webidl.util.Type(V) === "Object") { + if (isBlobLike(V)) { + return webidl.converters.Blob(V, { strict: false }); + } + if (ArrayBuffer.isView(V) || types.isAnyArrayBuffer(V)) { + return webidl.converters.BufferSource(V); + } + } + return webidl.converters.USVString(V); + }; + module2.exports = { + WebSocket + }; + } +}); + // index-fetch.js var fetchImpl = require_fetch().fetch; -module.exports.fetch = async function fetch(resource) { +module.exports.fetch = async function fetch(resource, init = void 0) { try { - return await fetchImpl(...arguments); + return await fetchImpl(resource, init); } catch (err) { Error.captureStackTrace(err, this); throw err; @@ -11605,4 +13225,6 @@ module.exports.FormData = require_formdata().FormData; module.exports.Headers = require_headers().Headers; module.exports.Response = require_response().Response; module.exports.Request = require_request().Request; +module.exports.WebSocket = require_websocket().WebSocket; /*! formdata-polyfill. MIT License. Jimmy Wärting */ +/*! ws. MIT License. 
Einar Otto Stangvik */
diff --git a/doc/contributing/maintaining/maintaining-dependencies.md b/doc/contributing/maintaining/maintaining-dependencies.md
index a8d6e5b7ccf4c0..6f84a92ebb9385 100644
--- a/doc/contributing/maintaining/maintaining-dependencies.md
+++ b/doc/contributing/maintaining/maintaining-dependencies.md
@@ -28,7 +28,7 @@ This a list of all the dependencies:
 * [openssl 3.0.8][]
 * [postject 1.0.0-alpha.6][]
 * [simdutf 3.2.17][]
-* [undici 5.23.0][]
+* [undici 5.25.2][]
 * [uvwasi 0.0.16][]
 * [V8 11.3.244.8][]
 * [zlib 1.2.13.1-motley-f5fd0ad][]
@@ -291,7 +291,7 @@ The [postject](https://github.com/nodejs/postject) dependency is used for the
 The [simdutf](https://github.com/simdutf/simdutf) dependency is
 a C++ library for fast UTF-8 decoding and encoding.
 
-### undici 5.23.0
+### undici 5.25.2
 
 The [undici](https://github.com/nodejs/undici) dependency is an HTTP/1.1 client,
 written from scratch for Node.js..
@@ -345,7 +345,7 @@ performance improvements not currently available in standard zlib.
 [openssl 3.0.8]: #openssl-308
 [postject 1.0.0-alpha.6]: #postject-100-alpha6
 [simdutf 3.2.17]: #simdutf-3217
-[undici 5.23.0]: #undici-5230
+[undici 5.25.2]: #undici-5252
 [update-openssl-action]: ../../../.github/workflows/update-openssl.yml
 [uvwasi 0.0.16]: #uvwasi-0016
 [v8 11.3.244.8]: #v8-1132448
diff --git a/src/undici_version.h b/src/undici_version.h
index 47aef25a73212d..d47c6d538b7355 100644
--- a/src/undici_version.h
+++ b/src/undici_version.h
@@ -2,5 +2,5 @@
 // Refer to tools/update-undici.sh
 #ifndef SRC_UNDICI_VERSION_H_
 #define SRC_UNDICI_VERSION_H_
-#define UNDICI_VERSION "5.23.0"
+#define UNDICI_VERSION "5.25.2"
 #endif // SRC_UNDICI_VERSION_H_