
refactor(core): Centralize scaling mode (no-changelog) #9835

Merged
merged 34 commits into master from pay-1658-upgrade-to-bullmq on Aug 7, 2024

Conversation


@ivov ivov commented Jun 21, 2024

Centralize scaling mode logic to pave the way for adding tests, upgrading lib, etc.

@n8n-assistant n8n-assistant bot added core Enhancement outside /nodes-base and /editor-ui n8n team Authored by the n8n team labels Jun 21, 2024
Contributor

@krynble krynble left a comment


General direction is looking good.

I was wondering whether we could leverage ActiveExecutions' existing functionality as part of Worker.ts's runningJobsSummary, but that could just be an enhancement for the future.
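For context, a summary like that could be derived directly from a worker's map of running jobs rather than tracked separately. A minimal TypeScript sketch of the idea; the names (`RunningJob`, `runningJobsSummary`) and field shapes here are hypothetical, not taken from the n8n codebase:

```typescript
interface RunningJob {
  executionId: string;
  workflowId: string;
  startedAt: Date;
}

// Derive a lightweight status summary from the worker's map of
// currently running jobs, instead of maintaining a second structure.
function runningJobsSummary(
  jobs: Record<string, RunningJob>,
): Array<{ jobId: string; executionId: string; workflowId: string }> {
  return Object.entries(jobs).map(([jobId, { executionId, workflowId }]) => ({
    jobId,
    executionId,
    workflowId,
  }));
}
```

Because the summary is computed on demand, it cannot drift out of sync with the source-of-truth map.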

Contributor


Hopefully we can get rid of all the watchdog functionality

Member

@netroy netroy left a comment


I think we should do this in 2 parts:

  • part 1 should be the refactor that reorganizes most of our existing code into packages/cli/src/scaling-mode, in a way that also lets us add lots of unit tests
  • part 2 is when we actually make the switch. This should be in v2.

It's entirely possible that we can address some of these issues without the bullmq upgrade, simply by moving away from the super-entangled code in the current worker implementation.

@ivov ivov changed the title refactor(core)!: Upgrade scaling mode to bullmq refactor(core) Centralize scaling mode (no-changelog) Jul 5, 2024
@ivov ivov changed the title refactor(core) Centralize scaling mode (no-changelog) refactor(core): Centralize scaling mode (no-changelog) Jul 5, 2024
@ivov ivov marked this pull request as ready for review July 9, 2024 15:17
@netroy netroy self-requested a review July 9, 2024 15:50
Contributor

@despairblue despairblue left a comment


This looks good.

I left some comments, mostly out of curiosity, and some suggestions.

Review comment threads:
  • packages/cli/test/integration/commands/worker.cmd.test.ts
  • packages/cli/src/scaling/types.ts
  • packages/cli/src/scaling/scaling.service.ts (4 threads)
  • packages/cli/src/services/orchestration/worker/types.ts
  • packages/cli/src/commands/worker.ts
Contributor

@tomi tomi left a comment


I like how the code is now better organized into more meaningfully named classes. I left a couple of small comments about code documentation, and questions about error handling. Since this has already been reviewed by multiple people before me, I trust that the functionality itself works as expected.

Review comment threads:
  • packages/cli/src/WorkflowRunner.ts (3 threads)
  • packages/cli/src/commands/worker.ts (2 threads)
  • packages/cli/src/scaling/processor.ts (2 threads)
  • packages/cli/src/scaling/scaling.service.ts (3 threads)
@ivov
Contributor Author

ivov commented Aug 6, 2024

@tomi Thanks for reviewing!

Since this has been reviewed already by multiple people before me I trust that the functionality itself works as expected

On the contrary :) The reviews have been code-only. I've tested this PR locally, but I'd be grateful if you could also put it through its paces and see if you find any odd behavior, especially when cancelling jobs, having main respond to webhooks, etc.

@tomi
Contributor

tomi commented Aug 6, 2024

@ivov I did some testing and found at least one issue: I canceled an execution that was in the Queued state (I didn't have any worker processes running).

TypeError: Cannot destructure property 'data' of 'object null' as it is null.
    at /n8n/packages/cli/src/executions/execution.service.ts:473:28
    at Array.find (<anonymous>)
    at ExecutionService.stopInScalingMode (/n8n/packages/cli/src/executions/execution.service.ts:473:20)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at ExecutionService.stop (/n8n/packages/cli/src/executions/execution.service.ts:419:7)
    at ExecutionsController.stop (/n8n/packages/cli/src/executions/executions.controller.ts:83:10)
    at handler (/n8n/packages/cli/src/decorators/controller.registry.ts:79:5)
    at /n8n/packages/cli/src/ResponseHelper.ts:153:17 undefined
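The trace points at a `find` callback in `stopInScalingMode` destructuring a property of an entry that can be `null` when the execution is still queued and no worker has claimed it. A minimal sketch of the defensive pattern; the names here (`findJobForExecution`, the `Job` shape) are hypothetical illustrations, not the actual n8n code:

```typescript
interface Job {
  data: { executionId: string };
}

// Entries fetched from the queue may be null for executions that are
// queued but have not been picked up by any worker.
function findJobForExecution(
  jobs: Array<Job | null>,
  executionId: string,
): Job | undefined {
  // Check for null before touching .data, avoiding
  // "Cannot destructure property 'data' of 'object null'".
  return jobs.find(
    (job): job is Job => job !== null && job.data.executionId === executionId,
  );
}
```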

@tomi
Contributor

tomi commented Aug 6, 2024

I also managed to somehow stop the worker from picking up any new work: I set worker concurrency to 1 and had multiple executions queued. I then cancelled the running execution. It was cancelled, but the worker didn't pick up any new work afterwards.

Worker logs:

n8n worker is now ready
 * Version: 1.53.0
 * Concurrency: 1

[JobProcessor] Starting job 9 (execution 11)
[JobProcessor] Starting job 10 (execution 12)

Redis has:

127.0.0.1:6379> KEYS *
 1) "bull:jobs:id"
 2) "bull:jobs:wait"
 3) "bull:jobs:22"
 4) "bull:jobs:active"
 5) "bull:jobs:17"
 6) "bull:jobs:priority"
 7) "bull:jobs:26"
 8) "n8n:redis:workflow-project"
 9) "n8n:redis:webhook:GET-5a35a610-2c19-425c-a94e-636b21fb5469"
10) "bull:jobs:23"
11) "bull:jobs:10:lock"
12) "n8n:redis:webhook:GET-d2eba2ef-58eb-4f50-afd2-76fc57f74d51"
13) "bull:jobs:27"
14) "bull:jobs:21"
15) "bull:jobs:20"
16) "bull:jobs:19"
17) "bull:jobs:5"
18) "bull:jobs:13"
19) "bull:jobs:28"
20) "bull:jobs:25"
21) "bull:jobs:7"
22) "bull:jobs:16"
23) "bull:jobs:14"
24) "n8n:redis:project-owner"
25) "bull:jobs:10"
26) "bull:jobs:18"
27) "n8n:redis:variables"
28) "bull:jobs:12"
29) "n8n:redis:webhook:POST-22ee5f69-6405-4057-b969-37f5f5982696"
30) "bull:jobs:15"
31) "bull:jobs:24"
127.0.0.1:6379> HGETALL bull:jobs:10
 1) "progress"
 2) "{\"kind\":\"abort-job\"}"
 3) "name"
 4) "job"
 5) "data"
 6) "{\"executionId\":\"12\",\"loadStaticData\":true}"
 7) "opts"
 8) "{\"priority\":100,\"removeOnComplete\":true,\"removeOnFail\":true,\"attempts\":1,\"delay\":0,\"timestamp\":1722936630038}"
 9) "processedOn"
10) "1722936747680"
11) "delay"
12) "0"
13) "timestamp"
14) "1722936630038"
15) "priority"
16) "100"
127.0.0.1:6379> GET bull:jobs:10:lock
"808ced76-5e28-46a9-bbee-0b6a66ce299e"

After I stopped the worker and started a new one, the main instance logged this:

Error: Cannot read properties of undefined (reading 'node')
    at Queue.onFailed (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:516:18)
    at Queue.emit (node:events:531:35)
    at Queue.emit (node:domain:488:12)
    at Object.module.exports.emitSafe (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/utils.js:50:20)
    at EventEmitter.messageHandler (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:476:15)
    at EventEmitter.emit (node:events:519:28)
    at DataHandler.handleSubscriberReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:80:32)
    at DataHandler.returnReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:47:18)
    at JavascriptRedisParser.returnReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:21:22)
    at JavascriptRedisParser.execute (/n8n/node_modules/.pnpm/redis-parser@3.0.0/node_modules/redis-parser/lib/parser.js:544:14) undefined
Problem with execution 9: Cannot read properties of undefined (reading 'node'). Aborting.
Error: Cannot read properties of undefined (reading 'node')
    at Queue.onFailed (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:516:18)
    at Queue.emit (node:events:531:35)
    at Queue.emit (node:domain:488:12)
    at Object.module.exports.emitSafe (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/utils.js:50:20)
    at EventEmitter.messageHandler (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:476:15)
    at EventEmitter.emit (node:events:519:28)
    at DataHandler.handleSubscriberReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:80:32)
    at DataHandler.returnReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:47:18)
    at JavascriptRedisParser.returnReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:21:22)
    at JavascriptRedisParser.execute (/n8n/node_modules/.pnpm/redis-parser@3.0.0/node_modules/redis-parser/lib/parser.js:544:14) undefined
Error: Cannot read properties of undefined (reading 'nodeExecutionStack')
    at Queue.onFailed (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:516:18)
    at Queue.emit (node:events:531:35)
    at Queue.emit (node:domain:488:12)
    at Object.module.exports.emitSafe (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/utils.js:50:20)
    at EventEmitter.messageHandler (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:476:15)
    at EventEmitter.emit (node:events:519:28)
    at DataHandler.handleSubscriberReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:80:32)
    at DataHandler.returnReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:47:18)
    at JavascriptRedisParser.returnReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:21:22)
    at JavascriptRedisParser.execute (/n8n/node_modules/.pnpm/redis-parser@3.0.0/node_modules/redis-parser/lib/parser.js:544:14) undefined
Problem with execution 7: Cannot read properties of undefined (reading 'nodeExecutionStack'). Aborting.
Error: Cannot read properties of undefined (reading 'nodeExecutionStack')
    at Queue.onFailed (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:516:18)
    at Queue.emit (node:events:531:35)
    at Queue.emit (node:domain:488:12)
    at Object.module.exports.emitSafe (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/utils.js:50:20)
    at EventEmitter.messageHandler (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:476:15)
    at EventEmitter.emit (node:events:519:28)
    at DataHandler.handleSubscriberReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:80:32)
    at DataHandler.returnReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:47:18)
    at JavascriptRedisParser.returnReply (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/DataHandler.js:21:22)
    at JavascriptRedisParser.execute (/n8n/node_modules/.pnpm/redis-parser@3.0.0/node_modules/redis-parser/lib/parser.js:544:14) undefined
[ScalingService] Added job 40 (execution 42)

@tomi
Contributor

tomi commented Aug 6, 2024

The execution duration seems to include the wait time in the Queued state as well. Not sure if this is correct, as the execution itself only ran for a couple of seconds.

[screenshot]
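If the duration is currently measured from the time the execution was created, excluding the queue wait would mean measuring from the moment a worker actually started it. A sketch under that assumption; the field names are hypothetical, not n8n's actual schema:

```typescript
interface ExecutionTimestamps {
  createdAt: Date;   // when the execution was enqueued
  startedAt?: Date;  // when a worker picked it up
  stoppedAt?: Date;  // when it finished or was cancelled
}

// Measure run duration from startedAt rather than createdAt, so time
// spent in the "Queued" state is not counted.
function runDurationMs(e: ExecutionTimestamps): number | undefined {
  if (!e.startedAt || !e.stoppedAt) return undefined;
  return e.stoppedAt.getTime() - e.startedAt.getTime();
}
```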

@tomi
Contributor

tomi commented Aug 6, 2024

Looks like it's not possible to shut down the main instance while there are active executions. It just keeps logging, over and over again:

Waiting for 33 active executions to finish...

After a while it also started spamming:

Error: Connection is closed.
    at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
    at Script.execute (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Script.js:59:26)
    at EventEmitter.isFinished (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/utils/Commander.js:111:27)
    at Object.isFinished (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/scripts.js:275:29)
    at Timeout._onTimeout (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:541:19)
    at listOnTimeout (node:internal/timers:573:17)
    at processTimers (node:internal/timers:514:7) undefined
Error: Connection is closed.
    at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
    at Script.execute (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Script.js:59:26)
    at EventEmitter.isFinished (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/utils/Commander.js:111:27)
    at Object.isFinished (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/scripts.js:275:29)
    at Timeout._onTimeout (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:541:19)
    at listOnTimeout (node:internal/timers:573:17)
    at processTimers (node:internal/timers:514:7) undefined

Eventually it died, but it took far longer than the 30s it claimed:

process exited after 30s
Error: Shutdown timed out after 30 seconds
    at Start.exitWithCrash (/Users/tomi/work/n8n/n8n/packages/cli/src/commands/BaseCommand.ts:159:23)
    at Timeout._onTimeout (/Users/tomi/work/n8n/n8n/packages/cli/src/commands/BaseCommand.ts:330:16)
    at listOnTimeout (node:internal/timers:573:17)
    at processTimers (node:internal/timers:514:7) {
  [cause]: Error: Shutdown timed out after 30 seconds
      at Timeout._onTimeout (/Users/tomi/work/n8n/n8n/packages/cli/src/commands/BaseCommand.ts:330:40)
      at listOnTimeout (node:internal/timers:573:17)
      at processTimers (node:internal/timers:514:7)
} undefined
Error: Shutdown timed out after 30 seconds
    at Timeout._onTimeout (/Users/tomi/work/n8n/n8n/packages/cli/src/commands/BaseCommand.ts:330:40)
    at listOnTimeout (node:internal/timers:573:17)
    at processTimers (node:internal/timers:514:7) undefined
Waiting for 33 active executions to finish...
Waiting for 33 active executions to finish...
Waiting for 33 active executions to finish...
Waiting for 33 active executions to finish...
Waiting for 33 active executions to finish...
Waiting for 33 active executions to finish...
Waiting for 33 active executions to finish...
Waiting for 33 active executions to finish...
Waiting for 33 active executions to finish...
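One way to avoid the endless "Waiting for N active executions" loop is to give the wait a hard deadline and report a timeout instead of re-entering the loop. A minimal sketch of that pattern; this is an illustration, not the actual BaseCommand shutdown code:

```typescript
// Wait for active executions to drain, but give up once a hard
// deadline passes so shutdown can never loop indefinitely.
async function waitForDrain(
  getActiveCount: () => number,
  opts: { pollMs: number; hardTimeoutMs: number },
): Promise<'drained' | 'timed-out'> {
  const deadline = Date.now() + opts.hardTimeoutMs;
  while (getActiveCount() > 0) {
    if (Date.now() >= deadline) return 'timed-out';
    await new Promise((resolve) => setTimeout(resolve, opts.pollMs));
  }
  return 'drained';
}
```

On `'timed-out'` the caller would log the stuck execution IDs and force-exit, rather than printing the same wait message forever.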

@tomi
Contributor

tomi commented Aug 6, 2024

Shutting down the worker also produced some errors:

[JobProcessor] Starting job 20 (execution 22)
Received SIGINT. Shutting down...
Stopping n8n...
Waiting for 1 active executions to finish... (max wait 30 more seconds)
Waiting for 1 active executions to finish... (max wait 28 more seconds)
Waiting for 1 active executions to finish... (max wait 26 more seconds)
[ScalingService] Queue errored
[ScalingService] Queue errored
Error: Connection is closed.
    at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
    at Script.execute (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Script.js:59:26)
    at EventEmitter.updateDelaySet (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/utils/Commander.js:111:27)
    at Object.updateDelaySet (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/scripts.js:405:25)
    at Queue.updateDelayTimer (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:959:6)
    at Timeout._onTimeout (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:983:49)
    at listOnTimeout (node:internal/timers:573:17)
    at processTimers (node:internal/timers:514:7)
Waiting for 1 active executions to finish... (max wait 24 more seconds)
Waiting for 1 active executions to finish... (max wait 22 more seconds)
[ScalingService] Queue errored
[ScalingService] Queue errored
Error: Connection is closed.
    at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
    at Script.execute (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Script.js:59:26)
    at EventEmitter.updateDelaySet (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/utils/Commander.js:111:27)
    at Object.updateDelaySet (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/scripts.js:405:25)
    at Queue.updateDelayTimer (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:959:6)
    at Timeout._onTimeout (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:998:20)
    at listOnTimeout (node:internal/timers:573:17)
    at processTimers (node:internal/timers:514:7)
Waiting for 1 active executions to finish... (max wait 20 more seconds)
Waiting for 1 active executions to finish... (max wait 18 more seconds)
Waiting for 1 active executions to finish... (max wait 16 more seconds)
[ScalingService] Queue errored
[ScalingService] Queue errored
Error: Connection is closed.
    at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
    at Script.execute (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Script.js:59:26)
    at EventEmitter.updateDelaySet (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/utils/Commander.js:111:27)
    at Object.updateDelaySet (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/scripts.js:405:25)
    at Queue.updateDelayTimer (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:959:6)
    at Timeout._onTimeout (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/queue.js:998:20)
    at listOnTimeout (node:internal/timers:573:17)
    at processTimers (node:internal/timers:514:7)
Waiting for 1 active executions to finish... (max wait 14 more seconds)
Waiting for 1 active executions to finish... (max wait 12 more seconds)
[ScalingService] Queue errored
[ScalingService] Queue errored
Error: Connection is closed.
    at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
    at execPipeline (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Pipeline.js:330:25)
    at Pipeline.exec (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Pipeline.js:282:5)
    at Pipeline.pipeline.exec (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/transaction.js:54:34)
    at Job.moveToFailed (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:339:31)
    at processTicksAndRejections (node:internal/process/task_queues:95:5) {
  previousErrors: [
    Error: Connection is closed.
        at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
        at execPipeline (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Pipeline.js:330:25)
        at Pipeline.exec (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Pipeline.js:282:5)
        at Pipeline.pipeline.exec (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/transaction.js:54:34)
        at Job.moveToFailed (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:339:31)
        at processTicksAndRejections (node:internal/process/task_queues:95:5),
    Error: Connection is closed.
        at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
        at execPipeline (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Pipeline.js:330:25)
        at Pipeline.exec (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Pipeline.js:282:5)
        at Pipeline.pipeline.exec (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/transaction.js:54:34)
        at Job.moveToFailed (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:339:31)
        at processTicksAndRejections (node:internal/process/task_queues:95:5),
    Error: Connection is closed.
        at EventEmitter.sendCommand (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Redis.js:332:28)
        at execPipeline (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Pipeline.js:330:25)
        at Pipeline.exec (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/Pipeline.js:282:5)
        at Pipeline.pipeline.exec (/n8n/node_modules/.pnpm/ioredis@5.3.2/node_modules/ioredis/built/transaction.js:54:34)
        at Job.moveToFailed (/n8n/node_modules/.pnpm/bull@4.12.1/node_modules/bull/lib/job.js:339:31)
        at processTicksAndRejections (node:internal/process/task_queues:95:5)
  ]
}

After this, the execution was left running:

[screenshot]

@ivov
Contributor Author

ivov commented Aug 6, 2024

Migrating the discussion to this Notion page to keep it organized. I believe some of these issues are preexisting on master, but we could address them in this PR as well. Update: all of these issues are present on master.

@ivov
Contributor Author

ivov commented Aug 7, 2024

As discussed, since all issues are on master, we'll track them separately and merge this after today's release.

@Service()
export class JobProcessor {
private readonly runningJobs: { [jobId: JobId]: RunningJob } = {};
Contributor


This would be more readable:

Suggested change:
- private readonly runningJobs: { [jobId: JobId]: RunningJob } = {};
+ private readonly runningJobs: Record<JobId, RunningJob> = {};
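For reference, the two declarations are structurally equivalent in TypeScript; `Record<K, V>` is shorthand for the index-signature form. A small standalone sketch, with `JobId` and `RunningJob` simplified here for illustration:

```typescript
type JobId = string;
interface RunningJob {
  executionId: string;
}

// Index-signature form and Record form describe the same type;
// Record<JobId, RunningJob> is simply the more readable spelling.
const viaIndexSignature: { [jobId: JobId]: RunningJob } = {};
const viaRecord: Record<JobId, RunningJob> = {};

viaIndexSignature['9'] = { executionId: '11' };
viaRecord['9'] = { executionId: '11' };
```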

Contributor Author


Sorry, missed this one; will address it in the next round 👍🏻

Member

@netroy netroy left a comment


🙏🏽

@ivov ivov merged commit e31d017 into master Aug 7, 2024
30 of 33 checks passed
@ivov ivov deleted the pay-1658-upgrade-to-bullmq branch August 7, 2024 11:50
MiloradFilipovic added a commit that referenced this pull request Aug 7, 2024
* master:
  refactor(core): Centralize scaling mode (no-changelog) (#9835)
  fix(editor): Remove body padding from storybook previews (no-changelog) (#10317)
  feat(MySQL Node): Return decimal types as numbers (#10313)
  🚀 Release 1.54.0 (#10315)
  feat(Elasticsearch Node): Add bulk operations for Elasticsearch (#9940)
  feat(Stripe Trigger Node): Add Stripe webhook descriptions based on the workflow ID and name (#9956)
  feat(MongoDB Node): Add projection to query options on Find (#9972)
  fix(Invoice Ninja Node): Fix payment types (#10196)
  feat(HTTP Request Tool Node): Use DynamicStructuredTool with models supporting it (no-changelog) (#10246)
  feat: Return scopes on executions (no-changelog) (#10310)
  feat(Webflow Node): Update to use the v2 API (#9996)
  feat(Lemlist Trigger Node): Update Trigger events (#10311)
  feat(Calendly Trigger Node): Update event names (no-changelog) (#10129)
  refactor(core): Reorganize webhook related components under src/webhooks (no-changelog) (#10296)
  docs: Fix links to license files in readme (no-changelog) (#10257)
  fix(editor): Update design system Avatar component to show initials also when only firstName or lastName is given (#10308)
  fix(editor): Update tags filter/editor to not show non existing tag as a selectable option (#10297)
  fix(editor): Update project tabs test (no-changelog) (#10300)
  fix(core): VM2 sandbox should not throw on `new Promise` (#10298)

# Conflicts:
#	packages/design-system/src/components/N8nAvatar/Avatar.vue
@janober
Member

janober commented Aug 15, 2024

Got released with n8n@1.55.0

Labels
core Enhancement outside /nodes-base and /editor-ui n8n team Authored by the n8n team Released
6 participants