Execution Model Inversion #2666
Changes from 30 commits
Question: what happens when `self.max_size` is less than the number of nodes in the workflow? AFAIU, all nodes will be cached anyway until the next call to `clean_unused`, which happens on the next prompt? If so, it seems like an opportunity to save some memory.
Yes, the `LRU` caching method will always keep around at least one prompt's worth of cache (just like the default caching method). If we immediately cleared out the cache after the workflow, it would technically save memory while ComfyUI was sitting idle, but it would entirely break caching for extremely large workflows.

Because the `LRU` caching method is specifically intended as a more memory-intensive (but more convenient) caching mode for people with plenty of RAM/VRAM, I don't think it makes sense for it to be more conservative with memory than the default caching method.
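To make the behavior discussed above concrete, here is a minimal sketch of an LRU cache with this eviction policy. The class and method names (`LRUCache`, `clean_unused`) are illustrative assumptions, not the actual ComfyUI implementation: entries are never evicted mid-prompt, even when `max_size` is smaller than the workflow, and trimming back down to `max_size` only happens in `clean_unused`, called before the next prompt runs.

```python
from collections import OrderedDict


class LRUCache:
    """Hypothetical sketch of an LRU node-output cache illustrating the
    policy described above (names are assumptions, not the real API)."""

    def __init__(self, max_size):
        self.max_size = max_size
        self._store = OrderedDict()  # key -> cached output, in LRU order

    def get(self, key):
        if key in self._store:
            self._store.move_to_end(key)  # mark as most recently used
            return self._store[key]
        return None

    def set(self, key, value):
        # No eviction here: even if max_size is less than the number of
        # nodes in the workflow, everything stays cached until the next
        # call to clean_unused().
        self._store[key] = value
        self._store.move_to_end(key)

    def clean_unused(self):
        # Called before the next prompt runs: drop least-recently-used
        # entries until the cache is back under max_size.
        while len(self._store) > self.max_size:
            self._store.popitem(last=False)
```

With `max_size=2`, caching three node outputs leaves all three resident until `clean_unused()` runs, at which point the least-recently-used entry is dropped; this is why the cache temporarily holds a full prompt's worth of outputs even when oversubscribed.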