implement bulk_add for applying several SQLite operations per transaction #3300
Initially inspired by #1935 and seemingly relevant to #2388
Based on my `snakeviz` benchmarks in #1935 (comment) and #1935 (comment), `dbcore` overhead for an import operation on 847 local files was reduced from 95% (153 seconds) to <10% (5 seconds). I have exercised this patch with a `dbcore` frontend I built to organize, query, and de-duplicate arbitrary (including non-music) files. To realize these performance benefits in beets, library import will need to be modified to make use of this method.
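To illustrate why batching cuts `dbcore` overhead so sharply, here is a minimal, self-contained sketch (not beets' actual API; table and column names are hypothetical) comparing one commit per row against a single transaction covering all rows. Each commit is a transaction boundary, so per-row commits pay that cost N times:

```python
import sqlite3
import time

rows = [(str(i),) for i in range(20_000)]

# Variant 1: one commit per inserted row.
conn1 = sqlite3.connect(":memory:")
conn1.execute("CREATE TABLE t (v TEXT)")
t0 = time.perf_counter()
for r in rows:
    conn1.execute("INSERT INTO t VALUES (?)", r)
    conn1.commit()  # transaction boundary on every row
per_row = time.perf_counter() - t0

# Variant 2: a single transaction amortized over all rows.
conn2 = sqlite3.connect(":memory:")
conn2.execute("CREATE TABLE t (v TEXT)")
t0 = time.perf_counter()
with conn2:  # one BEGIN/COMMIT pair for the whole batch
    conn2.executemany("INSERT INTO t VALUES (?)", rows)
single_txn = time.perf_counter() - t0

print(f"per-row commits:    {per_row:.3f}s")
print(f"single transaction: {single_txn:.3f}s")
```

With an on-disk database the gap widens further, since every commit forces a journal sync; the in-memory database here understates the benefit.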
N.B. The degree of batching chosen for `bulk_add` is a trade-off between buffering delay, CPU time, and peak memory usage. I've found negligible memory usage for single transactions containing tens of thousands of model records.
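The batching trade-off described above can be sketched as follows. This is a hypothetical illustration, not the patch's actual `bulk_add` implementation: it consumes any iterable of rows in fixed-size chunks, issuing one transaction per chunk, so peak memory is bounded by `batch_size` rather than by the total record count (the `items` table and `batch_size` default are invented for the example):

```python
import itertools
import sqlite3

def bulk_add(conn, rows, batch_size=10_000):
    # Consume `rows` lazily in fixed-size batches; each batch is
    # applied inside one transaction, so memory stays bounded by
    # batch_size while commit overhead is amortized across the batch.
    it = iter(rows)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            break
        with conn:  # commits on normal exit, rolls back on error
            conn.executemany(
                "INSERT INTO items (title, path) VALUES (?, ?)", batch
            )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (title TEXT, path TEXT)")
# A generator keeps only one 5,000-row batch in memory at a time.
bulk_add(
    conn,
    ((f"track {i}", f"/music/{i}.mp3") for i in range(25_000)),
    batch_size=5_000,
)
print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # 25000
```

Raising `batch_size` trades longer buffering delay and higher peak memory for fewer commits; the observation above suggests batches in the tens of thousands are still cheap memory-wise.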