chunked insertion (by size of blocks) #3147
addressed in PR #3122

Updated (2022-03-30): Although this issue has been fixed by PR #3122, a strategy of merging small blocks into bigger ones by the number of rows was later adopted. It would be better if both block_size (in bytes, before compression, imo) and number_of_rows were taken into account when merging small blocks into larger ones, so that the memory used to construct a block no longer grows proportionally to the number of columns in the table.

current impl:
https://github.com/datafuselabs/databend/blob/50226158b19380f06d36edb759f98482027c51db/query/src/storages/fuse/io/write/block_stream_writer.rs#L278-L304
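For illustration, here is a minimal sketch of a compactor that flushes on whichever threshold is hit first, rows or bytes. The `DataBlock` and `Compactor` types and the threshold values are hypothetical and do not mirror the actual `block_stream_writer` implementation:

```rust
/// Hypothetical stand-in for a columnar block; only the two
/// quantities relevant to the merge decision are modeled.
struct DataBlock {
    rows: usize,
    bytes: usize, // uncompressed, in-memory size
}

/// Accumulates small blocks and flushes when EITHER the row-count
/// or the byte-size threshold is reached.
struct Compactor {
    max_rows: usize,
    max_bytes: usize,
    acc_rows: usize,
    acc_bytes: usize,
    pending: Vec<DataBlock>,
}

impl Compactor {
    fn new(max_rows: usize, max_bytes: usize) -> Self {
        Self { max_rows, max_bytes, acc_rows: 0, acc_bytes: 0, pending: Vec::new() }
    }

    /// Add a small block; returns the accumulated batch to merge
    /// once either threshold is crossed, otherwise keeps buffering.
    fn push(&mut self, block: DataBlock) -> Option<Vec<DataBlock>> {
        self.acc_rows += block.rows;
        self.acc_bytes += block.bytes;
        self.pending.push(block);
        if self.acc_rows >= self.max_rows || self.acc_bytes >= self.max_bytes {
            self.acc_rows = 0;
            self.acc_bytes = 0;
            Some(std::mem::take(&mut self.pending))
        } else {
            None
        }
    }
}

fn main() {
    // e.g. flush at 1,000,000 rows or 100 MiB, whichever comes first.
    let mut compactor = Compactor::new(1_000_000, 100 * 1024 * 1024);
    for _ in 0..4 {
        // A wide-table block: few rows, but many bytes per row.
        let block = DataBlock { rows: 10_000, bytes: 40 * 1024 * 1024 };
        if let Some(batch) = compactor.push(block) {
            println!("flushing {} blocks", batch.len()); // fires on the byte limit
        }
    }
}
```

With a byte threshold in place, a wide table flushes on bytes long before it reaches the row limit, so the memory needed to construct a block stays bounded regardless of how many columns the table has.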
Hello @dantengsky,
Hope I get it right: in issue #4577, expressions (the values of batched insert statements) are going to be compacted into blocks of proper size, if applicable, and then inserted into a table by calling
Async Insert is used to aggregate small inserts from multiple clients. It seems pretty different from this, I think.
I got it, thank you man!
@dantengsky We already have
staled |