cli/dump: dump produces inserts that are too large to be restored #31676
Oh, that is nicely spotted. As a workaround, you can modify `insertRows` in `pkg/cli/dump.go`. The proper fix on our side will probably be to auto-detect a good size.
cc @rolandcrosby @mjibson for triage and prioritization
@mjibson, could you help István out?
I had sent a reply earlier today, but I think it got lost in the GitHub outage. My answer was as follows:
- As a workaround, you can edit the value of `insertRows` in `pkg/cli/dump.go` to build a custom dump utility with fewer values per insert (see the sketch after this comment).
- Eventually CockroachDB should address this by adjusting the number of values per insert automatically based on the size of the data.
Thanks for reporting the issue.
-- Raphael 'kena' Poss
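For context, here is a minimal Go sketch of the "adjust the number of values per insert based on the size of the data" idea. This is hypothetical illustration code, not CockroachDB's actual `pkg/cli/dump.go`; the 4 MiB budget and the function name are assumptions chosen only to keep batches well under the 64 MB limit mentioned in this issue.

```go
// Hypothetical sketch: close an INSERT batch either at a row cap (the fixed
// 100-row behaviour described above) or at a byte budget, whichever comes first.
package main

import (
	"fmt"
	"strings"
)

// maxBatchBytes is an illustrative budget, well below the 64 MB limit.
const maxBatchBytes = 4 << 20 // 4 MiB

// buildInserts groups already-rendered VALUES tuples into INSERT statements.
func buildInserts(table string, tuples []string, maxRows, maxBytes int) []string {
	var stmts, batch []string
	size := 0
	flush := func() {
		if len(batch) == 0 {
			return
		}
		stmts = append(stmts, fmt.Sprintf(
			"INSERT INTO %s VALUES\n\t%s;", table, strings.Join(batch, ",\n\t")))
		batch, size = nil, 0
	}
	for _, t := range tuples {
		// Start a new statement before the batch gets too long or too large.
		if len(batch) > 0 && (len(batch) >= maxRows || size+len(t) > maxBytes) {
			flush()
		}
		batch = append(batch, t)
		size += len(t)
	}
	flush()
	return stmts
}

func main() {
	// Fake tuples standing in for rows carrying ~1 MB of binary data each.
	tuples := make([]string, 10)
	for i := range tuples {
		tuples[i] = fmt.Sprintf("(%d, '%s')", i, strings.Repeat("x", 1<<20))
	}
	for _, s := range buildInserts("t", tuples, 100, maxBatchBytes) {
		fmt.Printf("statement of %d bytes\n", len(s))
	}
}
```

With ~1 MB rows this produces several INSERT statements of a few MB each instead of one 100-row statement, which is the behaviour the reporter ran into.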
This is an extension of #28948.
Any news on this? Is there maybe a workaround? Can I help somehow? (I reported #51969.)
My comment from #31676 (comment) is still applicable.
20.2 made proper binary BACKUP+RESTORE available for non-enterprise users and deprecated the text-based `cockroach dump`.
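As a rough illustration of that alternative, BACKUP and RESTORE are plain SQL statements, so they can be run from any Postgres-wire client; the sketch below uses Go with `database/sql` and `lib/pq`. The connection string, database name, and `nodelocal` destination are placeholders, and the exact BACKUP/RESTORE syntax depends on your CockroachDB version, so check the docs for the release you run.

```go
// Hedged sketch: drive BACKUP and RESTORE over the Postgres wire protocol
// instead of using the deprecated text-based `cockroach dump`.
package main

import (
	"database/sql"
	"log"

	_ "github.com/lib/pq" // Postgres driver; CockroachDB speaks pgwire.
)

func main() {
	// Placeholder connection string for a local insecure cluster.
	db, err := sql.Open("postgres",
		"postgresql://root@localhost:26257/defaultdb?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Write a binary backup to the node's local store directory.
	if _, err := db.Exec(`BACKUP DATABASE mydb TO 'nodelocal://1/mydb-backup'`); err != nil {
		log.Fatal(err)
	}

	// Restore it later; the target database must not already exist.
	if _, err := db.Exec(`RESTORE DATABASE mydb FROM 'nodelocal://1/mydb-backup'`); err != nil {
		log.Fatal(err)
	}
}
```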
I've created a database dump using `cockroach dump [database] >output.sql`. However, I'm not able to restore it, neither with native `psql` nor with the `cockroach sql` command, most likely because the generated inserts are too large. One of the tables has rows with binary data, about 1 MB in size each, and the dump seems to create INSERT statements in batches of 100 rows regardless of their size. I think the inserts fail because the aggregated size is larger than the 64 MB limit.