[Maps] chunk geojson upload to keep import requests under 1MB #93678
Conversation
Pinging @elastic/kibana-gis (Team:Geo)

@elasticmachine merge upstream
thx. I tested this on a number of different files. Seems to work really well. It's a very smooth experience.
💚 Build Succeeded
…c#93678)

* [Maps] chunk geojson upload to keep import requests under 1MB
* fix geojson_importer tests
* update failure.item to reflect location in file
* remove console statement
* clean up
* return instead of break if upload is no longer active
* add unit test for createChunks
* update file_upload API

Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>

Backported in #93974 with the same commit message.
Should fix #72985
#92620 updated the GeoJSON importer to stream files in 10MB blocks for importing. When tried on cloud, 10MB blocks were still too large and resulted in timeouts. A closer look at ML's CSV upload, which can upload large files on cloud, made it apparent that blocks need to be broken into chunks of less than 1MB for importing. There is an edge case where an individual feature is larger than 1MB. In that case, the feature cannot be broken down any further and must be imported in a single call.
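The commit list above mentions a `createChunks` helper. Here is a minimal TypeScript sketch of the chunking idea described in this PR, not the actual Kibana implementation: the `Feature` shape, the `maxChunkChars` parameter, and the `JSON.stringify` size heuristic are illustrative assumptions.

```ts
// Illustrative limit: keep each import request under ~1MB of serialized JSON.
const MAX_CHUNK_CHARS = 1000000;

// Simplified stand-in for a GeoJSON feature.
interface Feature {
  [key: string]: unknown;
}

export function createChunks(
  features: Feature[],
  maxChunkChars: number = MAX_CHUNK_CHARS
): Feature[][] {
  const chunks: Feature[][] = [];
  let chunk: Feature[] = [];
  let chunkChars = 0;

  for (const feature of features) {
    // Approximate the serialized size of this feature.
    const featureChars = JSON.stringify(feature).length;

    if (chunk.length > 0 && chunkChars + featureChars > maxChunkChars) {
      // Current chunk is full; flush it and start a new one.
      chunks.push(chunk);
      chunk = [];
      chunkChars = 0;
    }

    // Edge case from the description: a single feature larger than the limit
    // cannot be split, so it ends up alone in its own chunk and is imported
    // in a single call.
    chunk.push(feature);
    chunkChars += featureChars;
  }

  if (chunk.length > 0) {
    chunks.push(chunk);
  }
  return chunks;
}
```

Each resulting chunk would then be sent as one import request, so no request body exceeds the limit except in the oversized-single-feature case.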
Unfortunately, there is no way to test this on cloud without merging and waiting for the next snapshot to be built.