
Out of memory when loading a large JSON file. #18

Open
beoran opened this issue Nov 28, 2019 · 4 comments

@beoran

beoran commented Nov 28, 2019

I have a 941MB JSON file I would like to import into eliasdb, but I get an out-of-memory crash. This happens because the importer tries to load the data wholesale instead of incrementally. A way to do incremental loading from a single JSON file would be great.

@krotik
Owner

krotik commented Dec 8, 2019

You are correct. Let me think about that ...

@beoran
Author

beoran commented Dec 9, 2019

I think you could switch to a streaming parser for JSON, maybe something like this, if you don't mind the dependency:

https://github.com/francoispqt/gojay
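
For illustration, here is a minimal sketch of that kind of incremental decoding using only the standard library's json.Decoder token API rather than gojay itself; the file name and the node fields are hypothetical placeholders, and the input is assumed to be one large top-level JSON array:

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os"
)

// node stands in for one element of the top-level array; the fields
// are hypothetical placeholders, not eliasdb's actual import schema.
type node struct {
	Key  string `json:"key"`
	Kind string `json:"kind"`
}

func main() {
	f, err := os.Open("data.json") // placeholder file name
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	dec := json.NewDecoder(f)

	// Consume the opening '[' of the top-level array.
	if _, err := dec.Token(); err != nil {
		log.Fatal(err)
	}

	// Decode one element at a time; peak memory is bounded by the
	// size of a single element rather than the whole file.
	for dec.More() {
		var n node
		if err := dec.Decode(&n); err != nil {
			log.Fatal(err)
		}
		fmt.Println("import:", n.Key) // here the importer would store the node
	}

	// Consume the closing ']'.
	if _, err := dec.Token(); err != nil {
		log.Fatal(err)
	}
}
```

gojay's streaming decoder should allow the same one-element-at-a-time pattern with less allocation overhead.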

@krotik krotik self-assigned this Dec 11, 2019
@gedw99

gedw99 commented May 18, 2021

Yeah, I tend to agree about the JSON parser.

Consider this one:
github.com/buger/jsonparser

https://golangrepo.com/repo/buger-jsonparser-go-json

There are others that achieve high throughput but support code generation as well. They also compile to WASM, which may be useful overall.
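
One caveat worth noting: buger/jsonparser operates on a []byte, so the whole file still has to be read into memory; it mainly avoids building an intermediate interface{} tree rather than bounding peak usage. A minimal sketch, assuming a top-level array and a hypothetical "key" field:

```go
package main

import (
	"fmt"
	"log"
	"os"

	"github.com/buger/jsonparser"
)

func main() {
	// jsonparser parses in place over a []byte, so the full file is
	// still read into memory here. File name is a placeholder.
	data, err := os.ReadFile("data.json")
	if err != nil {
		log.Fatal(err)
	}

	// Walk the top-level array without unmarshalling the whole
	// document into an interface{} tree.
	_, err = jsonparser.ArrayEach(data, func(value []byte, dataType jsonparser.ValueType, offset int, err error) {
		key, _ := jsonparser.GetString(value, "key") // "key" is a hypothetical field
		fmt.Println("import:", key)
	})
	if err != nil {
		log.Fatal(err)
	}
}
```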

@mladkau
Contributor

mladkau commented May 19, 2021

I think newline-delimited JSON data, like BigQuery uses, would also work here ...
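
A minimal sketch of importing such newline-delimited JSON (one object per line); the file name is a placeholder, and the scanner buffer is raised because bufio.Scanner's default maximum token size is 64KB:

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"log"
	"os"
)

func main() {
	f, err := os.Open("data.ndjson") // placeholder: one JSON object per line
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	sc := bufio.NewScanner(f)
	// Allow lines larger than the 64KB default token limit.
	sc.Buffer(make([]byte, 0, 1024*1024), 16*1024*1024)

	for sc.Scan() {
		var obj map[string]interface{}
		if err := json.Unmarshal(sc.Bytes(), &obj); err != nil {
			log.Fatal(err)
		}
		fmt.Println("import:", obj) // one record at a time, bounded memory
	}
	if err := sc.Err(); err != nil {
		log.Fatal(err)
	}
}
```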
