Java heap space error when writing > 2 GB file content as Blob into Oracle Table #1617
Comments
Looking at the code, I don't think Exposed handles LOBs well at the moment: see `exposed-core/src/main/kotlin/org/jetbrains/exposed/sql/statements/api/ExposedBlob.kt`, line 3 at commit 98bdca8.
Here the Exposed `Blob` wrapper will create a copy of the LOB in memory. Now, under the hood, when writing to the database it actually uses …
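A sketch of the shape of the problem (not the actual Exposed source): since `ExposedBlob` wraps a `ByteArray`, constructing one from a stream forces the entire LOB onto the heap first. JVM arrays are `Int`-indexed, so a `ByteArray` can also never exceed ~2 GB, which matches the hard failure above that size.

```kotlin
import java.io.InputStream

// Illustrative only: ExposedBlob-style wrapper holding the whole LOB as bytes.
class ExposedBlobSketch(val bytes: ByteArray)

fun fromStream(inStream: InputStream): ExposedBlobSketch =
    // readBytes() allocates the full file size in heap before any DB write
    // happens; a 5 GB file needs a >5 GB heap and overflows the array limit.
    ExposedBlobSketch(inStream.readBytes())
```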
What you can try is using a raw statement to pass the stream directly, and see if that solves your issue:

```kotlin
transaction(db) {
    exec(
        "insert into FILE_DOWNLOAD (FILE_CONTENT) values (?)",
        listOf(BlobColumnType() to inStream)
    )
}
```

If it does help, then I'd be happy to look into adding support of `InputStream`s for the …
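For context, a hedged sketch of what "streaming" the LOB looks like one level down, at plain JDBC (the table/column names and `inStream` are taken from the example above; obtaining the `Connection` is left out). `setBinaryStream` lets the driver pull from the stream in chunks instead of requiring a materialised byte array:

```kotlin
import java.io.InputStream
import java.sql.Connection

// Sketch: stream a large binary value into Oracle via JDBC without
// buffering the whole file in memory.
fun insertFileContent(conn: Connection, inStream: InputStream, length: Long) {
    conn.prepareStatement(
        "insert into FILE_DOWNLOAD (FILE_CONTENT) values (?)"
    ).use { ps ->
        // The driver reads from inStream lazily; heap usage stays flat
        // regardless of the file size.
        ps.setBinaryStream(1, inStream, length)
        ps.executeUpdate()
    }
}
```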
Thanks for the suggestion. I tried using raw statements; though I am not seeing any error, when I check the Oracle database I only see …
Are you sure that your input stream is not null?
Yes, I can confirm the remote file has 5 GB of content. Also, I get the Java heap error when trying to `readBytes` from the input stream.
Also, when I tried a smaller file, I only see a null value in the DB. But if I do …
That's why you'll get `OutOfMemory` for larger files. But two things to check: …
…racle Table #1617 / SQLServer doesn't support read from stream after statement is closed
It might become better with the next release.
@harinivas-ganapathy, please check with Exposed 0.41.1.
Tried with Exposed 0.41.1. After replacing to … this is the same behavior for a 1 GB file and a 2 GB file. However, I tried loading local files of 1 GB, 2.8 GB and 5 GB into the Blob column …
Also I noticed the …
But I was able to write the entire 5 GB of data back into another local file from …
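Writing the data back out to a local file can succeed where the insert fails because a stream-to-stream copy never holds more than one small buffer in memory. A minimal sketch of that pattern (Kotlin's stdlib `copyTo` does the same thing with an 8 KiB buffer):

```kotlin
import java.io.InputStream
import java.io.OutputStream

// Copy in fixed-size chunks: heap usage is constant regardless of file
// size, so even a 5 GB stream can be written with a small heap.
fun copy(input: InputStream, output: OutputStream): Long {
    val buffer = ByteArray(8 * 1024)
    var total = 0L
    while (true) {
        val read = input.read(buffer)
        if (read < 0) break
        output.write(buffer, 0, read)
        total += read
    }
    return total // number of bytes copied
}
```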
…racle Table #1617 / Fix for non-repeatable streams
Is this issue resolved thanks to commit 367babc?
Hello @harinivas-ganapathy, could you please check if this issue is resolved with commit 367babc?
I have a scenario where I need to read a remote file of >2 GB into an Oracle Blob column. Whenever the remote file size is >1.2 GB, I am getting a Java heap space error.
Below is my implementation. Can you please help?
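The original snippet was not preserved in this scrape. A hypothetical reconstruction of the failing pattern the thread describes (the table and column names are assumptions borrowed from the raw-statement example later in the thread):

```kotlin
import java.io.InputStream
import org.jetbrains.exposed.sql.Table
import org.jetbrains.exposed.sql.insert
import org.jetbrains.exposed.sql.statements.api.ExposedBlob
import org.jetbrains.exposed.sql.transactions.transaction

// Hypothetical table definition matching the raw SQL used elsewhere.
object FileDownload : Table("FILE_DOWNLOAD") {
    val fileContent = blob("FILE_CONTENT")
}

fun saveRemoteFile(inStream: InputStream) {
    transaction {
        FileDownload.insert {
            // readBytes() loads the entire remote file into one ByteArray,
            // which throws "Java heap space" once the file outgrows the heap.
            it[fileContent] = ExposedBlob(inStream.readBytes())
        }
    }
}
```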