This issue was moved to a discussion.
Memory leak when saving and indexing (clouseau) a big document (10MiB) #3688
Description
Using the IBM Docker image (CouchDB + Lucene index) based on CouchDB 3.1.1 (`ibmcom/couchdb3:3.1.1`), we store a large document in CouchDB.
One of the fields in the document contains an array of arrays built from a 10 MiB CSV. When this document is created, container memory usage grows until it reaches the container memory limit (8 GiB), the host limit (running on a 16 GiB machine), or the process limit.
The JavaScript fragment in the design document that indexes this field joins position 1 of each array into a variable and then indexes it with a single call to the `index` function (the built string is ~11 MiB). Calling the `index` function once per row of the array produces the same error.
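For reference, a minimal sketch of what such a design-document index function might look like. This is an assumption, not the reporter's actual code: `csv_rows` and `joined_col1` are hypothetical names, and `index()` is the function provided by the Clouseau/Dreyfus search environment. The function is assigned to a variable here only so it can be exercised outside CouchDB; in a design document it would be stored as a string.

```javascript
// Hedged sketch of the indexing function described above.
// Assumptions: the CSV-derived field is called "csv_rows", and
// "position 1" means the second element of each row array.
var indexFn = function (doc) {
  if (doc.csv_rows) {
    // Join position 1 of every row into one large string
    // (~11 MiB for a 10 MiB CSV), then index it in a single call.
    var joined = doc.csv_rows.map(function (row) {
      return row[1];
    }).join(" ");
    // index() is supplied by the search query server at runtime.
    index("joined_col1", joined);
  }
};
```

Per the report, replacing the single `index()` call with one call per row does not change the outcome.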
To avoid the `os_process_error` issue, we tried to increase the maximum memory of the `couchjs` processes with `COUCHDB_QUERY_SERVER_JAVASCRIPT`, but it appears to have no effect.
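For context, this is roughly how that override would be attempted: CouchDB reads the query-server command line from the `COUCHDB_QUERY_SERVER_JAVASCRIPT` environment variable, and `couchjs -S` sets the memory limit in bytes. The exact value and paths below are illustrative assumptions, not the reporter's configuration.

```shell
# Hedged sketch: raising the couchjs memory limit via the environment
# variable, here to 4 GiB (-S takes a size in bytes). Paths assume the
# standard CouchDB layout inside the ibmcom/couchdb3 image.
docker run -d \
  -e COUCHDB_QUERY_SERVER_JAVASCRIPT="/opt/couchdb/bin/couchjs -S 4294967296 /opt/couchdb/share/server/main.js" \
  ibmcom/couchdb3:3.1.1
```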
Steps to Reproduce
Expected Behaviour
Your Environment
CouchDB version used: 3.1.1
Docker version: 20.10.7, build f0df350
Container limits:
Couchdb configuration:
Operating system and version: Ubuntu 20.04
Additional Context