Utility method loadImage memory leak #1705

Open
and2 opened this issue Nov 18, 2020 · 9 comments

and2 commented Nov 18, 2020

Issue

Calling the utility method loadImage (or just using the Image object directly, for that matter) does not release the memory it uses.
The same happens with both local and remote images: loading a 3 MB remote JPEG in the example below keeps RSS at ~400 MB, and the memory is never garbage-collected.

Awaiting the promise, setting the returned image to null, or setting the source to null makes no difference to the memory level.

Steps to Reproduce

const { loadImage } = require('canvas')

for (let i=1; i<=10; i++) {
  loadImage('path_to_image').then(function(){
    console.log('loaded')
  })
}

setInterval(()=>{
  if (global.gc) { global.gc() }
  console.log('rss: ' + process.memoryUsage().rss / (1024 * 1024) + 'MB')
}, 1000)
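For comparing memory readings across runs of the reproduction above, a small snapshot helper can make the numbers easier to read. This is a sketch in plain Node with no canvas dependency; the `external` field is where V8 accounts for memory held outside the JS heap, which is where node-canvas image data should show up:

```javascript
// Snapshot process memory in MiB -- plain Node, no canvas dependency.
// `external` tracks memory V8 knows is held outside the JS heap.
function memorySnapshotMiB () {
  const { rss, heapUsed, external } = process.memoryUsage()
  const toMiB = bytes => Math.round(bytes / (1024 * 1024) * 100) / 100
  return { rss: toMiB(rss), heapUsed: toMiB(heapUsed), external: toMiB(external) }
}

// Example: log a snapshot every second, forcing GC when available
// (run with `node --expose-gc script.js`).
const timer = setInterval(() => {
  if (global.gc) global.gc()
  console.log(memorySnapshotMiB())
}, 1000)
timer.unref() // don't keep the process alive just for the logger
```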

Your Environment

  • Version of node-canvas: 2.6.1 built from source
  • Environment: node 12.19.1 & node 14.15.1 on MacOS 10.13.6 & Heroku 18 stack (Ubuntu 18.04)

and2 commented Dec 2, 2020

I've still not managed to find the source of this, and I'm wondering whether it's at the C level that the memory is not being released. Is anyone else experiencing this issue?
https://imgur.com/a/Ar1dNdz


Keylekan commented Dec 3, 2020

I haven't tested it at as low a level as you did, but I have an API, and after the call that uses the canvas library the memory doesn't seem to be released.


and2 commented Dec 3, 2020

Good to know I'm not the only one seeing this.

It looks like the leak occurs after calling SetSource in lib/image.js. I think the native code still keeps a reference to the image somehow, even after "cleanup" (setting onload / onerror to null, which was a fix from a couple of years ago).
A temporary workaround, while we continue to look for a proper fix, is to set the image src to a single-pixel PNG in cleanup. It still leaks, but significantly less than keeping the original image in memory.

function loadImage (src) {
  return new Promise((resolve, reject) => {
    const image = new Image()

    function cleanup () {
      image.onload = null
      image.onerror = null
      image.src = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mP8/5+hHgAHggJ/PchI7wAAAABJRU5ErkJggg==';
    }

    image.onload = () => { cleanup(); resolve(image) }
    image.onerror = (err) => { cleanup(); reject(err) }

    image.src = src
  })
}
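As a sanity check on the workaround, the data URI used in cleanup() does decode to a well-formed 1x1 PNG, so swapping it in should be cheap. A minimal verification in plain Node (checking the 8-byte PNG signature and the IHDR dimensions):

```javascript
// The single-pixel data URI from the cleanup() workaround above.
const pixelUri = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mP8/5+hHgAHggJ/PchI7wAAAABJRU5ErkJggg=='

// Decode the base64 payload after the comma.
const png = Buffer.from(pixelUri.split(',')[1], 'base64')

// 8-byte PNG signature, then the IHDR chunk: width is the big-endian
// 32-bit integer at byte offset 16, height at offset 20.
const hasSignature = png.slice(0, 8).equals(
  Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]))
const width = png.readUInt32BE(16)
const height = png.readUInt32BE(20)

console.log(hasSignature, width, height) // true 1 1
```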


and2 commented Dec 4, 2020

I have also attempted to expose a RemoveSource function from src/Image.cc and call it via bindings when src is set to null. Again, this reduces the leak, but it's still significant, so I think the issue might be in Image::clearData()

NAN_METHOD(Image::RemoveSource){
  Image *img = Nan::ObjectWrap::Unwrap<Image>(info.This());
  img->clearData();
}

and in lib/image.js

set(val) {
  if (!val) {
    RemoveSource.call(this)
  }
  ...
}

zbjornson (Collaborator) commented

I tried to reproduce this over the weekend but didn't see any signs of a memory leak. @and2, the code in your OP prints this for me:

> node --expose-gc .\1705.js
loaded
loaded
loaded
loaded
loaded
loaded
loaded
loaded
loaded
loaded
rss: 28.16796875MB
rss: 28.19140625MB
rss: 27.95703125MB
rss: 27.19140625MB
rss: 26.26953125MB

That's with a 154 KB test image, but I wouldn't expect the 3 MB difference in image size to account for another 375 MB in RSS. I also tried increasing the loop bound to 1000 and got the same RSS reading.

I can dig in further if you can provide more guidance on how to reproduce this.


and2 commented Dec 14, 2020

Thanks for looking into this, @zbjornson. This has been bugging me for weeks now, and I have not managed to find a direct fix. We're still getting leaks, both locally and on Heroku. My guess is that the issue lies in the installed dependencies: either Cairo (cairo_surface_destroy(_surface)) or V8 not adjusting the external memory from Nan::AdjustExternalMemory(-_data_len)

Heroku's 18 stack has libcairo2 at 1.15.10 and the latest release is 1.16.0 (which is what we have installed locally); both show the same behaviour.

At this stage it was easier for us to run our canvas task in an external process and terminate it on completion to release the memory. I'll explore this further over the Christmas break, but it looks like it might be something linked to our environment, since you couldn't reproduce it...


xbfld commented Oct 26, 2021

> Looks like the leak occurs after calling SetSource in lib/image.js. [...] Temporary solution that might work in the interim whilst we continue to look for a fix is to set the img src to a single pixel in cleanup. It still leaks but is significantly less than keeping the original image in memory.

(quoting the workaround posted above)

After many hours of googling, I ended up here.
Setting image.src to a single pixel was helpful to me. Thanks!

(In my case the memory leak is reproducible on Docker (Ubuntu 18) but not on my Mac.
I think the environment has an effect here.)


dievardump commented Sep 30, 2023

I'm having huge memory leaks on Docker. Loading a few MB (6 MB) of images with loadImage makes the Docker container go to 4 GB of memory usage and crash.

This is still open and probably still affecting people.

edit: sorry, I hadn't seen the loadImage() variant that resets the src; thanks for that

pranavburnwal commented

+1

6 participants