
Memory leak if stream is not explicitly closed #1117

Open
hmaarrfk opened this issue Mar 30, 2023 · 13 comments
@hmaarrfk
Contributor
hmaarrfk commented Mar 30, 2023

Overview

I think there is a memory leak that occurs if you don't explicitly close a container or stream.

WyattBlue: Stream only. Other leaks have been fixed in 12.x

I'm still trying to drill down into the problem, but I have a minimal reproducing example that I think is worth sharing at this stage.

WyattBlue: Cleaned up for my convenience

It seems that __dealloc__ isn't called as expected, maybe? https://github.com/PyAV-Org/PyAV/blob/main/av/container/input.pyx#L88

WyattBlue: No

import os

import av
import numpy as np
import psutil
from matplotlib import pyplot as plt
from tqdm import tqdm


def main():
    data = np.ones((256, 256, 3), dtype="uint8")

    with av.open("video.mp4", "w") as container:
        stream = container.add_stream("libopenh264", rate=30)
        stream.height = data.shape[0]
        stream.width = data.shape[1]
        stream.pix_fmt = "yuv420p"

        for j in tqdm(range(100), leave=False):
            frame = av.VideoFrame.from_ndarray(data)
            for packet in stream.encode(frame):
                container.mux(packet)
        # Flush any buffered packets out of the encoder before closing it.
        for packet in stream.encode():
            container.mux(packet)
        stream.close()

    virtual_memory = []
    memory_used = []

    for j in tqdm(range(10000)):
        with av.open("video.mp4", "r") as container:
            stream = container.streams.video[0]
            for packet in container.demux(stream):
                stream.decode(packet)

            process = psutil.Process(os.getpid())
            memory_info = process.memory_info()
            memory_used.append(memory_info.rss)
            virtual_memory.append(memory_info.vms)
                
            # Explicitly call close to reduce the memory leak.
            # stream.close()

    m_used = np.asarray(memory_used)
    v_used = np.asarray(virtual_memory)
    plt.loglog(np.arange(1, len(m_used)), m_used[1:] - m_used[0], label="Memory Used")
    plt.ylabel("Memory Usage (Bytes)")
    plt.xlabel("Iteration (count)")
    plt.legend()
    plt.show()


if __name__ == "__main__":
    main()
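Until the underlying leak is fixed, the explicit close() in the read loop can be made automatic with a small context manager, so the stream is closed even if decoding raises. This is a sketch; `closing_stream` is a hypothetical helper, not part of PyAV's API:

```python
from contextlib import contextmanager


@contextmanager
def closing_stream(stream):
    # Hypothetical helper (not a PyAV API): yields the stream, then calls
    # close() on exit instead of relying on __dealloc__ / garbage collection.
    try:
        yield stream
    finally:
        stream.close()
```

The read loop above would then use `with closing_stream(container.streams.video[0]) as stream:` so the close runs on every iteration, error or not.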

[plot: RSS grows steadily with iteration count when the stream is not closed]

Expected behavior

That the memory be released. If I add the close() call inside the loop, I get:

[plot: memory usage stays flat when stream.close() is called each iteration]

Versions

  • OS: Ubuntu 22.10 + conda + conda-forge
  • PyAV runtime:
python -m av --version
PyAV v10.0.0.post2
library configuration: --prefix=/home/mark/mambaforge/envs/mcam_dev --cc=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/x86_64-conda-linux-gnu-cc --cxx=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/x86_64-conda-linux-gnu-c++ --nm=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/x86_64-conda-linux-gnu-nm --ar=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/x86_64-conda-linux-gnu-ar --disable-doc --disable-openssl --enable-demuxer=dash --enable-hardcoded-tables --enable-libfreetype --enable-libfontconfig --enable-libopenh264 --enable-gnutls --enable-libmp3lame --enable-libvpx --enable-pthreads --enable-vaapi --disable-gpl --enable-libaom --enable-libsvtav1 --enable-libxml2 --enable-pic --enable-shared --disable-static --enable-version3 --enable-zlib --enable-libopus --pkg-config=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/pkg-config
library license: LGPL version 3 or later
libavcodec     59. 37.100
libavdevice    59.  7.100
libavfilter     8. 44.100
libavformat    59. 27.100
libavutil      57. 28.100
libswresample   4.  7.100
libswscale      6.  7.100
  • FFmpeg:
ffmpeg version 5.1.2 Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 11.3.0 (conda-forge gcc 11.3.0-19)
configuration: --prefix=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_plac --cc=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/x86_64-conda-linux-gnu-cc --cxx=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/x86_64-conda-linux-gnu-c++ --nm=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/x86_64-conda-linux-gnu-nm --ar=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/x86_64-conda-linux-gnu-ar --disable-doc --disable-openssl --enable-demuxer=dash --enable-hardcoded-tables --enable-libfreetype --enable-libfontconfig --enable-libopenh264 --enable-gnutls --enable-libmp3lame --enable-libvpx --enable-pthreads --enable-vaapi --disable-gpl --enable-libaom --enable-libsvtav1 --enable-libxml2 --enable-pic --enable-shared --disable-static --enable-version3 --enable-zlib --enable-libopus --pkg-config=/home/conda/feedstock_root/build_artifacts/ffmpeg_1674566195805/_build_env/bin/pkg-config
libavutil      57. 28.100 / 57. 28.100
libavcodec     59. 37.100 / 59. 37.100
libavformat    59. 27.100 / 59. 27.100
libavdevice    59.  7.100 / 59.  7.100
libavfilter     8. 44.100 /  8. 44.100
libswscale      6.  7.100 /  6.  7.100
libswresample   4.  7.100 /  4.  7.100
@hmaarrfk hmaarrfk added the build label Mar 30, 2023
@bfreskura

bfreskura commented Aug 1, 2023

Hi @hmaarrfk,

Have you found any other solution besides explicitly closing containers and streams?

This is my current code, which encodes and then decodes a list of NumPy array images (imgs_padded). It still leaks memory:

import io

import av
import numpy as np

# imgs, imgs_padded, container_string, codec, rate, options, pixel_fmt,
# and sequence_length are defined elsewhere in the surrounding code.
with io.BytesIO() as buf:
    with av.open(buf, "w", format=container_string) as container:
        stream = container.add_stream(codec, rate=rate, options=options)
        stream.height = imgs[0].shape[0]
        stream.width = imgs[0].shape[1]
        stream.pix_fmt = pixel_fmt

        for img in imgs_padded:
            frame = av.VideoFrame.from_ndarray(img, format="rgb24")
            frame.pict_type = "NONE"
            for packet in stream.encode(frame):
                container.mux(packet)

        # Flush stream
        for packet in stream.encode():
            container.mux(packet)

        stream.close()

    buf.seek(0)  # rewind before reading the encoded video back

    outputs = []
    with av.open(buf, "r", format=container_string) as video:
        for i, frame in enumerate(video.decode(video=0)):
            if sequence_length <= i < sequence_length * 2:
                outputs.append(frame.to_rgb().to_ndarray().astype(np.uint8))

            if i >= sequence_length * 2:
                break

        video.streams.video[0].close()

@hmaarrfk
Contributor Author

hmaarrfk commented Aug 1, 2023

No, I just explicitly call close().
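When several resources need that explicit-close discipline, contextlib.ExitStack can centralize it so nothing is forgotten on error paths. This is a sketch, not a PyAV API; `open_container` is a hypothetical factory, e.g. `functools.partial(av.open, path)`:

```python
from contextlib import ExitStack


def decode_all(open_container):
    # Decode every frame, guaranteeing explicit closes even on exceptions.
    # `open_container` returns a container-like object with .streams.video,
    # .demux(), and .close().
    with ExitStack() as stack:
        container = open_container()
        stack.callback(container.close)
        stream = container.streams.video[0]
        # Callbacks run LIFO, so the stream is closed before the container.
        stack.callback(stream.close)
        n = 0
        for packet in container.demux(stream):
            n += len(stream.decode(packet))
        return n
```

The LIFO ordering mirrors the manual pattern in this thread: close the stream first, then the container.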

@bfreskura

Ok, thanks.

I later found out the memory leak only occurs with the libx265 codec. Everything is fine when using libx264, mpeg2video, mpeg1video, or libvpx-vp9.

@meakbiyik

meakbiyik commented Aug 18, 2023

I can also reproduce this: without stream.close(), the memory leaks.

It seems to happen when I process the frames in a different process, and stream.close() causes everything to hang if I use more than one process. Multiple bugs bundled into one 😅

@RoyaltyLJW

Hi @meakbiyik,
Have you found any solution to this problem?
I find that if I break before all the frames are extracted, stream.close() causes everything to hang.

@meakbiyik

Hey @RoyaltyLJW, I still use stream.close(), and I was able to fix the deadlock issue by setting the environment variable PYAV_LOGGING=off. Here's that bug: #751
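A minimal sketch of that workaround; a detail assumed from the discussion in #751 is that the variable should be set before av is imported:

```python
import os

# Disable PyAV's logging bridge before `import av`; per issue #751 this
# avoids the deadlock observed when stream.close() is called under
# multiprocessing.
os.environ["PYAV_LOGGING"] = "off"

# import av  # must come after the environment variable is set
```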

@RoyaltyLJW

@meakbiyik Thanks a lot. It fixed my deadlock issue.

@hmaarrfk
Contributor Author

Not stale

@watzmonium

I also experienced the same issue - when deployed in Docker, not closing the stream causes memory instability. @hmaarrfk you saved my life with this extremely niche bug report.

@WyattBlue WyattBlue changed the title Memory leak in if mp4 container not explicitely closed Memory leak if stream is not explicitly closed Jul 28, 2024
@WyattBlue WyattBlue added bug and removed build labels Jul 28, 2024
@zhiltsov-max

zhiltsov-max commented Aug 1, 2024

Hi, I just encountered a similar problem on repeated video decoding. It seems that adding stream.codec_context.close() helps. Inspired by this SO answer.

The code I'm using:

import av  # v9.2.0 in my case


def iterate_frames(path: str, *, with_threads: bool = False):
    with av.open(path) as container:
        stream = container.streams.video[0]

        if with_threads:
            stream.thread_type = 'AUTO'

        for packet in container.demux(stream):
            for frame in packet.decode():
                yield frame

        # stream.codec_context.close() # seems to be the fix

while True:
    for frame in iterate_frames("myvideo.mp4", with_threads=True):
        pass

I can still see a slight memory increase after many iterations, but it fluctuates within a range of several MB, while without the fixing line the memory consumption rockets up into the gigabytes.
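One caveat with the snippet above: iterate_frames is a generator, so if the consumer breaks out of the loop early, any code placed after the for loop (including the commented-out close) only runs when the generator happens to be finalized. Moving the cleanup into try/finally makes it deterministic; this is a sketch against a container-shaped object, not a guaranteed PyAV idiom:

```python
def iterate_frames_safe(container):
    # Yields decoded frames; the finally clause also runs when the caller
    # abandons the generator early (GeneratorExit), so the codec context is
    # closed deterministically. `container` is an already-opened av container
    # or anything with the same shape.
    stream = container.streams.video[0]
    try:
        for packet in container.demux(stream):
            yield from packet.decode()
    finally:
        stream.codec_context.close()
```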
