
feat: Use C++ OpenGL GPU VideoPipeline again #1836

Merged · 10 commits into main · Sep 22, 2023

Conversation

@mrousavy (Owner) commented on Sep 22, 2023

What

This PR consists of 2 parts:

  1. Reverts 4e96eb7 (PR #1789, "perf: Use ImageWriter instead of OpenGL Pipeline for faster processing") to bring back the C++ OpenGL GPU Pipeline.
  2. Fixes the "initHybrid JNI not found" error by loading the native JNI/C++ library in VideoPipeline.kt (see the sketch below).
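
For reference, here is a minimal sketch of what part 2 boils down to. This is not the actual VideoPipeline.kt; the library name "VisionCamera" and the initHybrid signature are assumptions for illustration. The point is that the shared C++ library has to be loaded before any external JNI function is resolved.

```kotlin
// Minimal sketch, not the actual VideoPipeline.kt.
// Assumed library name ("VisionCamera") and assumed initHybrid signature.
class VideoPipeline(width: Int, height: Int) {
  companion object {
    init {
      // Load the native JNI/C++ library before any `external` function is called,
      // otherwise the JVM throws UnsatisfiedLinkError ("initHybrid JNI not found").
      System.loadLibrary("VisionCamera")
    }
  }

  // Implemented in C++ and registered via JNI; the signature here is illustrative.
  private external fun initHybrid(width: Int, height: Int): Long

  private val nativeHandle: Long = initHybrid(width, height)
}
```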

The original plan was to use an ImageReader -> ImageWriter setup since that:

  • ..seems more efficient (it just passes Images/HardwareBuffers around instead of doing render passes)
  • ..seems simpler to use (a single queueInputImage call instead of an entire OpenGL pipeline setup)
  • ..is more aligned with how it works on iOS (a CMSampleBuffer just being passed around)
  • ..and uses the native (PRIVATE) pixelFormat. (It supports true native and yuv pixelFormats, whereas the OpenGL pipeline currently always operates in RGB.)

The problem is that ImageReader is designed for CPU access, and when I pass Images along to the ImageWriter to send them into the MediaRecorder, it just fails. This sucks. I have no idea what I'm doing wrong.

^ If you know anyone who could help here, please let me know. Otherwise we just have to stick with the OpenGL pipeline. A rough sketch of the setup I tried is below.
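
This is an illustrative sketch under assumed names and buffer counts, not the actual code from #1789: the camera renders into the ImageReader's Surface, and every Image is forwarded to an ImageWriter that targets the MediaRecorder's input Surface, with no render pass in between.

```kotlin
import android.graphics.ImageFormat
import android.media.ImageReader
import android.media.ImageWriter
import android.media.MediaRecorder
import android.view.Surface

// Sketch only: forward frames from an ImageReader straight into the
// MediaRecorder's input Surface via an ImageWriter (no OpenGL involved).
// In practice this is the setup that fails, as described above.
fun createPassthroughPipeline(recorder: MediaRecorder, width: Int, height: Int): Surface {
  // PRIVATE keeps the buffers GPU-/HAL-backed instead of forcing a CPU-readable format.
  val reader = ImageReader.newInstance(width, height, ImageFormat.PRIVATE, /* maxImages */ 3)
  val writer = ImageWriter.newInstance(recorder.surface, /* maxImages */ 3)

  reader.setOnImageAvailableListener({ r ->
    val image = r.acquireNextImage() ?: return@setOnImageAvailableListener
    // Hand the buffer over to the encoder's Surface.
    // queueInputImage takes ownership of the Image and closes it.
    writer.queueInputImage(image)
  }, /* handler */ null)

  // The Camera session would attach this Surface as its output.
  return reader.surface
}
```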

Downsides of this PR/the C++ OpenGL GPU Video Pipeline:

  • Is probably less efficient than a simple ImageReader -> ImageWriter pipeline, since we do actual render passes instead of just moving buffers around
  • Is really hard to use and complex to maintain (tons of C++ code and OpenGL/GPU/shader plumbing)
  • Requires lots of boilerplate code (fragment shaders, vertex shaders, OpenGL setup, JNI bindings, ..); see the shader sketch after this list
  • Does not support the native and yuv pixel formats; it crashes when you pass pixelFormat="yuv"!
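
To give a sense of the boilerplate listed above, here is roughly what the passthrough shaders of such a pipeline look like. This is an illustrative sketch, not the pipeline's actual GLSL, embedded as Kotlin string constants here purely for illustration (the real pipeline lives in C++): sampling through samplerExternalOES is what lets the GPU read the camera's external/PRIVATE buffers, but the fragment stage always writes RGBA, which is why pixelFormat="yuv" can't pass through this path.

```kotlin
// Illustrative passthrough shaders (not the pipeline's actual GLSL).
// samplerExternalOES can consume the camera's external/PRIVATE buffers,
// but gl_FragColor is always RGBA, so nothing YUV can come out of this path.
const val PASSTHROUGH_VERTEX_SHADER = """
  attribute vec4 aPosition;
  attribute vec2 aTexCoord;
  varying vec2 vTexCoord;
  void main() {
    gl_Position = aPosition;
    vTexCoord = aTexCoord;
  }
"""

const val PASSTHROUGH_FRAGMENT_SHADER = """
  #extension GL_OES_EGL_image_external : require
  precision mediump float;
  varying vec2 vTexCoord;
  uniform samplerExternalOES uTexture;
  void main() {
    gl_FragColor = texture2D(uTexture, vTexCoord);
  }
"""
```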

I think the Android Media APIs are just not as advanced as iOS' Media APIs yet.
We have to settle for the OpenGL pipeline for now, which uses RGB and might be a bit slower than just passing native Images around, but whatever.

I've spent way too much time on this already; no other Camera library out there is this advanced. I read through thousands of pages of API reference, guides (not that there are many), documentation, issues, and mostly: C++ code.

Changes

Tested on

Related issues

Revert "perf: Use ImageWriter instead of OpenGL Pipeline for faster processing (#1789)"

Reverts the commit 4e96eb7 (PR #1789) again to use the C++ OpenGL pipeline.

Currently we don't support YUV or PRIVATE because OpenGL just works in RGB.
@mrousavy (Owner, Author):

I would prefer to move away from OpenGL to instead use something that works similar to how iOS works by just passing GPU buffers around, but it's not that simple. Android.media APIs just aren't as good yet. See this issue ($4.000 bounty!) for more information: #1837

textureExternalOES can do PRIVATE
@mrousavy mrousavy merged commit 9add0eb into main Sep 22, 2023
@mrousavy mrousavy deleted the feat/use-opengl-again branch September 22, 2023 15:22
isaaccolson pushed a commit to isaaccolson/deliveries-mobile that referenced this pull request Oct 30, 2024
1. Reverts mrousavy@4e96eb7 (PR mrousavy#1789) to bring the C++ OpenGL GPU Pipeline back.
2. Fixes the "initHybrid JNI not found" error by loading the native JNI/C++ library in `VideoPipeline.kt`.

This PR has two downsides:

1. `pixelFormat="yuv"` does not work on Android. OpenGL only works in RGB
2. OpenGL rendering is fast, but it has an overhead. I think for Camera -> Video Recording we shouldn't be using an entire OpenGL rendering pipeline.

The original plan was to use something similar to how it works on iOS by just passing GPU buffers around, but the android.media APIs just aren't as advanced yet. `ImageReader`/`ImageWriter` is way too buggy and doesn't really work with `MediaRecorder`/`MediaCodec`.

This sucks, I hope in the future we can use something like `AHardwareBuffer`s.