[FEATURE] - Combined with depth map to achieve bokeh blur to obs source #86
Hi @Danjuanlab. Great suggestion. Offline I've been working on a fast hexagonal bokeh blur algorithm that is getting close to being ready for primetime, and it will definitely be included as a new blur effect once I'm happy with it. Unfortunately, for real-time blurring, hexagonal bokeh is the only (interesting) straightforward bokeh effect, as it can be split into a 2-pass algorithm. I suppose you could do a square/rectangular bokeh as well, though it's not as interesting. We might also be able to do circular bokeh, but that gets a bit more tricky. Using an arbitrary defined shape is possible, though it would take a modern, higher-end GPU (think RTX 3080+). Regarding a depth map, this isn't something I've thought about too much; however, if we already have a depth map (e.g. some other plugin is generating it from, say, Kinect data or some AI model), and we can pass it into the blur plugin, it would be pretty straightforward to do.
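To illustrate why hexagonal bokeh is amenable to a small number of directional passes, here is a rough CPU sketch with NumPy. This is only my own illustration of the decomposition (a vertical pass, then two roughly ±60° diagonal passes averaged), not the plugin's actual shader; the function names are mine, and wrap-around edges via `np.roll` are a simplification a real shader would not use.

```python
import numpy as np

def directional_blur(img, dx, dy, steps):
    """1-D box blur: average `steps` samples along direction (dx, dy).
    Samples are taken by shifting the image with np.roll (wrap-around
    edges -- fine for a sketch, not for production)."""
    acc = np.zeros_like(img, dtype=np.float64)
    for i in range(steps):
        acc += np.roll(img, shift=(round(i * dy), round(i * dx)), axis=(0, 1))
    return acc / steps

def hex_bokeh(img, radius):
    """Approximate hexagonal bokeh from three directional blurs:
    a vertical pass, then two diagonal passes over its result, averaged.
    Each vertical+diagonal pair sweeps out a rhombus; averaging the two
    rhombi roughly fills a hexagonal footprint."""
    v = directional_blur(img, 0.0, 1.0, radius)        # vertical edge
    d1 = directional_blur(v, 0.866, -0.5, radius)      # ~+60 degree diagonal
    d2 = directional_blur(v, -0.866, -0.5, radius)     # ~-60 degree diagonal
    return (d1 + d2) / 2.0
```

Because every pass is a 1-D average, the cost scales with the blur radius rather than its square, which is what makes this shape practical in real time.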
Thanks for your reply. In most cases, background blurring is used for seated green-screen live broadcasts. The bokeh blur I'm describing, which combines a TOF sensor with a depth map, is mainly for standing, full-body green-screen keying broadcasts, such as on the TikTok and Instagram platforms. I want to take the live-broadcast effect one step further: the virtual background would be a super-realistic video rendered from a 3D scene (without any depth of field), together with a pre-rendered sequence of depth maps sweeping the camera's focus point from front to back. The distance from the subject to the camera, as reported by an external TOF sensor, can then be converted into a specific depth map through a simple function. The rendered background video is combined with that depth map to perform the bokeh blur, which leads to better blending. This is a bit like fixed-camera virtual production. I am testing and working hard in this direction~

Also, considering how computationally expensive bokeh blur is, I think it is totally fine to start by stabilizing the hexagonal bokeh. In practice, a particularly large bokeh spot requires a particularly large virtual scene or real background, and that use case is still relatively rare.
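The "simple function" converting a TOF distance into one frame of the pre-rendered depth-map sequence could be as small as a normalize-clamp-scale, sketched below. All parameter names and the near/far range are illustrative assumptions, not values from the thread.

```python
def tof_to_depth_frame(distance_m, near_m, far_m, num_frames):
    """Map a TOF-reported subject distance (meters) to an index into a
    pre-rendered depth-map sequence, where frame 0 has the focus at
    `near_m` and the last frame at `far_m`. Readings outside the range
    are clamped so the index is always valid."""
    t = (distance_m - near_m) / (far_m - near_m)   # normalize to 0..1
    t = min(max(t, 0.0), 1.0)                      # clamp out-of-range readings
    return round(t * (num_frames - 1))
```

For example, with a 0.5 m to 3.0 m range and 60 rendered frames, a subject at 0.5 m selects frame 0 and a subject at or beyond 3.0 m selects frame 59.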
Very interesting! And it sounds like something we could incorporate. Do you happen to have any example video and the corresponding depth-map video? Or even a still shot of each? I'm assuming the depth map is greyscale, where the white-to-black value represents the depth? If you can provide some sample video, I'd love to start playing with it. As an initial test, we could allow the user to provide a depth map to the "tilt-shift" blur algorithm and use that for the amount of blur.
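Driving the blur amount from a greyscale depth map could look roughly like the CPU sketch below: precompute a few blur levels and pick one per pixel based on how far that pixel's depth is from the focal plane. This is a hedged stand-in for what a shader would do per fragment, not the plugin's implementation; a simple box blur substitutes for the real bokeh kernel.

```python
import numpy as np

def box_blur(img, r):
    """Naive (2r+1)x(2r+1) box blur via shifted copies; r=0 is a no-op."""
    if r == 0:
        return img
    acc = np.zeros_like(img, dtype=np.float64)
    n = 0
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            acc += np.roll(img, (dy, dx), axis=(0, 1))
            n += 1
    return acc / n

def depth_weighted_blur(img, depth, focus_depth, max_radius):
    """Per-pixel blur strength from a greyscale depth map in 0..1.
    Pixels at `focus_depth` stay sharp; blur radius grows with
    |depth - focus_depth|, up to `max_radius`."""
    levels = [box_blur(img, r) for r in range(max_radius + 1)]
    strength = np.clip(np.abs(depth - focus_depth) * max_radius, 0, max_radius)
    idx = np.rint(strength).astype(int)            # nearest precomputed level
    out = np.zeros_like(img, dtype=np.float64)
    for r in range(max_radius + 1):
        mask = idx == r
        out[mask] = levels[r][mask]
    return out
```

A GPU version would instead compute the radius in the shader and sample a mip chain or vary the kernel width directly, but the mapping from depth to blur strength is the same idea.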
https://drive.google.com/file/d/1GARXns-5-9pk1lIuTgOqNtd2k7qvY4fR/view?usp=sharing
Fantastic. I'll take a look (this might take a couple of weeks, as I'm a bit backlogged at the moment, but this gives me some good test data to work with).
Bokeh blur simulates the aperture blur of a real camera. In green-screen live broadcasts, the distance reported by an external TOF sensor can be mapped to one of a set of pre-rendered depth maps, and that depth map can then be used to apply bokeh blur to the keyed background, achieving a more realistic composite.

Currently, I use the glfx.js library to add lens blur to videos in the browser. On the integrated graphics of an Intel 8400 CPU, a roughly 1080p30 source drops to 6-7 frames per second after processing.

So, is it possible to implement this depth map + bokeh blur in real time? If so, it would be a real improvement for green-screen live broadcasts.
Here is the frames demo: https://drive.google.com/file/d/1kC__OF7FJfEUdqvfR5qis4hmy5Q2_Zqv/view?usp=sharing