JSI Frame Processors (iOS) #2
Conversation
EDIT: Got it. Finally managed to call Swift code from Objective-C++.

Original comment: I'm having trouble calling the Swift function from the Objective-C++ file. :( Apparently CocoaPods renames the ObjC Generated Interface Header Name from … Mission: I want to call a Swift function (actually just use the … If anyone has an idea, please let me know as this is blocking me a lil bit.

EDIT: Wtf am I doing wrong?
this is awesome. I don't have time to contribute at the moment, but I wanted to drop in here and cheer you on 🙌

🚀🎉 Got it running successfully: Screen.Recording.2021-03-11.at.10.54.51.mov

This is exciting. Seems like a good abstraction of worklets in general.

My PR at reanimated was finally merged!: software-mansion/react-native-reanimated@815847e 🎉🎉
force-pushed from 7676923 to eb5c6c3
force-pushed from 9123008 to a322700
force-pushed from 0dc99a1 to 4c1856a
example/src/App.tsx (Outdated)

@@ -213,6 +219,14 @@ export const App: NavigationFunctionComponent = ({ componentId }) => {
  console.log('re-rendering camera page without active camera');
}

const frameProcessor = useFrameProcessor((frame) => {
[tsc] <6133> reported by reviewdog 🐶
'frameProcessor' is declared but its value is never read.
example/src/App.tsx (Outdated)

@@ -234,6 +248,12 @@ export const App: NavigationFunctionComponent = ({ componentId }) => {
  onError={onError}
  enableZoomGesture={false}
  animatedProps={cameraAnimatedProps}
  frameProcessor={(frame) => {
[tsc] <7006> reported by reviewdog 🐶
Parameter 'frame' implicitly has an 'any' type.
example/src/App.tsx (Outdated)

@@ -213,6 +217,12 @@ export const App: NavigationFunctionComponent = ({ componentId }) => {
  console.log('re-rendering camera page without active camera');
}

const frameProcessor = useFrameProcessor((frame) => {
[tsc] <6133> reported by reviewdog 🐶
'frame' is declared but its value is never read.
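Both reviewdog findings above are ordinary TypeScript diagnostics. As a hedged sketch (the library's real `Frame` type isn't shown in this thread, so a stand-in interface is used), this is how they would typically be resolved:

```typescript
// Sketch only: `Frame` here is a hypothetical stand-in for the library's
// real frame type, which is not shown in this thread.
interface Frame {
  width: number;
  height: number;
}

// tsc 7006 ("Parameter 'frame' implicitly has an 'any' type") goes away
// once the parameter is annotated (or the prop's type supplies it):
const frameProcessor = (frame: Frame): string => {
  // tsc 6133 ("declared but its value is never read") goes away once the
  // declared value is actually used:
  return `frame is ${frame.width}x${frame.height}`;
};

console.log(frameProcessor({ width: 1280, height: 720 }));
```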
Update (3.5.2021)

Frame Processors are ready for iOS 🎉. Here's what actually happened:

- A custom `AVCaptureVideoDataOutput` delegate which uses a custom `AVAssetWriter` to write the video files (turned out quite complex!), since you cannot use the `AVCaptureMovieFileOutput` and a `AVCaptureVideoDataOutput` (for frame processing) delegate at the same time.
- A custom `AVCaptureAudioDataOutput` delegate which uses a custom `AVAssetWriter` for the same reason as above
- … (`jsi::Value`)
- … (`EXC_BAD_ACCESS` errors), and submitted a PR to reanimated to fix those.

Right now I have compared RAM and CPU usage between current `main` (ae1dde1) and current `frame-processor` branch, and noticed that with the custom `AVCaptureVideoDataOutput` delegate the memory usage has increased from 58 MB to 187 MB (even with all the frame-processor code removed). That's a very high increase in memory which is currently blocking me from merging this thing. I have tried debugging this to find out where it comes from, but couldn't figure it out (yet). I've posted this to stackoverflow for now: https://stackoverflow.com/q/67370456/5281431 - if anyone has an idea, lmk.

If you want, you can already write frame processor plugins. They're fairly easy to write, as you're probably just using some API anyways, and you just have to forward those calls then.
As for Android, I'll do that in a separate PR. I'll wait for #1960 and #1879 to be merged before, since those PRs change a lot of reanimated's Android infra.
Original comment:
Frame Processors
Implements "frame processors".
Frame Processors are functions created in JS that can be used to process frames from the Camera.
They will be called for each frame the camera sees, and can analyse the current frame using simple JS code. They can call other JS functions, but those have to be workletized (achieved with Reanimated).
Example use-cases include (but are not limited to):
Other functions (such as AI stuff) have to be either workletized, or implemented natively using JSI (C++).
In other words, they're just like Reanimated worklets, but on a separate JS-Runtime specifically created for the Camera (can be explained as some sort of JS multithreading)
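The calling model described above can be simulated in plain TypeScript. This is NOT the real API: in the library the camera delivers native frames and the function runs as a worklet on a separate JS runtime, so both the frame source and the `Frame` shape below are mocked assumptions.

```typescript
// Conceptual simulation of the frame-processor calling model -- the camera
// side and the Frame shape are mocked, not the library's real types.
interface Frame {
  timestamp: number;
  isQRCode: boolean; // stand-in for some per-frame analysis result
}

type FrameProcessor = (frame: Frame) => void;

// Stand-in for the native side: invokes the processor once per frame.
function runCameraSession(frames: Frame[], processor: FrameProcessor): void {
  for (const frame of frames) {
    processor(frame); // called for *each* frame the camera "sees"
  }
}

let qrCodesSeen = 0;
const frameProcessor: FrameProcessor = (frame) => {
  // Simple per-frame JS logic, as in the example use-cases above.
  if (frame.isQRCode) qrCodesSeen++;
};

runCameraSession(
  [
    { timestamp: 0, isQRCode: false },
    { timestamp: 33, isQRCode: true },
    { timestamp: 66, isQRCode: true },
  ],
  frameProcessor
);
console.log(qrCodesSeen); // 2
```

The point of the separate runtime is that this per-frame loop never blocks the main React runtime.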
Notes

- Frame processors can update `SharedValue`s (e.g. for updating a QR code's frame which will be displayed), but unlike reanimated worklets, the frame processor will not be called when a `SharedValue` changes (it's not a mapper). I still have to make sure all thread-checkers are correctly set up for this, see my PR "Fix Threading issues (SV access from different Thread)" software-mansion/react-native-reanimated#1883.

Frame
The `Frame` object should represent a single frame from the camera. I'm not entirely sure yet what properties this frame will have, but ideally I want to support:

- `pixels`: An array of pixels this camera sees. This should be in the size of the Preview View, not some huge format like 4k because that would result in each frame object having like 25MB of space. You can read pixels like an array: `frame.pixels[0]` -> `{ r: number, g: number, b: number }` - I am not sure how this will work on the platforms since they don't stream in RGB but rather YUV (Android) or some other color space (iOS).
- `depth`: An array of depth information for each pixel. Not sure if I can embed that into the `pixels` array or have it separately. We'll see.

Base Plugins
Since at release there are no cool & easy to use frame processor plugins available, I thought of creating base plugins that are always supported. Either in this lib or in a separate package/repo.
- … the `Frame`.

Maybe I can also use iOS/Android specific APIs for faster execution speed, e.g. Metal or iOS AI tools.
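The `frame.pixels` idea from the Frame section above could be sketched like this. Everything here is a hypothetical mock (the real Frame API was not final), but the conversion addresses the open question: cameras stream YUV (Android) or another color space (iOS), so bytes would have to be converted before exposing `{ r, g, b }`.

```typescript
// Hypothetical sketch of the proposed `frame.pixels` accessor.
interface RGB { r: number; g: number; b: number; }

const clamp = (v: number): number => Math.max(0, Math.min(255, Math.round(v)));

// Standard BT.601 full-range YUV -> RGB conversion.
function yuvToRgb(y: number, u: number, v: number): RGB {
  return {
    r: clamp(y + 1.402 * (v - 128)),
    g: clamp(y - 0.344136 * (u - 128) - 0.714136 * (v - 128)),
    b: clamp(y + 1.772 * (u - 128)),
  };
}

// Mock Frame wrapper: converts per pixel on access, so a 4k buffer is
// never materialized as one huge RGB array up front (the 25MB concern).
class MockFrame {
  // yuv holds interleaved [y, u, v] triplets, one per pixel.
  constructor(private yuv: Uint8Array) {}

  pixel(index: number): RGB {
    const i = index * 3;
    return yuvToRgb(this.yuv[i], this.yuv[i + 1], this.yuv[i + 2]);
  }
}

const frame = new MockFrame(new Uint8Array([255, 128, 128, 0, 128, 128]));
console.log(frame.pixel(0)); // { r: 255, g: 255, b: 255 }
console.log(frame.pixel(1)); // { r: 0, g: 0, b: 0 }
```

Lazy per-pixel decoding like this is also roughly what a `jsi::HostObject` with custom accessors (mentioned under Tasks below) would enable on the native side.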
Implementation

- A `HostObject` for the Camera that "installs" the JSI bindings, in our case something like `setFrameProcessorForCamera(viewTag: number, frameProcessor: FrameProcessor)`
- … the `frameProcessor` parameter on a separate thread using a Reanimated API. See: Multithreading with worklets software-mansion/react-native-reanimated#1561
- … `jsi::Function`.
- … the `Camera.ts` view where the result gets memoized.

Swift <-> C++
Since the library is written in Swift, we need a way to interop with JSI (written in C++).
I've found the following solutions:
- … an `extern "C"` block so I can call it from Swift. I have to be careful with `UnsafePointer<T>` and deallocating the objects, since that is not done automatically (no ARC).

I really wish Swift had direct C++ interop. This incompatibility is making this whole thing really really hard.
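The JS-visible shape of the binding proposed under Implementation above can be mocked in plain TypeScript. In the real PR the global is installed from C++ via JSI; here a plain map stands in for the native registry, and `deliverFrame` is an invented stand-in for the native camera callback, so the control flow can be followed end to end.

```typescript
// Mock of the JS-visible shape of the proposed JSI binding
// `setFrameProcessorForCamera(viewTag, frameProcessor)`. The registry and
// deliverFrame are assumptions, not the library's real internals.
type FrameProcessor = (frame: { timestamp: number }) => void;

const registry = new Map<number, FrameProcessor>();

// What the installed binding would look like from JS:
function setFrameProcessorForCamera(viewTag: number, fp: FrameProcessor): void {
  registry.set(viewTag, fp);
}

// Stand-in for the native camera delivering a frame to the view's processor.
function deliverFrame(viewTag: number, timestamp: number): void {
  registry.get(viewTag)?.({ timestamp });
}

const seen: number[] = [];
setFrameProcessorForCamera(42, (frame) => seen.push(frame.timestamp));
deliverFrame(42, 1000);
deliverFrame(42, 1033);
console.log(seen); // [1000, 1033]
```

Keying by `viewTag` is what lets one installed global serve multiple Camera views, which is why the binding takes the tag rather than being a method on the view itself.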
Tasks

Following is already done:

- … (roadblock: Use dependency with non-modular-headers in library where I don't need the modular headers CocoaPods/CocoaPods#10472)
- … the `NativeReanimatedModule` instance to call `makeShareable` ("workletize") (roadblock: No idea how to get a TurboModule outside of a TurboModule context.)
- … a `jsi::HostObject` with custom accessors, group bytes in color format?
- … (`jsi::HostObject`)

Testing
Since this PR required a few refactors and restructures of the react-native-reanimated library, I have created a PR over there: software-mansion/react-native-reanimated#1790. Because that PR is not merged yet, you have to have those changes locally - either install react-native-reanimated directly from that GitHub branch, or install it normally through npm (2.0.0), download the repo from my PR's branch, and drag the `ios` and `Common` files from my PR into your `react-native-vision-camera/example/node_modules/react-native-reanimated` folder.

Android and `.aar`s

Android is a bit more complicated than iOS because react-native-reanimated is not distributed from Source. I will have to play around with CMake to try and get the react-native-reanimated.aar file (that lives on the end-user's machine) embedded into my library, no idea if that's even possible.
Maybe useful links: