feature: Frame Processors (iOS) (#2)

* Clean up Frame Processor

* Create FrameProcessorHolder

* Create FrameProcessorDelegate in ObjC++

* Move frame processor to FrameProcessorDelegate

* Decorate runtime, check for null

* Update FrameProcessorDelegate.mm

* Cleanup FrameProcessorBindings.mm

* Fix RuntimeDecorator.h import

* Update FrameProcessorDelegate.mm

* "React" -> "React Helper" to avoid confusion

* Rename folders again

* Fix podspec flattening a lot of headers, causing REA nameclash

* Fix header imports to avoid REA naming collision

* Lazily initialize jsi::Runtime on DispatchQueue

* Install frame processor bindings from Swift

* First try to call jsi::Function (frame processor) 👀

* Call viewForReactTag on RCT main thread

* Fix bridge accessing

* Add more logs

* Update CameraViewManager.swift

* Add more TODOs

* Re-indent .cpp files

* Fix RCTTurboModule import podspec

* Remove unnecessary include check for swift umbrella header

* Merge branch 'main' into frame-processors

* Docs: use static width for images (283)

* Create validate-cpp.yml

* Update a lot of packages to latest

* Set SWIFT_VERSION to 5.2 in podspec

* Create clean.sh

* Delete unused C++ files

* podspec: Remove CLANG_CXX_LANGUAGE_STANDARD and OTHER_CFLAGS

* Update pod lockfiles

* Regenerate lockfiles

* Remove IOSLogger

* Use NSLog

* Create FrameProcessorManager (inherits from REA RuntimeManager)

* Create reanimated::RuntimeManager shared_ptr

* Re-integrate pods

* Add react-native-reanimated >=2 peerDependency

* Add metro-config

* blacklist -> exclusionList

* Try to call worklet

* Fix jsi::Value* initializer

* Call ShareableValue::adapt (makeShareable) with React/JS Runtime

* Add null-checks

* Lift runtime manager creation out of delegate, into bindings

* Remove debug statement

* Make RuntimeManager unique_ptr

* Set _FRAME_PROCESSOR

* Extract convertJSIFunctionToFrameProcessorCallback

* Print frame

* Merge branch 'main' into frame-processors

* Reformat Swift code

* Install reanimated from npm again

* Re-integrate Pods

* Dependabot: Also scan example/ and docs/

* Update validate-cpp.yml

* Create FrameProcessorUtils

* Create Frame.h

* Abstract HostObject creation away

* Fix types

* Fix frame processor call

* Add todo

* Update lockfiles

* Add C++ contributing instructions

* Update CONTRIBUTING.md

* Add android/src/main/cpp to cpplint

* Update cpplint.sh

* Fix a few cpplint errors

* Fix globals

* Fix a few more cpplint errors

* Update App.tsx

* Update AndroidLogger.cpp

* Format

* Fix cpplint script (check-cpp)

* Try to simplify frame processor

* y

* Update FrameProcessorUtils.mm

* Update FrameProcessorBindings.mm

* Update CameraView.swift

* Update CameraViewManager.m

* Restructure everything

* fix

* Fix `@objc` export (make public)

* Refactor installFrameProcessorBindings into FrameProcessorRuntimeManager

* Add swift RCTBridge.runOnJS helper

* Fix run(onJS)

* Add pragma once

* Add `&self` to lambda

* Update FrameProcessorRuntimeManager.mm

* reorder imports

* Fix imports

* forward declare

* Rename extension

* Destroy buffer after execution

* Add FrameProcessorPluginRegistry base

* Merge branch 'main' into frame-processors

* Add frameProcessor to types

* Update Camera.tsx

* Fix rebase merge

* Remove movieOutput

* Use `useFrameProcessor`

* Fix bad merge

* Add additional ESLint rules

* Update lockfiles

* Update CameraViewManager.m

* Add support for V8 runtime

* Add frame processor plugins API

* Print plugin invoke

* Fix React Utils in podspec

* Fix runOnJS swift name

* Remove invalid redecl of `captureSession`

* Use REA 2.1.0 which includes all my big PRs 🎉

* Update validate-cpp.yml

* Update Podfile.lock

* Remove Flipper

* Fix dereferencing

* Capture `self` by value. Fucking hell, what a dumb mistake.

* Override a few HostObject functions

* Expose isReady, width, height, bytesPerRow and planesCount

* use hook again

* Expose property names

* FrameProcessor -> Frame

* Update CameraView+RecordVideo.swift

* Add Swift support for Frame Processors Plugins

* Add macros for plugin installation

* Add ObjC frame processor plugin

* Correctly install frame processor plugins

* Don't require custom name for macro

* Check if plugin already exists

* Implement QR Code Frame Processor Plugin in Swift

* Adjust ObjC style frame processor macro

* optimize

* Add `frameProcessorFrameDropRate`

* Fix types

* Only log once

* Log if it executes slowly

* Implement `frameProcessorFps`

* Implement manual encoded video recordings

* Use recommended video settings

* Add fileType types

* Ignore if input is not ready for media data

* Add completion handler

* Add audio buffer sampling

* Init only for video frame

* use AVAssetWriterInputPixelBufferAdaptor

* Remove AVAssetWriterInputPixelBufferAdaptor

* Rotate VideoWriter

* Always assume portrait orientation

* Update RecordingSession.swift

* Use a separate Queue for Audio

* Format Swift

* Update CameraView+RecordVideo.swift

* Use `videoQueue` instead of `cameraQueue`

* Move example plugins to example app

* Fix hardcoded name in plugin macro

* QRFrame... -> QRCodeFrame...

* Update FrameProcessorPlugin.h

* Add example frame processors to JS base

* Update QRCodeFrameProcessorPluginSwift.m

* Add docs to create FP Plugins

* Update FRAME_PROCESSORS_CREATE.mdx

* Update FRAME_PROCESSORS_CREATE.mdx

* Use `AVAssetWriterInputPixelBufferAdaptor` for efficient pixel buffer recycling

* Add customizable `pixelFormat`

* Use native format if available

* Update project.pbxproj

* Set video width and height as source-pixel-buffer attributes

* Catch

* Update App.tsx

* Don't explicitly set video dimensions, let CVPixelBufferPool handle it

* Add a few logs

* Cleanup

* Update CameraView+RecordVideo.swift

* Eagerly initialize asset writer to fix stutter at first frame

* Use `cameraQueue` DispatchQueue to not block CaptureDataOutputDelegate

* Fix duration calculation

* cleanup

* Cleanup

* Swiftformat

* Return available video codecs

* Only show frame drop notification for video output

* Remove photo and video codec functionality

It was too much complexity and probably never used anyways.

* Revert all android related changes for now

* Cleanup

* Remove unused header

* Update AVAssetWriter.Status+descriptor.swift

* Only call Frame Processor for Video Frames

* Fix `if`

* Add support for Frame Processor plugin parameters/arguments

* Fix arg support

* Move to JSIUtils.mm

* Update JSIUtils.h

* Update FRAME_PROCESSORS_CREATE.mdx

* Update FRAME_PROCESSORS_CREATE.mdx

* Upgrade packages for docs/

* fix docs

* Rename

* highlight lines

* docs

* community plugins

* Update FRAME_PROCESSOR_CREATE_FINAL.mdx

* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx

* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx

* Update dependencies (1/2)

* Update dependencies (2/2)

* Update Gemfile.lock

* add FP docs

* Update README.md

* Make `lastFrameProcessor` private

* add `frameProcessor` docs

* fix docs

* adjust docs

* Update DEVICES.mdx

* fix

* s

* Add logs demo

* add metro restart note

* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx

* Mirror video device

* Update AVCaptureVideoDataOutput+mirror.swift

* Create .swift-version

* Enable whole module optimization

* Fix recording mirrored video

* Swift format

* Clean dictionary on `markInvalid`

* Fix cleanup

* Add docs for disabling frame processors

* Update project.pbxproj

* Revert "Update project.pbxproj"

This reverts commit e67861e.

* Log frame drop reason

* Format

* add more samples

* Add clang-format

* also check .mm

* Revert "also check .mm"

This reverts commit 8b9d5e2.

* Revert "Add clang-format"

This reverts commit 7643ac8.

* Use `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange` as default

* Read matching video attributes from videoSettings

* Add TODO

* Swiftformat

* Conditionally disable frame processors

* Assert if trying to use frame processors when disabled

* Add frame-processors demo gif

* Allow disabling frame processors via `VISION_CAMERA_DISABLE_FRAME_PROCESSORS`

* Update FrameProcessorRuntimeManager.mm

* Update FRAME_PROCESSORS.mdx

* Update project.pbxproj

* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx
mrousavy authored May 6, 2021
1 parent 77b3d78 commit b6a67d5
Showing 100 changed files with 4,746 additions and 3,707 deletions.
3 changes: 3 additions & 0 deletions .eslintrc.js
@@ -92,4 +92,7 @@ module.exports = {
  env: {
    node: true,
  },
  globals: {
    _log: 'readonly',
  },
};
31 changes: 31 additions & 0 deletions .github/workflows/validate-cpp.yml
@@ -0,0 +1,31 @@
name: Validate C++

on:
  push:
    branches:
      - main
    paths:
      - '.github/workflows/validate-cpp.yml'
      - 'cpp/**'
  pull_request:
    paths:
      - '.github/workflows/validate-cpp.yml'
      - 'cpp/**'

jobs:
  lint:
    name: cpplint
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: reviewdog/action-cpplint@master
        with:
          github_token: ${{ secrets.github_token }}
          reporter: github-pr-review
          flags: --linelength=230
          targets: --recursive cpp android/src/main/cpp
          filter: "-legal/copyright\
            ,-readability/todo\
            ,-build/namespaces\
            ,-whitespace/comments\
            "
4 changes: 4 additions & 0 deletions .gitignore
@@ -64,3 +64,7 @@ package-lock.json
# TypeDoc/Docusaurus stuff
docs/docs/api
docs/typedoc-sidebar.js

# External native build folder generated in Android Studio 2.2 and later
.externalNativeBuild
.cxx/
12 changes: 12 additions & 0 deletions CONTRIBUTING.md
@@ -13,6 +13,8 @@
yarn bootstrap
```

Read the READMEs in [`android/`](android/README.md), [`ios/`](ios/README.md), and [`cpp/`](cpp/README.md) for a quick overview of the development workflow.

> You can also open VisionCamera in [a quick online editor (github1s)](https://github1s.com/cuvent/react-native-vision-camera)
### iOS
@@ -22,12 +24,22 @@
3. Select your device in the devices drop-down
4. Hit run

> Run `yarn check-ios` to validate codestyle
### Android

1. Open the `example/android/` folder with Android Studio
2. Select your device in the devices drop-down
3. Hit run

> Run `yarn check-android` to validate codestyle
### C++

The C++ codebase is shared between Android and iOS. This means you can make changes to those files in either the Android example or the iOS example, but make sure to test changes on both platforms.

> Run `yarn check-cpp` to validate codestyle
## Committing

We love to keep our codebases clean. To achieve that, we use linters and formatters which output errors when something isn't formatted the way we like it to be.
2 changes: 1 addition & 1 deletion README.md
@@ -41,7 +41,7 @@
* Photo, Video and Snapshot capture
* Customizable devices and multi-cameras (smoothly zoom out to "fish-eye" camera)
* Customizable FPS
* Frame Processors (JS worklets to run QR-Code scanning, facial recognition, AI object detection, realtime video chats and more) (**Work in progress: [#2](https://github.com/cuvent/react-native-vision-camera/pull/2)**)
* Frame Processors (JS worklets to run QR-Code scanning, facial recognition, AI object detection, realtime video chats and more)
* Smooth zooming (Reanimated)
* Fast pause and resume
* HDR & Night modes
29 changes: 28 additions & 1 deletion VisionCamera.podspec
@@ -13,7 +13,34 @@ Pod::Spec.new do |s|
  s.platforms = { :ios => "11.0" }
  s.source = { :git => "https://github.com/cuvent/react-native-vision-camera.git", :tag => "#{s.version}" }

  s.source_files = "ios/**/*.{h,m,mm,swift}"
  s.pod_target_xcconfig = {
    "DEFINES_MODULE" => "YES",
    "USE_HEADERMAP" => "YES",
    "HEADER_SEARCH_PATHS" => "\"$(PODS_TARGET_SRCROOT)/ReactCommon\" \"$(PODS_TARGET_SRCROOT)\" \"$(PODS_ROOT)/Headers/Private/React-Core\" "
  }
  s.requires_arc = true

  # All source files that should be publicly visible
  # Note how this does not include headers, since those can nameclash.
  s.source_files = [
    "ios/**/*.{m,mm,swift}",
    "ios/CameraBridge.h",
    "ios/Frame Processor/FrameProcessorCallback.h",
    "ios/Frame Processor/FrameProcessorRuntimeManager.h",
    "ios/Frame Processor/FrameProcessorPluginRegistry.h",
    "ios/Frame Processor/FrameProcessorPlugin.h",
    "ios/React Utils/RCTBridge+runOnJS.h",
    "cpp/**/*.{cpp}",
  ]
  # Any private headers that are not globally unique should be mentioned here.
  # Otherwise there will be a nameclash, since CocoaPods flattens out any header directories
  # See https://github.com/firebase/firebase-ios-sdk/issues/4035 for more details.
  s.preserve_paths = [
    "cpp/**/*.h",
    "ios/**/*.h"
  ]

  s.dependency "React-callinvoker"
  s.dependency "React"
  s.dependency "React-Core"
end
10 changes: 0 additions & 10 deletions android/src/main/java/com/mrousavy/camera/CameraView.kt
@@ -336,16 +336,6 @@ class CameraView(context: Context) : FrameLayout(context), LifecycleOwner {
}
}

fun getAvailablePhotoCodecs(): WritableArray {
// TODO
return Arguments.createArray()
}

fun getAvailableVideoCodecs(): WritableArray {
// TODO
return Arguments.createArray()
}

override fun onLayout(changed: Boolean, left: Int, top: Int, right: Int, bottom: Int) {
super.onLayout(changed, left, top, right, bottom)
Log.i(TAG, "onLayout($changed, $left, $top, $right, $bottom) was called! (Width: $width, Height: $height)")
16 changes: 0 additions & 16 deletions android/src/main/java/com/mrousavy/camera/CameraViewModule.kt
@@ -91,22 +91,6 @@ class CameraViewModule(reactContext: ReactApplicationContext) : ReactContextBase
}
}

@ReactMethod
fun getAvailableVideoCodecs(viewTag: Int, promise: Promise) {
withPromise(promise) {
val view = findCameraView(viewTag)
view.getAvailableVideoCodecs()
}
}

@ReactMethod
fun getAvailablePhotoCodecs(viewTag: Int, promise: Promise) {
withPromise(promise) {
val view = findCameraView(viewTag)
view.getAvailablePhotoCodecs()
}
}

// TODO: This uses the Camera2 API to list all characteristics of a camera device and therefore doesn't work with Camera1. Find a way to use CameraX for this
// https://issuetracker.google.com/issues/179925896
@ReactMethod
31 changes: 31 additions & 0 deletions cpp/MakeJSIRuntime.h
@@ -0,0 +1,31 @@
#pragma once

#include <jsi/jsi.h>
#include <memory>

#if __has_include(<hermes/hermes.h>)
// Hermes (https://hermesengine.dev)
#include <hermes/hermes.h>
#elif __has_include(<v8runtime/V8RuntimeFactory.h>)
// V8 (https://github.com/Kudo/react-native-v8)
#include <v8runtime/V8RuntimeFactory.h>
#else
// JSC
#include <jsi/JSCRuntime.h>
#endif

using namespace facebook;

namespace vision {

static std::unique_ptr<jsi::Runtime> makeJSIRuntime() {
#if __has_include(<hermes/hermes.h>)
  return facebook::hermes::makeHermesRuntime();
#elif __has_include(<v8runtime/V8RuntimeFactory.h>)
  return facebook::createV8Runtime("");
#else
  return facebook::jsc::makeJSCRuntime();
#endif
}

} // namespace vision
26 changes: 26 additions & 0 deletions cpp/README.md
@@ -0,0 +1,26 @@
# cpp

This folder contains the Shared C++ code for react-native-vision-camera.

## Prerequisites

1. For Android, download the [NDK and build tools](https://developer.android.com/studio/projects/add-native-code#download-ndk)
2. For iOS, Xcode will be enough.
3. Install cpplint
```sh
brew install cpplint
```

## Getting Started

It is recommended that you work on the code using the Example project (`example/android/` or `example/ios/VisionCameraExample.xcworkspace`), since that always includes the React Native header files, plus you can easily test changes that way.

You can however still edit the library project here by opening this folder with any C++ editor.

## Committing

Before committing, make sure that you're not violating the cpplint codestyles. To do that, run the following command:
```bash
yarn check-cpp
```
20 changes: 20 additions & 0 deletions cpp/SpeedChecker.h
@@ -0,0 +1,20 @@
#pragma once

#include "Logger.h"
#include <string>

namespace vision {

class SpeedChecker {
public:
static void checkSpeed(std::string tag, std::function<void()> fun) {
auto start = std::chrono::system_clock::now();
fun();
auto end = std::chrono::system_clock::now();
std::chrono::duration<double> elapsed_seconds = end-start;
tag += " " + std::to_string(elapsed_seconds.count()) + "s";
Logger::log(tag.c_str());
}
};

} // namespace vision
17 changes: 11 additions & 6 deletions docs/docs/guides/ANIMATED.mdx
@@ -17,14 +17,16 @@ import useBaseUrl from '@docusaurus/useBaseUrl';

Often you'd want to animate specific props in the Camera. For example, if you'd want to create a custom zoom gesture, you can smoothly animate the Camera's `zoom` property.

Note: The `<Camera>` component does provide a natively implemented zoom gesture which you can enable with the `enableZoomGesture={true}` prop. This does not require any additional work, but if you want to setup a custom gesture, such as the one in Snapchat or Instagram where you move up your finger while recording, continue reading.
The `<Camera>` component already provides a natively implemented zoom gesture which you can enable with the [`enableZoomGesture`](/docs/api/interfaces/cameraprops.cameraprops-1#enablezoomgesture) prop. This does not require any additional work, but if you want to setup a custom gesture, such as the one in Snapchat or Instagram where you move up your finger while recording, continue reading.

### Installing reanimated
### Animation libraries

The following example uses [react-native-reanimated](https://github.com/software-mansion/react-native-reanimated) (v2) to animate the `zoom` property. Head over to their [Installation guide](https://docs.swmansion.com/react-native-reanimated/docs/installation) to install Reanimated if you haven't already.
While you can use any animation library to animate the `zoom` property (or use no animation library at all) it is recommended to use [react-native-reanimated](https://github.com/software-mansion/react-native-reanimated) (v2) to achieve best performance. Head over to their [Installation guide](https://docs.swmansion.com/react-native-reanimated/docs/installation) to install Reanimated if you haven't already.

### Implementation

The following example implements a button which smoothly zooms to a random value using [react-native-reanimated](https://github.com/software-mansion/react-native-reanimated):

```tsx
import Reanimated, {
  useAnimatedProps,
  // ...
```

@@ -72,13 +74,16 @@ export function App() {

### Explanation

1. The `Camera` is converted to a reanimated Camera using `Reanimated.createAnimatedComponent`
1. The `Camera` was made animatable using `Reanimated.createAnimatedComponent`
2. The `zoom` property is added to the whitelisted native props to make it animatable.
> Note that this might not be needed in the future, see: [reanimated#1409](https://github.com/software-mansion/react-native-reanimated/pull/1409)
3. Using [`useSharedValue`](https://docs.swmansion.com/react-native-reanimated/docs/api/useSharedValue), we're creating a shared value that holds the `zoom` property.
4. Using the [`useAnimatedProps`](https://docs.swmansion.com/react-native-reanimated/docs/api/useAnimatedProps) hook, we apply the shared value to the animated props.
3. Using [`useSharedValue`](https://docs.swmansion.com/react-native-reanimated/docs/api/useSharedValue), we're creating a shared value that holds the value for the `zoom` property.
4. Using the [`useAnimatedProps`](https://docs.swmansion.com/react-native-reanimated/docs/api/useAnimatedProps) hook, we apply the shared value to Camera's `zoom` property.
5. We apply the animated props to the `ReanimatedCamera` component's `animatedProps` property.
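
To see how these five steps fit together, here is a condensed sketch of the pattern. It is an illustrative reconstruction rather than the full example above; the `useCameraDevices` lookup, the spring animation and the inline style are placeholder choices:

```tsx
import * as React from 'react';
import Reanimated, { useAnimatedProps, useSharedValue, withSpring } from 'react-native-reanimated';
import { Camera, useCameraDevices } from 'react-native-vision-camera';

// 1. Make the Camera animatable
const ReanimatedCamera = Reanimated.createAnimatedComponent(Camera);
// 2. Whitelist `zoom` so Reanimated can drive it as a native prop
Reanimated.addWhitelistedNativeProps({ zoom: true });

export function ZoomSketch() {
  const devices = useCameraDevices();
  // 3. Shared value holding the current zoom
  const zoom = useSharedValue(0);
  // 4. Map the shared value onto the Camera's `zoom` prop
  const animatedProps = useAnimatedProps(() => ({ zoom: zoom.value }), [zoom]);

  // Call this from a button press or gesture to animate the zoom smoothly
  const zoomRandomly = () => {
    zoom.value = withSpring(Math.random());
  };

  if (devices.back == null) return null;
  // 5. Pass the animated props to the animated Camera component
  return (
    <ReanimatedCamera
      device={devices.back}
      isActive={true}
      animatedProps={animatedProps}
      style={{ flex: 1 }}
    />
  );
}
```

Because the `zoom` prop is driven through `useAnimatedProps`, the animation runs on the UI thread and stays smooth even if the JS thread is busy.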

### Logarithmic scale

A Camera's `zoom` property is represented in a **logarithmic scale**. That means, increasing from `0` to `0.1` will appear to be a much larger offset than increasing from `0.9` to `1`. If you want to implement a zoom gesture (`<PinchGestureHandler>`, `<PanGestureHandler>`), try to flatten the `zoom` property to a **linear scale** by raising it **exponentially**. (`zoom.value ** 2`)
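
If you drive `zoom` from a gesture, the flattening can happen right before the value is written to the shared value. The following is a minimal sketch; the helper name and the 0 to 1 progress value are assumptions for illustration, not part of the library:

```ts
// Maps a linear gesture progress value (0 = zoomed out, 1 = fully zoomed in)
// onto the Camera's logarithmic `zoom` scale by raising it exponentially.
export function linearToLogarithmicZoom(linearProgress: number): number {
  'worklet'; // marked as a worklet so it can also run on the Reanimated UI thread
  return linearProgress ** 2;
}

// e.g. inside a gesture handler or useAnimatedProps worklet:
//   zoom.value = linearToLogarithmicZoom(gestureProgress);
```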

<br />

2 changes: 1 addition & 1 deletion docs/docs/guides/DEVICES.mdx
@@ -126,7 +126,7 @@
```

:::info
Note: If you don't care about fast resume times you can also fully unmount the `<Camera>` view instead, which will use a lot less memory (RAM).
If you don't care about fast resume times you can also fully unmount the `<Camera>` view instead, which will use less memory (RAM).
:::

<br />
5 changes: 5 additions & 0 deletions docs/docs/guides/ERRORS.mdx
@@ -98,3 +98,8 @@ function App() {
return <Camera ref={camera} {...cameraProps} />
}
```


<br />

#### 🚀 Next section: [Troubleshooting](troubleshooting)