Merge remote-tracking branch 'upstream/main' into broadcast-ipc

ladvoc committed Feb 3, 2025
2 parents 7c4f8ea + fbec343 · commit 3205da2
Showing 18 changed files with 613 additions and 117 deletions.
61 changes: 34 additions & 27 deletions Docs/ios-screen-sharing.md
@@ -82,37 +82,14 @@
In order for the broadcast extension to communicate with your app, both must be members of the same app group:
1. Set `RTCAppGroupIdentifier` in the Info.plist of **both targets** to the group identifier from the previous step.
2. Set `RTCScreenSharingExtension` in the Info.plist of your **primary app target** to the broadcast extension's bundle identifier.
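
If either key is missing, broadcast capture cannot start. A quick runtime sanity check from the app side (a sketch; the keys are the ones set above):

```swift
import Foundation

// Verify the broadcast configuration keys are present in Info.plist.
let groupID = Bundle.main.object(forInfoDictionaryKey: "RTCAppGroupIdentifier") as? String
let extensionID = Bundle.main.object(forInfoDictionaryKey: "RTCScreenSharingExtension") as? String
assert(groupID?.hasPrefix("group.") == true, "RTCAppGroupIdentifier missing or malformed")
assert(extensionID != nil, "RTCScreenSharingExtension missing")
```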

#### 4. Use Broadcast Extension

To use the broadcast extension for screen sharing, create an instance of `ScreenShareCaptureOptions`, setting the `useBroadcastExtension` property to `true`. The following example demonstrates making this the default for a room when connecting:

```swift
let options = RoomOptions(
    defaultScreenShareCaptureOptions: ScreenShareCaptureOptions(
        useBroadcastExtension: true
    ),
    // other options...
)
try await room.connect(url: wsURL, token: token, roomOptions: options)
```

When connecting to a room declaratively using the `RoomScope` view from the SwiftUI components package, use the initializer's optional `roomOptions` parameter to pass the room options object:

```swift
RoomScope(url: wsURL, token: token, roomOptions: options) {
    // your components here
}
```

#### 4. Begin Screen Share

With setup of the broadcast extension complete, broadcast capture will be used by default when enabling screen share:

```swift
try await room.localParticipant.setScreenShare(enabled: true)
```

It is also possible to use the broadcast extension when enabling screen share without making it the default for the room:

```swift
try await room.localParticipant.set(
    source: .screenShareVideo,
    enabled: true,
    captureOptions: ScreenShareCaptureOptions(useBroadcastExtension: true)
)
```

<small>Note: When using broadcast capture, custom capture options must be set as room defaults rather than passed when enabling screen share with `set(source:enabled:captureOptions:publishOptions:)`.</small>
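
For example, a custom frame rate for broadcast capture would be configured as a room default (a sketch; `fps` is shown as an assumed capture option, and its position in the initializer may differ):

```swift
let options = RoomOptions(
    defaultScreenShareCaptureOptions: ScreenShareCaptureOptions(
        fps: 15, // assumed option; configure capture here, not at enable time
        useBroadcastExtension: true
    )
)
try await room.connect(url: wsURL, token: token, roomOptions: options)
```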

### Troubleshooting

@@ -122,3 +99,33 @@
While running your app in a debug session in Xcode, check the debug console for errors. To inspect logs from the broadcast extension, use the Console app:
1. Open the Console app on your Mac.
2. Select your iOS device from the left sidebar and press "Start Streaming."
3. In the search bar, add a filter for messages with a category of "LKSampleHandler."
4. Initiate a screen share in your app and inspect Console for errors.

### Advanced Usage

When using broadcast capture, a broadcast can be initiated externally (for example, from Control Center). By default, when a broadcast begins, the local participant automatically publishes a screen share track. In some cases, however, you may want to handle track publication manually. You can achieve this with `BroadcastManager`:

First, disable automatic track publication:
```swift
BroadcastManager.shared.shouldPublishTrack = false
```
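
For instance, you might opt out once at launch, so automatic publication never races an externally started broadcast (a sketch; `MyApp` and `ContentView` are placeholders):

```swift
import LiveKit
import SwiftUI

@main
struct MyApp: App {
    init() {
        // Opt out of automatic screen share publication before any broadcast begins.
        BroadcastManager.shared.shouldPublishTrack = false
    }

    var body: some Scene {
        WindowGroup { ContentView() }
    }
}
```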

Then, use one of the two methods for detecting changes in the broadcast state:

#### Combine Publisher
```swift
let subscription = BroadcastManager.shared
.isBroadcastingPublisher
.sink { isBroadcasting in
// Manually handle track publication
}
```
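
Note that `sink` returns an `AnyCancellable`; retain it (for example, in a property or a `Set<AnyCancellable>`) for as long as you need updates, since the subscription is cancelled when the value is deallocated.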

#### Delegate
```swift
class MyDelegate: BroadcastManagerDelegate {
    func broadcastManager(didChangeState isBroadcasting: Bool) {
        // Manually handle track publication
    }
}

// Keep a strong reference to the delegate; delegate properties are typically
// weak, so an instance assigned inline would be deallocated immediately.
let delegate = MyDelegate()
BroadcastManager.shared.delegate = delegate
```
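
Both approaches observe the same state change. The Combine publisher composes with reactive pipelines and supports multiple subscribers, while the delegate suits imperative code; note that only one delegate can be set at a time.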
2 changes: 1 addition & 1 deletion Package.swift
@@ -18,7 +18,7 @@ let package = Package(
],
dependencies: [
// LK-Prefixed Dynamic WebRTC XCFramework
.package(url: "https://github.com/livekit/webrtc-xcframework.git", exact: "125.6422.15"),
.package(url: "https://github.com/livekit/webrtc-xcframework.git", exact: "125.6422.16"),
.package(url: "https://github.com/apple/swift-protobuf.git", from: "1.26.0"),
.package(url: "https://github.com/apple/swift-log.git", from: "1.5.4"),
// Only used for DocC generation
2 changes: 1 addition & 1 deletion Package@swift-5.9.swift
@@ -20,7 +20,7 @@ let package = Package(
],
dependencies: [
// LK-Prefixed Dynamic WebRTC XCFramework
.package(url: "https://github.com/livekit/webrtc-xcframework.git", exact: "125.6422.15"),
.package(url: "https://github.com/livekit/webrtc-xcframework.git", exact: "125.6422.16"),
.package(url: "https://github.com/apple/swift-protobuf.git", from: "1.26.0"),
.package(url: "https://github.com/apple/swift-log.git", from: "1.5.4"),
// Only used for DocC generation
28 changes: 14 additions & 14 deletions Sources/LiveKit/Audio/AudioDeviceModuleDelegateAdapter.swift
@@ -40,49 +40,49 @@ class AudioDeviceModuleDelegateAdapter: NSObject, LKRTCAudioDeviceModuleDelegate

func audioDeviceModule(_: LKRTCAudioDeviceModule, didCreateEngine engine: AVAudioEngine) {
guard let audioManager else { return }
let entryPoint = audioManager._state.engineObservers.buildChain()
let entryPoint = audioManager.buildEngineObserverChain()
entryPoint?.engineDidCreate(engine)
}

func audioDeviceModule(_: LKRTCAudioDeviceModule, willEnableEngine engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
guard let audioManager else { return }
let entryPoint = audioManager._state.engineObservers.buildChain()
let entryPoint = audioManager.buildEngineObserverChain()
entryPoint?.engineWillEnable(engine, isPlayoutEnabled: isPlayoutEnabled, isRecordingEnabled: isRecordingEnabled)
}

func audioDeviceModule(_: LKRTCAudioDeviceModule, willStartEngine engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
guard let audioManager else { return }
let entryPoint = audioManager._state.engineObservers.buildChain()
let entryPoint = audioManager.buildEngineObserverChain()
entryPoint?.engineWillStart(engine, isPlayoutEnabled: isPlayoutEnabled, isRecordingEnabled: isRecordingEnabled)
}

func audioDeviceModule(_: LKRTCAudioDeviceModule, didStopEngine engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
guard let audioManager else { return }
let entryPoint = audioManager._state.engineObservers.buildChain()
let entryPoint = audioManager.buildEngineObserverChain()
entryPoint?.engineDidStop(engine, isPlayoutEnabled: isPlayoutEnabled, isRecordingEnabled: isRecordingEnabled)
}

func audioDeviceModule(_: LKRTCAudioDeviceModule, didDisableEngine engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
guard let audioManager else { return }
let entryPoint = audioManager._state.engineObservers.buildChain()
let entryPoint = audioManager.buildEngineObserverChain()
entryPoint?.engineDidDisable(engine, isPlayoutEnabled: isPlayoutEnabled, isRecordingEnabled: isRecordingEnabled)
}

func audioDeviceModule(_: LKRTCAudioDeviceModule, willReleaseEngine engine: AVAudioEngine) {
guard let audioManager else { return }
let entryPoint = audioManager._state.engineObservers.buildChain()
let entryPoint = audioManager.buildEngineObserverChain()
entryPoint?.engineWillRelease(engine)
}

func audioDeviceModule(_: LKRTCAudioDeviceModule, engine: AVAudioEngine, configureInputFromSource src: AVAudioNode?, toDestination dst: AVAudioNode, format: AVAudioFormat) -> Bool {
guard let audioManager else { return false }
let entryPoint = audioManager._state.engineObservers.buildChain()
return entryPoint?.engineWillConnectInput(engine, src: src, dst: dst, format: format) ?? false
func audioDeviceModule(_: LKRTCAudioDeviceModule, engine: AVAudioEngine, configureInputFromSource src: AVAudioNode?, toDestination dst: AVAudioNode, format: AVAudioFormat, context: [AnyHashable: Any]) {
guard let audioManager else { return }
let entryPoint = audioManager.buildEngineObserverChain()
entryPoint?.engineWillConnectInput(engine, src: src, dst: dst, format: format, context: context)
}

func audioDeviceModule(_: LKRTCAudioDeviceModule, engine: AVAudioEngine, configureOutputFromSource src: AVAudioNode, toDestination dst: AVAudioNode?, format: AVAudioFormat) -> Bool {
guard let audioManager else { return false }
let entryPoint = audioManager._state.engineObservers.buildChain()
return entryPoint?.engineWillConnectOutput(engine, src: src, dst: dst, format: format) ?? false
func audioDeviceModule(_: LKRTCAudioDeviceModule, engine: AVAudioEngine, configureOutputFromSource src: AVAudioNode, toDestination dst: AVAudioNode?, format: AVAudioFormat, context: [AnyHashable: Any]) {
guard let audioManager else { return }
let entryPoint = audioManager.buildEngineObserverChain()
entryPoint?.engineWillConnectOutput(engine, src: src, dst: dst, format: format, context: context)
}
}
59 changes: 39 additions & 20 deletions Sources/LiveKit/Audio/AudioEngineObserver.swift
@@ -16,9 +16,18 @@

import AVFAudio

#if swift(>=5.9)
internal import LiveKitWebRTC
#else
@_implementationOnly import LiveKitWebRTC
#endif

public let AudioEngineInputMixerNodeKey = kRTCAudioEngineInputMixerNodeKey

/// Do not retain the engine object.
public protocol AudioEngineObserver: NextInvokable, Sendable {
func setNext(_ handler: any AudioEngineObserver)
associatedtype Next = any AudioEngineObserver
var next: (any AudioEngineObserver)? { get set }

func engineDidCreate(_ engine: AVAudioEngine)
func engineWillEnable(_ engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool)
@@ -30,34 +39,44 @@ public protocol AudioEngineObserver: NextInvokable, Sendable {
/// Provide custom implementation for internal AVAudioEngine's output configuration.
/// Buffers flow from `src` to `dst`. Preferred format to connect node is provided as `format`.
/// Return true if custom implementation is provided, otherwise default implementation will be used.
func engineWillConnectOutput(_ engine: AVAudioEngine, src: AVAudioNode, dst: AVAudioNode?, format: AVAudioFormat) -> Bool
func engineWillConnectOutput(_ engine: AVAudioEngine, src: AVAudioNode, dst: AVAudioNode?, format: AVAudioFormat, context: [AnyHashable: Any])
/// Provide custom implementation for internal AVAudioEngine's input configuration.
/// Buffers flow from `src` to `dst`. Preferred format to connect node is provided as `format`.
/// Return true if custom implementation is provided, otherwise default implementation will be used.
func engineWillConnectInput(_ engine: AVAudioEngine, src: AVAudioNode?, dst: AVAudioNode, format: AVAudioFormat) -> Bool
func engineWillConnectInput(_ engine: AVAudioEngine, src: AVAudioNode?, dst: AVAudioNode, format: AVAudioFormat, context: [AnyHashable: Any])
}

/// Default implementation to make it optional.
public extension AudioEngineObserver {
    func engineDidCreate(_: AVAudioEngine) {}
    func engineWillEnable(_: AVAudioEngine, isPlayoutEnabled _: Bool, isRecordingEnabled _: Bool) {}
    func engineWillStart(_: AVAudioEngine, isPlayoutEnabled _: Bool, isRecordingEnabled _: Bool) {}
    func engineDidStop(_: AVAudioEngine, isPlayoutEnabled _: Bool, isRecordingEnabled _: Bool) {}
    func engineDidDisable(_: AVAudioEngine, isPlayoutEnabled _: Bool, isRecordingEnabled _: Bool) {}
    func engineWillRelease(_: AVAudioEngine) {}

    func engineWillConnectOutput(_: AVAudioEngine, src _: AVAudioNode, dst _: AVAudioNode?, format _: AVAudioFormat) -> Bool { false }
    func engineWillConnectInput(_: AVAudioEngine, src _: AVAudioNode?, dst _: AVAudioNode, format _: AVAudioFormat) -> Bool { false }

    func engineDidCreate(_ engine: AVAudioEngine) {
        next?.engineDidCreate(engine)
    }

    func engineWillEnable(_ engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
        next?.engineWillEnable(engine, isPlayoutEnabled: isPlayoutEnabled, isRecordingEnabled: isRecordingEnabled)
    }

    func engineWillStart(_ engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
        next?.engineWillStart(engine, isPlayoutEnabled: isPlayoutEnabled, isRecordingEnabled: isRecordingEnabled)
    }

    func engineDidStop(_ engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
        next?.engineDidStop(engine, isPlayoutEnabled: isPlayoutEnabled, isRecordingEnabled: isRecordingEnabled)
    }

    func engineDidDisable(_ engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
        next?.engineDidDisable(engine, isPlayoutEnabled: isPlayoutEnabled, isRecordingEnabled: isRecordingEnabled)
    }

    func engineWillRelease(_ engine: AVAudioEngine) {
        next?.engineWillRelease(engine)
    }

    func engineWillConnectOutput(_ engine: AVAudioEngine, src: AVAudioNode, dst: AVAudioNode?, format: AVAudioFormat, context: [AnyHashable: Any]) {
        next?.engineWillConnectOutput(engine, src: src, dst: dst, format: format, context: context)
    }

    func engineWillConnectInput(_ engine: AVAudioEngine, src: AVAudioNode?, dst: AVAudioNode, format: AVAudioFormat, context: [AnyHashable: Any]) {
        next?.engineWillConnectInput(engine, src: src, dst: dst, format: format, context: context)
    }
}

extension [any AudioEngineObserver] {
    func buildChain() -> Element? {
        guard let first else { return nil }

        for i in 0 ..< count - 1 {
            self[i].setNext(self[i + 1])
        }

        return first
    }
}
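
// Example (illustration only): a conforming observer joins the chain by
// storing `next` and forwarding each callback it handles.
//
//     final class LoggingEngineObserver: AudioEngineObserver, @unchecked Sendable {
//         var next: (any AudioEngineObserver)?
//
//         func engineDidCreate(_ engine: AVAudioEngine) {
//             print("AVAudioEngine created")
//             next?.engineDidCreate(engine) // keep the chain intact
//         }
//     }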
13 changes: 7 additions & 6 deletions Sources/LiveKit/Audio/DefaultAudioSessionObserver.swift
@@ -24,7 +24,7 @@ internal import LiveKitWebRTC
@_implementationOnly import LiveKitWebRTC
#endif

public final class DefaultAudioSessionObserver: AudioEngineObserver, Loggable {
public class DefaultAudioSessionObserver: AudioEngineObserver, Loggable, @unchecked Sendable {
struct State {
var isSessionActive = false
var next: (any AudioEngineObserver)?
@@ -36,7 +36,12 @@

let _state = StateSync(State())

public var next: (any AudioEngineObserver)? {
    get { _state.next }
    set { _state.mutate { $0.next = newValue } }
}

init() {
public init() {
// Backward compatibility with `customConfigureAudioSessionFunc`.
_state.onDidMutate = { new_, old_ in
if let config_func = AudioManager.shared._state.customConfigureFunc,
@@ -51,10 +56,6 @@
}
}
}

public func setNext(_ nextHandler: any AudioEngineObserver) {
_state.mutate { $0.next = nextHandler }
}

public func engineWillEnable(_ engine: AVAudioEngine, isPlayoutEnabled: Bool, isRecordingEnabled: Bool) {
if AudioManager.shared._state.customConfigureFunc == nil {
log("Configuring audio session...")
53 changes: 53 additions & 0 deletions Sources/LiveKit/Broadcast/BroadcastBundleInfo.swift
@@ -0,0 +1,53 @@
/*
* Copyright 2025 LiveKit
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

#if os(iOS)

import Foundation

final class BroadcastBundleInfo {

/// Identifier of the app group shared by the primary app and broadcast extension.
@BundleInfo("RTCAppGroupIdentifier")
static var groupIdentifier: String?

/// Bundle identifier of the broadcast extension.
@BundleInfo("RTCScreenSharingExtension")
static var screenSharingExtension: String?

/// Path to the socket file used for interprocess communication.
static var socketPath: SocketPath? {
guard let groupIdentifier else { return nil }
return Self.socketPath(for: groupIdentifier)
}

/// Whether or not a broadcast extension has been configured.
static var hasExtension: Bool {
socketPath != nil && screenSharingExtension != nil
}

private static let socketFileDescriptor = "rtc_SSFD"

private static func socketPath(for groupIdentifier: String) -> SocketPath? {
guard let sharedContainer = FileManager.default
.containerURL(forSecurityApplicationGroupIdentifier: groupIdentifier)
else { return nil }
let path = sharedContainer.appendingPathComponent(Self.socketFileDescriptor).path
return SocketPath(path)
}
}
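
// Usage sketch (illustration only): callers can gate broadcast support on
// this configuration, e.g.:
//
//     if BroadcastBundleInfo.hasExtension, let socketPath = BroadcastBundleInfo.socketPath {
//         // Both Info.plist keys are set; IPC can connect at `socketPath`.
//     }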

#endif
