Squashed commit of the following:
commit aa0a21d
Author: Jacob Gelman <3182119+ladvoc@users.noreply.github.com>
Date:   Thu Feb 13 06:28:21 2025 -0800

    Simplify broadcast extension setup with standard format for identifiers (livekit#573)

    When configuring a broadcast extension, manually setting the Info.plist keys
    `RTCAppGroupIdentifier` and `RTCScreenSharingExtension` is no longer
    required when using the standard identifier format:
    - App group: `group.<main-app-bundle-id>`
    - Broadcast extension: `<main-app-bundle-id>.broadcast`
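
    A minimal sketch of how these defaults can be derived from the main app's
    bundle identifier (the helper and example bundle ID are illustrative, not
    the SDK's exact implementation):

    ```swift
    // Derive the standard identifiers described above from a bundle ID.
    func defaultBroadcastIdentifiers(for mainAppBundleId: String) -> (appGroup: String, broadcastExtension: String) {
        (appGroup: "group.\(mainAppBundleId)",
         broadcastExtension: "\(mainAppBundleId).broadcast")
    }

    // defaultBroadcastIdentifiers(for: "com.example.MyApp")
    // -> ("group.com.example.MyApp", "com.example.MyApp.broadcast")
    ```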

    ---------

    Co-authored-by: Hiroshi Horie <548776+hiroshihorie@users.noreply.github.com>

commit c3ee701
Author: Jacob Gelman <3182119+ladvoc@users.noreply.github.com>
Date:   Thu Feb 13 06:05:00 2025 -0800

    Deprecate public broadcast picker extension (livekit#586)

    The public `show` method defined as an extension on
    `RPSystemBroadcastPickerView` has been deprecated in favor of
    `BroadcastManager.shared.requestActivation()`.
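
    A minimal usage sketch of the replacement API (the wrapping function name
    is illustrative):

    ```swift
    import LiveKit

    // Show the system broadcast picker via BroadcastManager instead of the
    // deprecated RPSystemBroadcastPickerView extension method.
    func presentBroadcastPicker() {
        BroadcastManager.shared.requestActivation()
    }
    ```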

    ---------

    Co-authored-by: Hiroshi Horie <548776+hiroshihorie@users.noreply.github.com>

commit 0da6660
Author: Jacob Gelman <3182119+ladvoc@users.noreply.github.com>
Date:   Thu Feb 13 05:55:25 2025 -0800

    Release automation (livekit#579)

    - Add version and platform compatibility badges from [Swift Package
    Index](https://swiftpackageindex.com/) to README
      - Automatically updated on each release
    - Add [nanpa](https://github.com/nbsp/nanpa) configuration
    - Custom script bumps version across repo (currently Podspec, README,
    and LiveKitSDK class)
      - GitHub publish workflow (based on workflow from livekit/rust-sdks)
    - Create workflow to push new releases to CocoaPods when a release is
    published on GitHub

    I have also added changeset files to my currently open PRs (livekit#565, livekit#576,
    and livekit#573) that can be used to test this configuration.

    ---------

    Co-authored-by: Hiroshi Horie <548776+hiroshihorie@users.noreply.github.com>

commit 5b031c8
Author: Hiroshi Horie <548776+hiroshihorie@users.noreply.github.com>
Date:   Thu Feb 13 22:36:35 2025 +0900

    Update Podspec (livekit#587)

    Fixes: livekit#566

commit 485e76d
Author: Hiroshi Horie <548776+hiroshihorie@users.noreply.github.com>
Date:   Wed Feb 12 14:36:48 2025 +0900

    macOS screen share audio (livekit#561)
hiroshihorie committed Feb 13, 2025
1 parent 4a44ee9 commit 3b4e327
Showing 20 changed files with 380 additions and 59 deletions.
17 changes: 17 additions & 0 deletions .github/workflows/bump.yaml
@@ -0,0 +1,17 @@
name: Bump version
on:
  workflow_dispatch:
env:
  PACKAGE_NAME: client-sdk-swift
jobs:
  bump:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    steps:
      - uses: actions/checkout@v3
        with:
          ssh-key: ${{ secrets.NANPA_KEY }}
      - uses: nbsp/ilo@v1
        with:
          packages: ${{ env.PACKAGE_NAME }}
20 changes: 20 additions & 0 deletions .github/workflows/cocoapods.yaml
@@ -0,0 +1,20 @@
name: Push to Cocoapods
on:
  workflow_dispatch:
  release:
    types: [published]
env:
  PODSPEC_FILE: LiveKitClient.podspec
jobs:
  build:
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v3
      - name: Install Cocoapods
        run: gem install cocoapods
      - name: Validate Podspec
        run: pod lib lint --allow-warnings
      - name: Publish to CocoaPods
        run: pod trunk push ${{ env.PODSPEC_FILE }} --allow-warnings --verbose
        env:
          COCOAPODS_TRUNK_TOKEN: ${{ secrets.COCOAPODS_TRUNK_TOKEN }}
4 changes: 4 additions & 0 deletions .nanpa/.keep
@@ -0,0 +1,4 @@
Add changeset files in this directory.

See nanpa documentation for more info:
https://github.com/nbsp/nanpa/blob/trunk/doc/nanpa-changeset.5.scd
1 change: 1 addition & 0 deletions .nanpa/broadcast-identifier-defaults.kdl
@@ -0,0 +1 @@
patch type="change" "Simplify broadcast extension setup with standard format for identifiers"
1 change: 1 addition & 0 deletions .nanpa/make-extension-private.kdl
@@ -0,0 +1 @@
patch type="deprecated" "Deprecated public method to show broadcast picker"
3 changes: 3 additions & 0 deletions .nanparc
@@ -0,0 +1,3 @@
version 2.1.1
name client-sdk-swift
custom ./scripts/replace_version.sh
28 changes: 16 additions & 12 deletions Docs/ios-screen-sharing.md
@@ -44,15 +44,17 @@ flowchart LR

To use the Broadcast Capture mode, follow these steps to add a Broadcast Upload Extension target and associated configuration to your project. You can also refer to the [example app](https://github.com/livekit-examples/swift-example), which demonstrates this configuration.

#### 1. Add Broadcast Upload Extension Target

<img src="Resources/new-target-options.png" width="500" />

1. In Xcode, choose "File" > "New" > "Target"
2. From the template chooser, select "Broadcast Upload Extension"
3. Name the extension (e.g. "BroadcastExtension"). Take note of the extension's bundle identifier, as it will be needed later.
4. Replace the default content of `SampleHandler.swift` in the new target with the following:
3. Name the extension (e.g. "BroadcastExtension").
4. Press "Finish"
5. From the "Signing & Capabilities" tab of the new target, change the bundle identifier to be the same as your main app with `.broadcast` added to the end. To use a custom identifier, see *[Custom Identifiers](#custom-identifiers)* below.
6. Replace the default content of `SampleHandler.swift` in the new target with the following:

```swift
import LiveKit
@@ -75,14 +77,9 @@ In order for the broadcast extension to communicate with your app, they must be
2. Select the "Signing & Capabilities" tab and press the "+ Capability" button.
3. Add the "App Groups" capability.
4. Press "+" to add a new app group.
5. Enter an app group identifier in the format `group.<domain>.<group_name>`. Be sure to use the same identifier for both targets.

#### 3. Add Properties to Info.plist

1. Set `RTCAppGroupIdentifier` in the Info.plist of **both targets** to the group identifier from the previous step.
2. Set `RTCScreenSharingExtension` in the Info.plist of your **primary app target** to the broadcast extension's bundle identifier.
5. Add the target to the group `group.<main-app-bundle-id>`. To use a custom identifier, see *[Custom Identifiers](#custom-identifiers)* below.

#### 4. Begin Screen Share
#### 3. Begin Screen Share

With setup of the broadcast extension complete, broadcast capture will be used by default when enabling screen share:
```swift
@@ -102,6 +99,8 @@ While running your app in a debug session in Xcode, check the debug console for

### Advanced Usage

#### Manual Track Publication

When using broadcast capture, a broadcast can be initiated externally (for example, via control center). By default, when a broadcast begins, the local participant automatically publishes a screen share track. In some cases, however, you may want to handle track publication manually. You can achieve this by using `BroadcastManager`:

First, disable automatic track publication:
@@ -111,7 +110,7 @@ BroadcastManager.shared.shouldPublishTrack = false

Then, use one of the two methods for detecting changes in the broadcast state:

#### Combine Publisher
##### Combine Publisher
```swift
let subscription = BroadcastManager.shared
    .isBroadcastingPublisher
@@ -120,7 +119,7 @@ let subscription = BroadcastManager.shared
}
```

#### Delegate
##### Delegate
```swift
class MyDelegate: BroadcastManagerDelegate {
    func broadcastManager(didChangeState isBroadcasting: Bool) {
@@ -129,3 +128,8 @@ class MyDelegate: BroadcastManagerDelegate {
}
BroadcastManager.shared.delegate = MyDelegate()
```

#### Custom Identifiers

By default, the app group identifier is expected to be `group.<main-app-bundle-id>`, and the broadcast extension's bundle identifier is expected to be `<main-app-bundle-id>.broadcast`.
To override these values, set `RTCAppGroupIdentifier` in the Info.plist of **both targets** to your custom app group identifier, and set `RTCScreenSharingExtension` in the Info.plist of your **main app target** to the broadcast extension's bundle identifier.
4 changes: 3 additions & 1 deletion LiveKitClient.podspec
@@ -8,13 +8,15 @@ Pod::Spec.new do |spec|

  spec.ios.deployment_target = "13.0"
  spec.osx.deployment_target = "10.15"
  spec.tvos.deployment_target = "17.0"
  spec.visionos.deployment_target = "1.0"

  spec.swift_versions = ["5.7"]
  spec.source = {:git => "https://github.com/livekit/client-sdk-swift.git", :tag => "2.1.1"}

  spec.source_files = "Sources/**/*"

  spec.dependency("LiveKitWebRTC", "= 125.6422.11")
  spec.dependency("LiveKitWebRTC", "= 125.6422.18")
  spec.dependency("SwiftProtobuf")
  spec.dependency("Logging")

3 changes: 3 additions & 0 deletions README.md
@@ -14,6 +14,9 @@
Use this SDK to add realtime video, audio and data features to your Swift app. By connecting to <a href="https://livekit.io/">LiveKit</a> Cloud or a self-hosted server, you can quickly build applications such as multi-modal AI, live streaming, or video calls with just a few lines of code.
<!--END_DESCRIPTION-->

[![](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Flivekit%2Fclient-sdk-swift%2Fbadge%3Ftype%3Dswift-versions)](https://swiftpackageindex.com/livekit/client-sdk-swift)
[![](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Flivekit%2Fclient-sdk-swift%2Fbadge%3Ftype%3Dplatforms)](https://swiftpackageindex.com/livekit/client-sdk-swift)

## Docs & Example app

> [!NOTE]
156 changes: 156 additions & 0 deletions Sources/LiveKit/Audio/DefaultMixerAudioObserver.swift
@@ -0,0 +1,156 @@
/*
 * Copyright 2025 LiveKit
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

@preconcurrency import AVFoundation

#if swift(>=5.9)
internal import LiveKitWebRTC
#else
@_implementationOnly import LiveKitWebRTC
#endif

public final class DefaultMixerAudioObserver: AudioEngineObserver, Loggable {
    public var next: (any AudioEngineObserver)? {
        get { _state.next }
        set { _state.mutate { $0.next = newValue } }
    }

    /// Adjust the volume of captured app audio. Range is 0.0 ~ 1.0.
    public var appVolume: Float {
        get { _state.read { $0.appMixerNode.outputVolume } }
        set { _state.mutate { $0.appMixerNode.outputVolume = newValue } }
    }

    /// Adjust the volume of microphone audio. Range is 0.0 ~ 1.0.
    public var micVolume: Float {
        get { _state.read { $0.micMixerNode.outputVolume } }
        set { _state.mutate { $0.micMixerNode.outputVolume = newValue } }
    }

    // MARK: - Internal

    var appAudioNode: AVAudioPlayerNode {
        _state.read { $0.appNode }
    }

    var micAudioNode: AVAudioPlayerNode {
        _state.read { $0.micNode }
    }

    var isConnected: Bool {
        _state.read { $0.isConnected }
    }

    struct State {
        var next: (any AudioEngineObserver)?

        // AppAudio
        public let appNode = AVAudioPlayerNode()
        public let appMixerNode = AVAudioMixerNode()

        // Not connected for device rendering mode.
        public let micNode = AVAudioPlayerNode()
        public let micMixerNode = AVAudioMixerNode()

        public var isConnected: Bool = false
    }

    let _state = StateSync(State())

    public init() {}

    public func setNext(_ handler: any AudioEngineObserver) {
        next = handler
    }

    public func engineDidCreate(_ engine: AVAudioEngine) {
        let (appNode, appMixerNode, micNode, micMixerNode) = _state.read {
            ($0.appNode, $0.appMixerNode, $0.micNode, $0.micMixerNode)
        }

        engine.attach(appNode)
        engine.attach(appMixerNode)
        engine.attach(micNode)
        engine.attach(micMixerNode)

        // Invoke next
        next?.engineDidCreate(engine)
    }

    public func engineWillRelease(_ engine: AVAudioEngine) {
        // Invoke next
        next?.engineWillRelease(engine)

        let (appNode, appMixerNode, micNode, micMixerNode) = _state.read {
            ($0.appNode, $0.appMixerNode, $0.micNode, $0.micMixerNode)
        }

        engine.detach(appNode)
        engine.detach(appMixerNode)
        engine.detach(micNode)
        engine.detach(micMixerNode)
    }

    public func engineWillConnectInput(_ engine: AVAudioEngine, src: AVAudioNode?, dst: AVAudioNode, format: AVAudioFormat, context: [AnyHashable: Any]) {
        // Get the main mixer
        guard let mainMixerNode = context[kRTCAudioEngineInputMixerNodeKey] as? AVAudioMixerNode else {
            // If failed to get main mixer, call next and return.
            next?.engineWillConnectInput(engine, src: src, dst: dst, format: format, context: context)
            return
        }

        // Read nodes from state lock.
        let (appNode, appMixerNode, micNode, micMixerNode) = _state.read {
            ($0.appNode, $0.appMixerNode, $0.micNode, $0.micMixerNode)
        }

        // TODO: Investigate if possible to get this format prior to starting screen capture.
        // <AVAudioFormat 0x600003055180: 2 ch, 48000 Hz, Float32, deinterleaved>
        let appAudioNodeFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                               sampleRate: format.sampleRate, // Assume same sample rate
                                               channels: 2,
                                               interleaved: false)

        log("Connecting app -> appMixer -> mainMixer")
        // appAudio -> appAudioMixer -> mainMixer
        engine.connect(appNode, to: appMixerNode, format: appAudioNodeFormat)
        engine.connect(appMixerNode, to: mainMixerNode, format: format)

        // src is not null if device rendering mode.
        if let src {
            log("Connecting src (device) to micMixer -> mainMixer")
            // mic (device) -> micMixer -> mainMixer
            engine.connect(src, to: micMixerNode, format: format)
        }

        // TODO: Investigate if possible to get this format prior to starting screen capture.
        let micNodeFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                          sampleRate: format.sampleRate, // Assume same sample rate
                                          channels: 1, // Mono
                                          interleaved: false)

        log("Connecting micAudio (player) to micMixer -> mainMixer")
        // mic (player) -> micMixer -> mainMixer
        engine.connect(micNode, to: micMixerNode, format: micNodeFormat)
        // Always connect micMixer to mainMixer
        engine.connect(micMixerNode, to: mainMixerNode, format: format)

        _state.mutate { $0.isConnected = true }

        // Invoke next
        next?.engineWillConnectInput(engine, src: src, dst: dst, format: format, context: context)
    }
}
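
A brief usage sketch for the observer's public volume controls (how the observer is registered with the SDK's audio engine is not shown in this diff and is assumed to happen elsewhere):

```swift
import LiveKit

// Adjust the relative levels of captured app audio and microphone audio.
let mixer = DefaultMixerAudioObserver()
// ... register `mixer` with the audio engine (registration not shown in this diff) ...
mixer.appVolume = 0.8 // Captured app audio at 80% (range 0.0 ~ 1.0)
mixer.micVolume = 1.0 // Microphone audio at full volume
```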
32 changes: 27 additions & 5 deletions Sources/LiveKit/Broadcast/BroadcastBundleInfo.swift
@@ -20,12 +20,19 @@ import Foundation

enum BroadcastBundleInfo {
    /// Identifier of the app group shared by the primary app and broadcast extension.
    @BundleInfo("RTCAppGroupIdentifier")
    static var groupIdentifier: String?

    static var groupIdentifier: String? {
        if let override = groupIdentifierOverride { return override }
        guard let bundleIdentifier = Bundle.main.bundleIdentifier else { return nil }
        let appBundleIdentifier = bundleIdentifier.dropSuffix(".\(extensionSuffix)") ?? bundleIdentifier
        return "group.\(appBundleIdentifier)"
    }

    /// Bundle identifier of the broadcast extension.
    @BundleInfo("RTCScreenSharingExtension")
    static var screenSharingExtension: String?
    static var screenSharingExtension: String? {
        if let override = screenSharingExtensionOverride { return override }
        guard let bundleIdentifier = Bundle.main.bundleIdentifier else { return nil }
        return "\(bundleIdentifier).\(extensionSuffix)"
    }

    /// Path to the socket file used for interprocess communication.
    static var socketPath: SocketPath? {
@@ -37,7 +44,14 @@ enum BroadcastBundleInfo {
    static var hasExtension: Bool {
        socketPath != nil && screenSharingExtension != nil
    }

    @BundleInfo("RTCAppGroupIdentifier")
    private static var groupIdentifierOverride: String?

    @BundleInfo("RTCScreenSharingExtension")
    private static var screenSharingExtensionOverride: String?

    private static let extensionSuffix = "broadcast"
    private static let socketFileDescriptor = "rtc_SSFD"

    private static func socketPath(for groupIdentifier: String) -> SocketPath? {
@@ -49,4 +63,12 @@
    }
}

private extension String {
    func dropSuffix(_ suffix: String) -> Self? {
        guard hasSuffix(suffix) else { return nil }
        let trailingIndex = index(endIndex, offsetBy: -suffix.count)
        return String(self[..<trailingIndex])
    }
}

#endif
18 changes: 15 additions & 3 deletions Sources/LiveKit/Broadcast/BroadcastManager.swift
@@ -70,9 +70,8 @@ public final class BroadcastManager: Sendable {
    ///
    public func requestActivation() {
        Task {
            await RPSystemBroadcastPickerView.show(
                for: BroadcastBundleInfo.screenSharingExtension,
                showsMicrophoneButton: false
            await RPSystemBroadcastPickerView.showPicker(
                for: BroadcastBundleInfo.screenSharingExtension
            )
        }
    }
@@ -116,4 +115,17 @@ public protocol BroadcastManagerDelegate {
    func broadcastManager(didChangeState isBroadcasting: Bool)
}

private extension RPSystemBroadcastPickerView {
    /// Convenience function to show broadcast picker.
    static func showPicker(for preferredExtension: String?) {
        let view = RPSystemBroadcastPickerView()
        view.preferredExtension = preferredExtension
        view.showsMicrophoneButton = false

        let selector = NSSelectorFromString("buttonPressed:")
        guard view.responds(to: selector) else { return }
        view.perform(selector, with: nil)
    }
}

#endif
