Android Kotlin SDK for LiveKit

Use this SDK to add realtime video, audio and data features to your Android/Kotlin app. By connecting to LiveKit Cloud or a self-hosted server, you can quickly build applications such as multi-modal AI, live streaming, or video calls with just a few lines of code.

Docs

Docs and guides at https://docs.livekit.io.

API reference can be found at https://docs.livekit.io/client-sdk-android/index.html.

Note

This is v2 of the Android SDK. When migrating from v1.x to v2.x you might encounter a small set of breaking changes. Read the migration guide for a detailed overview of what has changed.

Installation

LiveKit for Android is available as a Maven package.

...
dependencies {
  def livekit_version = "2.10.0"

  implementation "io.livekit:livekit-android:$livekit_version"
  // CameraX support with pinch to zoom, torch control, etc.
  implementation "io.livekit:livekit-android-camerax:$livekit_version"

  // Snapshots of the latest development version are available at:
  // implementation "io.livekit:livekit-android:2.10.1-SNAPSHOT"
}

Compose-based apps should check out our Android Components SDK for composables support.

You'll also need JitPack as one of your repositories. In your settings.gradle file:

dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
        //...
        maven { url 'https://jitpack.io' }

        // For SNAPSHOT access
        // maven { url 'https://s01.oss.sonatype.org/content/repositories/snapshots/' }
    }
}

Usage

Permissions

LiveKit relies on the RECORD_AUDIO and CAMERA permissions to use the microphone and camera. These permissions must be requested at runtime; reference the sample app for an example, or see the sketch below.
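
As a minimal sketch using the AndroidX Activity Result API (names like permissionLauncher are illustrative; the launcher must be registered before the activity is started, ideally as an instance val):

val permissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { grants ->
    // Each entry maps a permission to whether it was granted.
    if (grants.values.all { it }) {
        // Safe to enable the camera and microphone.
    }
}

// Launch the system permission dialogs.
permissionLauncher.launch(
    arrayOf(Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA)
)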

Publishing camera and microphone

lifecycleScope.launch { // these are suspend functions
    room.localParticipant.setCameraEnabled(true)
    room.localParticipant.setMicrophoneEnabled(true)
}

Sharing screen

// create an intent launcher for screen capture
// this *must* be registered prior to onCreate(), ideally as an instance val
val screenCaptureIntentLauncher = registerForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    val resultCode = result.resultCode
    val data = result.data
    if (resultCode != Activity.RESULT_OK || data == null) {
        return@registerForActivityResult
    }
    lifecycleScope.launch {
        room.localParticipant.setScreenShareEnabled(true, data)
    }
}

// when it's time to enable the screen share, perform the following
val mediaProjectionManager =
    getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
screenCaptureIntentLauncher.launch(mediaProjectionManager.createScreenCaptureIntent())
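
To stop sharing, call the same method with false (presumably the projection intent is only needed when enabling):

lifecycleScope.launch {
    room.localParticipant.setScreenShareEnabled(false)
}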

Rendering subscribed tracks

LiveKit uses SurfaceViewRenderer to render video tracks. A TextureView implementation is also provided through TextureViewRenderer. Subscribed audio tracks are automatically played.

class MainActivity : AppCompatActivity() {

    lateinit var room: Room

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        setContentView(R.layout.activity_main)

        // Create Room object.
        room = LiveKit.create(applicationContext)

        // Setup the video renderer
        room.initVideoRenderer(findViewById<SurfaceViewRenderer>(R.id.renderer))

        connectToRoom()
    }

    private fun connectToRoom() {

        val url = "wss://your_host"
        val token = "your_token"

        lifecycleScope.launch {

            // Setup event handling.
            launch {
                room.events.collect { event ->
                    when (event) {
                        is RoomEvent.TrackSubscribed -> onTrackSubscribed(event)
                        else -> {}
                    }
                }
            }

            // Connect to server.
            room.connect(
                url,
                token,
            )

            // Publish the local participant's audio and video.
            val localParticipant = room.localParticipant
            localParticipant.setMicrophoneEnabled(true)
            localParticipant.setCameraEnabled(true)
        }
    }

    private fun onTrackSubscribed(event: RoomEvent.TrackSubscribed) {
        val track = event.track
        if (track is VideoTrack) {
            attachVideo(track)
        }
    }

    private fun attachVideo(videoTrack: VideoTrack) {
        videoTrack.addRenderer(findViewById<SurfaceViewRenderer>(R.id.renderer))
        findViewById<View>(R.id.progress).visibility = View.GONE
    }
}

See the basic sample app for the full implementation.
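
When you're finished with a room (for example, when the activity is destroyed), disconnect to stop publishing and release the connection. A minimal sketch:

override fun onDestroy() {
    room.disconnect()
    super.onDestroy()
}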

Audio modes

By default, the audio is configured for two-way communications.

If you are building a livestreaming or media-playback-focused app, you can use the preset MediaAudioType when creating the Room object for better audio quality.

val room = LiveKit.create(
    appContext = application,
    overrides = LiveKitOverrides(
        audioOptions = AudioOptions(
            audioOutputType = AudioType.MediaAudioType()
        )
    )
)

Note: with MediaAudioType, audio routing is handled automatically by the system and cannot be manually controlled.

For more control over the specific audio attributes and modes, a CustomAudioType can be passed instead, as sketched below.
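
A sketch of what that might look like; the constructor parameters shown here (audio mode, attributes, and stream type) are assumptions, so verify them against AudioType.CustomAudioType in the API reference before relying on them:

val room = LiveKit.create(
    appContext = application,
    overrides = LiveKitOverrides(
        audioOptions = AudioOptions(
            audioOutputType = AudioType.CustomAudioType(
                // Assumed parameters; check the API reference for your SDK version.
                audioMode = AudioManager.MODE_NORMAL,
                audioAttributes = AudioAttributes.Builder()
                    .setUsage(AudioAttributes.USAGE_MEDIA)
                    .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                    .build(),
                audioStreamType = AudioManager.STREAM_MUSIC,
            )
        )
    )
)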

@FlowObservable

Properties marked with @FlowObservable can be accessed as a Kotlin Flow to observe changes directly:

coroutineScope.launch {
    room::activeSpeakers.flow.collectLatest { speakersList ->
        /*...*/
    }
}
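
The same pattern applies to other annotated properties. For example, assuming remoteParticipants is also marked @FlowObservable (the sample apps observe it this way), you can react to participants joining and leaving:

coroutineScope.launch {
    room::remoteParticipants.flow.collectLatest { participants ->
        /*...*/
    }
}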

Sample App

Note: If you wish to run the sample apps directly from this repo, please consult the Dev Environment instructions.

We have a basic quickstart sample app here, showing how to connect to a room, publish your device's audio/video, and display the video of one remote participant.

There are two more full-featured video conferencing sample apps.

Both use the CallViewModel, which handles the Room connection and exposes the data needed for a basic video conferencing app.

The respective ParticipantItem class in each app is responsible for displaying each participant's UI.

Dev Environment

To develop the Android SDK or run the sample apps directly from this repo, you'll need to:

  • Clone the repo to your computer
  • Ensure the protocol submodule repo is initialized and updated
git clone https://github.com/livekit/client-sdk-android.git
cd client-sdk-android
git submodule update --init

For those developing on Macs with Apple silicon (e.g. M1, M2, etc.), please add the following to $HOME/.gradle/gradle.properties:

protoc_platform=osx-x86_64

Optional (Dev convenience)

  1. Download webrtc sources from https://webrtc.googlesource.com/src
  2. Add sources to Android Studio by pointing at the webrtc/sdk/android folder.


LiveKit Ecosystem
Realtime SDKs: React Components · Browser · Swift Components · iOS/macOS/visionOS · Android · Flutter · React Native · Rust · Node.js · Python · Unity (web) · Unity (beta)
Server APIs: Node.js · Golang · Ruby · Java/Kotlin · Python · Rust · PHP (community)
Agents Frameworks: Python · Playground
Services: LiveKit server · Egress · Ingress · SIP
Resources: Docs · Example apps · Cloud · Self-hosting · CLI