🛰️ WebRTC Android is a pre-compiled build of Google's WebRTC library for Android, maintained by Stream. It reflects the recent WebRTC protocol updates to facilitate real-time video chat using functional UI components, Kotlin extensions for Android, and Compose.
Google has not maintained an official WebRTC library release for Android for many years (and since JCenter has been shut down, the old artifacts are no longer available), so we decided to build our own pre-compiled WebRTC core library that tracks recent WebRTC commits with some improvements.
👉 Check out who's using WebRTC Android.
You can see the use cases of this library in the repositories below:
If you want to have a better grasp of how WebRTC works, such as basic concepts of WebRTC, relevant terminologies, and how to establish a peer-to-peer connection and communicate with the signaling server in Android, check out the articles below:
Stream Video SDK for Compose is the official Android SDK for Stream Video, a service for building video calls, audio rooms, and live-streaming applications. Stream's versatile Video SDK has been built with this webrtc-android library, and you can check out the tutorials below if you want to get more information.
Add the dependency below to your module's `build.gradle` file:

```groovy
dependencies {
    implementation "io.getstream:stream-webrtc-android:1.1.1"
}
```
Snapshots of the current development version of stream-webrtc-android are available, which track the latest changes. To import snapshot versions into your project, add the code snippet below to your gradle file:
```groovy
repositories {
    maven { url 'https://oss.sonatype.org/content/repositories/snapshots/' }
}
```
Next, add the dependency below to your module's `build.gradle` file:

```groovy
dependencies {
    implementation "io.getstream:stream-webrtc-android:1.1.2-SNAPSHOT"
}
```
Once you import this library, you can use everything in the `org.webrtc` package, such as `org.webrtc.PeerConnection` and `org.webrtc.VideoTrack`. For more information, you can check out the API references for WebRTC packages.
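To make that concrete, here is a minimal, hedged sketch of wiring up a `PeerConnectionFactory` and creating a `PeerConnection` with the `org.webrtc` API. The encoder/decoder factories, the public STUN server, and the function name are placeholder choices for illustration, not recommendations from this library:

```kotlin
import android.content.Context
import org.webrtc.DefaultVideoDecoderFactory
import org.webrtc.DefaultVideoEncoderFactory
import org.webrtc.EglBase
import org.webrtc.PeerConnection
import org.webrtc.PeerConnectionFactory

// Illustrative helper: initialize the factory once per process, then
// create a PeerConnection with a placeholder STUN server configuration.
fun createPeerConnection(
    context: Context,
    observer: PeerConnection.Observer
): PeerConnection? {
    PeerConnectionFactory.initialize(
        PeerConnectionFactory.InitializationOptions.builder(context)
            .createInitializationOptions()
    )
    val eglBase = EglBase.create()
    val factory = PeerConnectionFactory.builder()
        .setVideoEncoderFactory(DefaultVideoEncoderFactory(eglBase.eglBaseContext, true, true))
        .setVideoDecoderFactory(DefaultVideoDecoderFactory(eglBase.eglBaseContext))
        .createPeerConnectionFactory()

    // A public STUN server as a placeholder; use your own ICE servers in production.
    val iceServers = listOf(
        PeerConnection.IceServer.builder("stun:stun.l.google.com:19302").createIceServer()
    )
    return factory.createPeerConnection(PeerConnection.RTCConfiguration(iceServers), observer)
}
```

Negotiation (offers, answers, and ICE exchange via your signaling server) then happens through the returned `PeerConnection`.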
Here are the most commonly used APIs in the WebRTC library; you can reference the documentation below:

- `PeerConnectionFactory`: Used to create a `PeerConnection` instance.
- `VideoTrack`: Manages `VideoSink` objects, which receive a stream of video frames in real time, and allows you to control the `VideoSink` objects, such as adding, removing, enabling, and disabling.
- `AudioTrack`: Manages `AudioSink` objects, which receive a stream of audio frames in real time, and allows you to control the `AudioSink` objects, such as adding, removing, enabling, and disabling.
- `MediaStreamTrack`: The Java wrapper of `MediaStreamTrackInterface`.
- `IceCandidate`: The Java wrapper of `IceCandidateInterface` in the C++ API.
- `Camera2Capturer`: Provides video frames for a `VideoTrack` (typically local) from the provided cameraId. `Camera2Capturer` must be run on devices with `Build.VERSION_CODES.LOLLIPOP` or higher.

If you want to learn more about building a video chat application for Android using WebRTC, check out the blog post below:
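The `Camera2Capturer` flow described above can be sketched as follows. This is an illustrative outline (the helper name `createLocalVideoTrack` is ours), and a real app must also request the camera permission before starting capture:

```kotlin
import android.content.Context
import org.webrtc.Camera2Capturer
import org.webrtc.Camera2Enumerator
import org.webrtc.EglBase
import org.webrtc.PeerConnectionFactory
import org.webrtc.SurfaceTextureHelper
import org.webrtc.VideoTrack

// Illustrative helper: create a local VideoTrack backed by the front camera
// (falling back to the first available camera).
fun createLocalVideoTrack(
    context: Context,
    factory: PeerConnectionFactory,
    eglBase: EglBase
): VideoTrack {
    val enumerator = Camera2Enumerator(context)
    val cameraId = enumerator.deviceNames.firstOrNull { enumerator.isFrontFacing(it) }
        ?: enumerator.deviceNames.first()
    val capturer = Camera2Capturer(context, cameraId, null)

    // The capturer delivers frames to the VideoSource via a SurfaceTextureHelper.
    val surfaceTextureHelper =
        SurfaceTextureHelper.create("CaptureThread", eglBase.eglBaseContext)
    val videoSource = factory.createVideoSource(capturer.isScreencast)
    capturer.initialize(surfaceTextureHelper, context, videoSource.capturerObserver)
    capturer.startCapture(1280, 720, 30)

    return factory.createVideoTrack("local_video", videoSource)
}
```

Remember to call `stopCapture()` and dispose the capturer and helper when the call ends.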
Stream WebRTC Android supports some useful UI components for WebRTC, such as `VideoTextureViewRenderer`. First, add the dependency below to your module's `build.gradle` file:

```groovy
dependencies {
    implementation "io.getstream:stream-webrtc-android-ui:$version"
}
```
`VideoTextureViewRenderer` is a custom `TextureView` that implements `VideoSink` and `SurfaceTextureListener`.
Usually, you can use `SurfaceViewRenderer` to display real-time video streams in a layout if you only need a simple video call screen without overlaying one video frame on another. However, it may not work as you expect if you need to design a complex video call screen, such as one where a video call layout must overlay another, as in the example below:
For this case, we'd recommend you use `VideoTextureViewRenderer` like the example below:

```xml
<io.getstream.webrtc.android.ui.VideoTextureViewRenderer
    android:id="@+id/participantVideoRenderer"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    />
```
You can add or remove a `VideoTrack` sink as shown below:

```kotlin
videoTrack.video.addSink(participantVideoRenderer)
videoTrack.video.removeSink(participantVideoRenderer)
```
Stream WebRTC Android supports some Jetpack Compose components for WebRTC, such as `VideoRenderer` and `FloatingVideoRenderer`. First, add the dependency below to your module's `build.gradle` file:

```groovy
dependencies {
    implementation "io.getstream:stream-webrtc-android-compose:$version"
}
```
`VideoRenderer` is a composable function that renders a single video track in Jetpack Compose.
```kotlin
VideoRenderer(
    videoTrack = remoteVideoTrack,
    modifier = Modifier.fillMaxSize(),
    eglBaseContext = eglBaseContext,
    rendererEvents = rendererEvents
)
```
You can observe rendering state changes by providing a `RendererEvents` implementation like the one below:

```kotlin
val rendererEvents = object : RendererEvents {
    override fun onFirstFrameRendered() { .. }
    override fun onFrameResolutionChanged(videoWidth: Int, videoHeight: Int, rotation: Int) { .. }
}
```
`FloatingVideoRenderer` represents a floating item that features a participant's video, usually the local participant. You can use this composable function to overlay a single video track on another, and users can move the floating video with drag interactions. You can use `FloatingVideoRenderer` with `VideoRenderer` like the example below:
```kotlin
var parentSize: IntSize by remember { mutableStateOf(IntSize(0, 0)) }

if (remoteVideoTrack != null) {
    VideoRenderer(
        videoTrack = remoteVideoTrack,
        modifier = Modifier
            .fillMaxSize()
            .onSizeChanged { parentSize = it },
        eglBaseContext = eglBaseContext,
        rendererEvents = rendererEvents
    )
}

if (localVideoTrack != null) {
    FloatingVideoRenderer(
        modifier = Modifier
            .size(width = 150.dp, height = 210.dp)
            .clip(RoundedCornerShape(16.dp))
            .align(Alignment.TopEnd),
        videoTrack = localVideoTrack,
        parentBounds = parentSize,
        paddingValues = PaddingValues(0.dp),
        eglBaseContext = eglBaseContext,
        rendererEvents = rendererEvents
    )
}
```
Stream WebRTC Android supports some useful extensions for WebRTC based on Kotlin's Coroutines. First, add the dependency below to your module's `build.gradle` file:

```groovy
dependencies {
    implementation "io.getstream:stream-webrtc-android-ktx:$version"
}
```
`addRtcIceCandidate` is a suspend function that adds a given `IceCandidate` to a `PeerConnection`, so you can add an `IceCandidate` in Coroutines style rather than callback style.
```kotlin
pendingIceMutex.withLock {
    pendingIceCandidates.forEach { iceCandidate ->
        connection.addRtcIceCandidate(iceCandidate)
    }
    pendingIceCandidates.clear()
}
```
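For reference, an extension like this can be built by wrapping the callback-based `PeerConnection.addIceCandidate(candidate, observer)` API in a suspending call. The sketch below illustrates that pattern; it is our own illustration (note the `Sketch` suffix), not the library's actual implementation:

```kotlin
import kotlin.coroutines.resume
import kotlinx.coroutines.suspendCancellableCoroutine
import org.webrtc.AddIceObserver
import org.webrtc.IceCandidate
import org.webrtc.PeerConnection

// Illustrative suspending wrapper around the callback-based addIceCandidate API.
suspend fun PeerConnection.addRtcIceCandidateSketch(candidate: IceCandidate): Result<Unit> =
    suspendCancellableCoroutine { cont ->
        addIceCandidate(candidate, object : AddIceObserver {
            override fun onAddSuccess() =
                cont.resume(Result.success(Unit))

            override fun onAddFailure(error: String?) =
                cont.resume(Result.failure(RuntimeException(error ?: "Failed to add ICE candidate")))
        })
    }
```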
You can create a `SessionDescription`, which delegates to an `SdpObserver`, in Coroutines style:

```kotlin
suspend fun createAnswer(): Result<SessionDescription> {
    return createSessionDescription { sdpObserver -> connection.createAnswer(sdpObserver, mediaConstraints) }
}
```
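Under the hood, a helper like this can be implemented by bridging `SdpObserver`'s create callbacks into a suspending `Result`. The sketch below shows one possible shape; the function name carries a `Sketch` suffix because it is our illustration, not the library's actual source:

```kotlin
import kotlin.coroutines.resume
import kotlinx.coroutines.suspendCancellableCoroutine
import org.webrtc.SdpObserver
import org.webrtc.SessionDescription

// Illustrative bridge from SdpObserver's create callbacks to a suspending Result.
// The set* callbacks are unused here because this helper only creates descriptions.
suspend fun createSessionDescriptionSketch(
    call: (SdpObserver) -> Unit
): Result<SessionDescription> = suspendCancellableCoroutine { cont ->
    call(object : SdpObserver {
        override fun onCreateSuccess(description: SessionDescription?) {
            if (description != null) {
                cont.resume(Result.success(description))
            } else {
                cont.resume(Result.failure(RuntimeException("SessionDescription is null")))
            }
        }

        override fun onCreateFailure(error: String?) =
            cont.resume(Result.failure(RuntimeException(error ?: "Failed to create SessionDescription")))

        override fun onSetSuccess() = Unit
        override fun onSetFailure(error: String?) = Unit
    })
}
```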
This is an instruction for setting up the Chromium dev tools if you need to compile the WebRTC core library yourself with this project.
You need to set up depot_tools to build and fetch the Chromium codebase.
You should fetch the Chromium WebRTC repository from Google's repository at HEAD.
Note: the Chromium WebRTC core libraries can be built only on Linux. Every step takes time depending on your machine specs and internet speed, so make sure every step completes without interruption.
You need to set up an AWS instance (prerequisites):
To compile the pre-built WebRTC library for Android, you must follow the steps below:
1. `git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git`
2. `export PATH="$PATH:${HOME}/depot_tools"`
3. `mkdir webrtc_android && cd webrtc_android`
4. `fetch --nohooks webrtc_android`
5. `gclient sync`
6. `cd src && ./build/install-build-deps.sh`
7. `git branch -r`
8. `git checkout origin/master` (check that you're on the origin/master branch, and check out a specific branch if you want)
9. `git branch`
10. Replace the Android sources and NDK/C/C++ files with this repository.
11. `tools_webrtc/android/build_aar.py`
To install all required dependencies on Linux, a script is provided for Ubuntu, which is unfortunately only available after your first `gclient sync`. Make sure your current directory is `webrtc_android/src/`:

```shell
cd src && ./build/install-build-deps.sh
```
You can see the available latest branches, which look like the screenshots below:
Now you can check out the latest branch, which is `branch-heads/m79` or similar, using this command:

```shell
git checkout branch-heads/m79
```
However, this project reflects the latest updates for WebRTC, so you must check out the master branch like this:

```shell
git checkout origin/master
```

This will help you resolve most compilation issues. To get details about your current branch, you can simply use this command:

```shell
git branch
```
This process manually compiles the source code for each particular CPU type. Manual compiling involves these two steps:
This step compiles the library for the Debug and Release modes of development.
Ensure your current working directory is `webrtc_android/src/` of your workspace. Then run:
```shell
gn gen out/Debug --args='target_os="android" target_cpu="arm"'
gn gen out/Release --args='is_debug=false is_component_build=false rtc_include_tests=false target_os="android" target_cpu="arm"'
```
You can specify a directory of your own choice instead of `out/Debug` to manage multiple configurations in parallel.
For compilation, you can simply use the following commands (for `out/Debug` and `out/Release`):
```shell
ninja -C out/Debug
ninja -C out/Release
```
This is the simplest process, which compiles the source code for all supported CPU types, such as:
After compiling, the package bundles all these native libraries and the `.jar` library into a `*.aar` file.
Make sure your current working directory is `webrtc_android/src/` of your workspace. Then run:

```shell
tools_webrtc/android/build_aar.py
```
This process will take some time depending on your machine specs and internet speed, so here we go:
Now, if you look in the `webrtc_android/src/` directory, you will find the compiled `libwebrtc.aar` file.
Support it by joining stargazers for this repository. ⭐️
Also, follow maintainers on GitHub for our next creations! 🤩
Copyright 2023 Stream.IO, Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.