getstream / webrtc-android

🛰️ A versatile WebRTC pre-compiled Android library that reflects the recent WebRTC updates to facilitate real-time video chat for Android and Compose.

Home Page: https://getstream.github.io/webrtc-android/

License: Apache License 2.0

Kotlin 60.04% Shell 0.05% Java 39.91%
android getstream jetpack jetpack-compose kotlin rtc video-chat webrtc webrtc-android

webrtc-android's Introduction


WebRTC Android by Stream



🛰️ WebRTC Android is Google's WebRTC pre-compiled library for Android by Stream. It reflects the recent WebRTC Protocol updates to facilitate real-time video chat using functional UI components, Kotlin extensions for Android, and Compose.

Agenda

Google has not maintained the WebRTC library for Android for many years (and since JCenter has been shut down, the library is no longer available there), so we decided to build our own pre-compiled WebRTC core library that reflects recent WebRTC commits with some improvements.

Who's Using WebRTC Android?

👉 Check out who's using WebRTC Android.

📱 Use Cases

You can see the use cases of this library in the repositories below:

  • stream-video-android: 📲 The official Android Video SDK by Stream, which consists of versatile Core + Compose UI component libraries that allow you to build video calling, audio room, and live streaming apps based on WebRTC, running on Stream's global edge network.
  • webrtc-in-jetpack-compose: 📱 A project that demonstrates the WebRTC protocol to facilitate real-time video communication with Jetpack Compose.

✍️ Technical Content

If you want a better grasp of how WebRTC works, including its basic concepts, relevant terminology, and how to establish a peer-to-peer connection and communicate with a signaling server on Android, check out the articles below:

🛥 Stream Chat and Voice & Video Calling SDK

Stream Video SDK for Compose is the official Android SDK for Stream Video, a service for building video calling, audio room, and live-streaming applications. Stream's versatile Video SDK is built on top of this webrtc-android library; check out the tutorials below if you want more information.

Download

Maven Central

Gradle

Add the below dependency to your module's build.gradle file:

dependencies {
    implementation "io.getstream:stream-webrtc-android:1.1.2"
}

SNAPSHOT

See how to import the snapshot

Including the SNAPSHOT

Snapshots of the current development version of stream-webrtc-android are available, which track the latest versions.

To import snapshot versions into your project, add the code snippet below to your gradle file.

repositories {
   maven { url 'https://oss.sonatype.org/content/repositories/snapshots/' }
}

Next, add the below dependency to your module's build.gradle file.

dependencies {
    implementation "io.getstream:stream-webrtc-android:1.1.3-SNAPSHOT"
}

Usages

Once you import this library, you can use all of the org.webrtc package APIs, such as org.webrtc.PeerConnection and org.webrtc.VideoTrack. For more information, check out the API references for the WebRTC packages.

Here are the most commonly used APIs in the WebRTC library; you can reference the documentation below (a short usage sketch follows this list):

  • PeerConnection: Provides methods to create and set an SDP offer/answer, add ICE candidates, potentially connect to a remote peer, monitor the connection, and close the connection once it’s no longer needed.
  • PeerConnectionFactory: Create a PeerConnection instance.
  • EglBase: Holds EGL state and utility methods for handling an egl 1.0 EGLContext, an EGLDisplay, and an EGLSurface.
  • VideoTrack: Manages multiple VideoSink objects, which receive a stream of video frames in real time, and lets you add, remove, enable, and disable them.
  • VideoSource: Used to create video tracks and add VideoProcessor, which is a lightweight abstraction for an object that can receive video frames, process them, and pass them on to another object.
  • AudioTrack: Manages multiple AudioSink objects, which receive a stream of audio samples in real time, and lets you add, remove, enable, and disable them.
  • AudioSource: Used to create audio tracks.
  • MediaStreamTrack: Java wrapper for a C++ MediaStreamTrackInterface.
  • IceCandidate: Representation of a single ICE Candidate, mirroring IceCandidateInterface in the C++ API.
  • SessionDescription: Description of an RFC 4566 Session. SDPs are passed as serialized Strings in Java-land and are materialized to SessionDescriptionInterface as appropriate in the JNI layer.
  • SurfaceViewRenderer: Display the video stream on a SurfaceView.
  • Camera2Capturer: Provides video frames for a VideoTrack (typically local) from the provided cameraId. Camera2Capturer must be run on devices running Build.VERSION_CODES.LOLLIPOP or higher.
  • Camera2Enumerator: Enumerates the cameras available through the Camera2 API and creates Camera2Capturer instances for them.
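
To see how these classes typically fit together, here is a minimal Kotlin sketch (not taken from this repository's documentation); the track id, the 1280x720@30 capture format, and the injected observer are illustrative assumptions:

import android.content.Context
import org.webrtc.Camera2Enumerator
import org.webrtc.DefaultVideoDecoderFactory
import org.webrtc.DefaultVideoEncoderFactory
import org.webrtc.EglBase
import org.webrtc.PeerConnection
import org.webrtc.PeerConnectionFactory
import org.webrtc.SurfaceTextureHelper
import org.webrtc.VideoTrack

fun createLocalVideoCall(
    context: Context,
    observer: PeerConnection.Observer
): Pair<PeerConnection?, VideoTrack> {
    // One-time global initialization of the native WebRTC library.
    PeerConnectionFactory.initialize(
        PeerConnectionFactory.InitializationOptions.builder(context).createInitializationOptions()
    )
    val eglBase = EglBase.create()
    val factory = PeerConnectionFactory.builder()
        .setVideoEncoderFactory(DefaultVideoEncoderFactory(eglBase.eglBaseContext, true, true))
        .setVideoDecoderFactory(DefaultVideoDecoderFactory(eglBase.eglBaseContext))
        .createPeerConnectionFactory()

    // Pick a front-facing camera and capture into a VideoSource/VideoTrack.
    val enumerator = Camera2Enumerator(context)
    val cameraId = enumerator.deviceNames.first { enumerator.isFrontFacing(it) }
    val capturer = enumerator.createCapturer(cameraId, null)
    val videoSource = factory.createVideoSource(capturer.isScreencast)
    val surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBase.eglBaseContext)
    capturer.initialize(surfaceTextureHelper, context, videoSource.capturerObserver)
    capturer.startCapture(1280, 720, 30)
    val videoTrack = factory.createVideoTrack("video0", videoSource)

    // Create the PeerConnection; SDP offer/answer and ICE exchange happen through
    // `observer` and your own signaling code.
    val rtcConfig = PeerConnection.RTCConfiguration(
        listOf(PeerConnection.IceServer.builder("stun:stun.l.google.com:19302").createIceServer())
    )
    val peerConnection = factory.createPeerConnection(rtcConfig, observer)
    peerConnection?.addTrack(videoTrack, listOf("local-stream"))
    return peerConnection to videoTrack
}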

If you want to learn more about building a video chat application for Android using WebRTC, check out the blog post below:

WebRTC for UI Components

Maven Central

Stream WebRTC Android supports some useful UI components for WebRTC, such as VideoTextureViewRenderer. First, add the dependency below to your module's build.gradle file:

dependencies {
    implementation "io.getstream:stream-webrtc-android-ui:$version"
}

VideoTextureViewRenderer

VideoTextureViewRenderer is a custom TextureView that implements VideoSink and SurfaceTextureListener.

Usually, you can use SurfaceViewRenderer to display real-time video streams if you need a simple video call screen that doesn't overlay one video on another. However, it may not work as you expect if you need to design a complex video call screen, such as one where one participant's video overlays another (for example, a floating local preview over a full-screen remote video).


In this case, we recommend using VideoTextureViewRenderer, as in the example below:

<io.getstream.webrtc.android.ui.VideoTextureViewRenderer
    android:id="@+id/participantVideoRenderer"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
 />

You can add or remove the renderer as a sink on a VideoTrack like the example below:

videoTrack.video.addSink(participantVideoRenderer)
videoTrack.video.removeSink(participantVideoRenderer)
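
Putting the pieces together, here is a minimal sketch of initializing the renderer and attaching it to a remote track. It assumes VideoTextureViewRenderer exposes an init(...) method analogous to SurfaceViewRenderer's (check the class for the exact signature), and that eglBaseContext and remoteVideoTrack come from your own WebRTC setup:

import io.getstream.webrtc.android.ui.VideoTextureViewRenderer
import org.webrtc.EglBase
import org.webrtc.RendererCommon.RendererEvents
import org.webrtc.VideoTrack

fun attachRemoteVideo(
    renderer: VideoTextureViewRenderer,
    eglBaseContext: EglBase.Context,
    remoteVideoTrack: VideoTrack
) {
    // Initialize the renderer with a shared EGL context before frames arrive.
    renderer.init(eglBaseContext, object : RendererEvents {
        override fun onFirstFrameRendered() { /* e.g. hide a loading indicator */ }
        override fun onFrameResolutionChanged(videoWidth: Int, videoHeight: Int, rotation: Int) {}
    })
    // Start receiving frames; call removeSink(renderer) when the screen is destroyed.
    remoteVideoTrack.addSink(renderer)
}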

WebRTC for Jetpack Compose

Maven Central

Stream WebRTC Android supports some Jetpack Compose components for WebRTC, such as VideoRenderer and FloatingVideoRenderer. First, add the dependency below to your module's build.gradle file:

dependencies {
    implementation "io.getstream:stream-webrtc-android-compose:$version"
}

VideoRenderer

VideoRenderer is a composable function that renders a single video track in Jetpack Compose.

VideoRenderer(
    videoTrack = remoteVideoTrack,
    modifier = Modifier.fillMaxSize(),
    eglBaseContext = eglBaseContext,
    rendererEvents = rendererEvents
)

You can observe rendering state changes by passing a RendererEvents implementation like the one below:

val rendererEvents = object : RendererEvents {
      override fun onFirstFrameRendered() { .. }
      override fun onFrameResolutionChanged(videoWidth: Int, videoHeight: Int, rotation: Int) { .. }
}

FloatingVideoRenderer

FloatingVideoRenderer represents a floating item that features a participant's video, usually the local participant. You can use this composable function to overlay a single video track on another, and users can move the floating video around with touch gestures.

You can use FloatingVideoRenderer with VideoRenderer like the example below:

var parentSize: IntSize by remember { mutableStateOf(IntSize(0, 0)) }

if (remoteVideoTrack != null) {
  VideoRenderer(
    videoTrack = remoteVideoTrack,
    modifier = Modifier
      .fillMaxSize()
      .onSizeChanged { parentSize = it },
    eglBaseContext = eglBaseContext,
    rendererEvents = rendererEvents
  )
}

if (localVideoTrack != null) {
  FloatingVideoRenderer(
    modifier = Modifier
      .size(width = 150.dp, height = 210.dp)
      .clip(RoundedCornerShape(16.dp))
      .align(Alignment.TopEnd),
    videoTrack = localVideoTrack,
    parentBounds = parentSize,
    paddingValues = PaddingValues(0.dp),
    eglBaseContext = eglBaseContext,
    rendererEvents = rendererEvents
  )
}

WebRTC KTX

Maven Central

Stream WebRTC Android supports some useful extensions for WebRTC based on Kotlin's Coroutines. First, add the dependency below to your module's build.gradle file:

dependencies {
    implementation "io.getstream:stream-webrtc-android-ktx:$version"
}

addRtcIceCandidate

addRtcIceCandidate is a suspend function that adds a given IceCandidate to a PeerConnection, so you can add candidates in a coroutines style rather than a callback style.

pendingIceMutex.withLock {
    pendingIceCandidates.forEach { iceCandidate ->
        connection.addRtcIceCandidate(iceCandidate)
    }
    pendingIceCandidates.clear()
}

createSessionDescription

You can create a SessionDescription with a coroutines-style helper that delegates to an SdpObserver:

suspend fun createAnswer(): Result<SessionDescription> {
  return createSessionDescription { sdpObserver -> connection.createAnswer(sdpObserver, mediaConstraints) }
}
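
The same helper applies symmetrically to offers; a minimal sketch, assuming the same connection and mediaConstraints as in the answer example above:

suspend fun createOffer(): Result<SessionDescription> {
  // Mirrors createAnswer(): wrap the callback-based createOffer call in a coroutine.
  return createSessionDescription { sdpObserver -> connection.createOffer(sdpObserver, mediaConstraints) }
}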

Instructions for Setting Up Chromium Dev Tool

These are instructions for setting up the Chromium dev tools if you need to compile the WebRTC core library yourself for this project.

Chromium Dev Tools

  • You need to set up depot_tools to fetch and build the Chromium codebase.

  • You should fetch the WebRTC repository from Google's repository at the HEAD commit.


Note: The Chromium WebRTC core libraries can only be built on Linux. Every step takes time depending on your machine specs and internet speed, so make sure each step completes without interruption.

You need to set up an AWS instance with the following prerequisites:

  • Ubuntu 14.04 LTS (trusty with EoL April 2022)
  • 8 GB of RAM
  • At least 50 GB HDD/SSD storage

To compile the pre-built WebRTC library for Android, you must follow the steps below:

1. git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git
    
2. export PATH="$PATH:${HOME}/depot_tools"
    
3. mkdir webrtc_android && cd webrtc_android
    
4. fetch --nohooks webrtc_android
        
5. gclient sync
    
6. cd src && ./build/install-build-deps.sh
    
7. git branch -r
    
8. git checkout origin/master
    
# Check that you're on the origin/master branch; check out a specific branch here if you want.
9. git branch

10. Replace Android sources & NDK/C/C++ files with this repository.

11. tools_webrtc/android/build_aar.py

To install all required dependencies on Linux, a script is provided for Ubuntu. It is unfortunately only available after your first gclient sync, and your current directory must be webrtc_android/src/:

cd src && ./build/install-build-deps.sh

The git branch -r command from step 7 lists the latest available release branches.

You can then check out the latest release branch, for example branch-heads/m79, using this command:

git checkout branch-heads/m79

However, this project reflects the latest WebRTC updates, so you should check out the master branch instead, as in step 8:

8. git checkout origin/master

This will help you resolve most compilation issues. To see which branch you are currently on, use the command from step 9:

9. git branch

Using Manual Compilation:

This process manually compiles the source code for each particular CPU type. Manual compilation involves two steps:

  1. Generate projects using GN.
  2. Compile using Ninja.

These commands compile the library in Debug and Release modes.

Ensure your current working directory is webrtc_android/src/ of your workspace. Then run:

11. gn gen out/Debug --args='target_os="android" target_cpu="arm"'
11. gn gen out/Release --args='is_debug=false is_component_build=false rtc_include_tests=false target_os="android" target_cpu="arm"'

You can specify a directory of your own choosing instead of out/Debug so you can manage multiple configurations in parallel.

  • To build for ARM64: use target_cpu="arm64"
  • To build for 32-bit x86: use target_cpu="x86"
  • To build for 64-bit x64: use target_cpu="x64"

To compile, run the following commands for out/Debug and out/Release:

11. ninja -C out/Debug
11. ninja -C out/Release

Using AAR Build Tools:

This is the simplest process; it compiles the source code for all supported CPU types:

  • arm64-v8a
  • armeabi-v7a
  • x86
  • x86_64

After compilation, all of these native libraries and the .jar library are packaged into a single *.aar file.

Make sure your current working directory is webrtc_android/src/ of your workspace. Then run:

11. tools_webrtc/android/build_aar.py

This process will take some time depending on your machine specs and internet speed.

Once the build completes, you will find the compiled libwebrtc.aar in the webrtc_android/src/ directory.

Find this Android library useful? 💙

Support it by joining stargazers for this repository. ⭐️
Also, follow maintainers on GitHub for our next creations! 🤩

License

Copyright 2023 Stream.IO, Inc. All Rights Reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

webrtc-android's People

Contributors

alexkrupa, danielnovak, dependabot[bot], jcesarmobile, skydoves, urielfrankel, yonghanju


webrtc-android's Issues

`VideoRenderer` throws `CalledFromWrongThreadException` when called

Hi,

I am building an application that pulls a video stream from a server via WebRTC and displays it on the screen. The server uses aiortc which I believe doesn’t support ICE trickling so all ICE candidates are gathered before initiating the connection.

Now, the whole process of sending an offer, receiving an answer and establishing a connection seems to work fine. However, when I call VideoRenderer with a VideoTrack taken from PeerConnection.Observer.onTrack, it crashes with the following stacktrace:

android.view.ViewRootImpl$CalledFromWrongThreadException: Only the original thread that created a view hierarchy can touch its views.
	at android.view.ViewRootImpl.checkThread(ViewRootImpl.java:9758)
	at android.view.ViewRootImpl.requestLayout(ViewRootImpl.java:1959)
	at android.view.View.requestLayout(View.java:26281)
	at android.view.View.requestLayout(View.java:26281)
	at android.view.View.requestLayout(View.java:26281)
	at android.view.View.requestLayout(View.java:26281)
	at android.view.View.requestLayout(View.java:26281)
	at androidx.compose.ui.platform.AndroidComposeView.scheduleMeasureAndLayout(AndroidComposeView.android.kt:778)
	at androidx.compose.ui.platform.AndroidComposeView.onRequestMeasure(AndroidComposeView.android.kt:834)
	at androidx.compose.ui.node.Owner.onRequestMeasure$default(Owner.kt:155)
	at androidx.compose.ui.node.LayoutNode.requestRemeasure$ui_release(LayoutNode.kt:1053)
	at androidx.compose.ui.node.LayoutNode.requestRemeasure$ui_release$default(LayoutNode.kt:1050)
	at androidx.compose.ui.platform.AndroidViewsHandler.requestLayout(AndroidViewsHandler.android.kt:91)
	at android.view.View.requestLayout(View.java:26281)
	at android.view.View.requestLayout(View.java:26281)
	at io.getstream.webrtc.android.ui.VideoTextureViewRenderer.updateFrameData(VideoTextureViewRenderer.kt:161)
	at io.getstream.webrtc.android.ui.VideoTextureViewRenderer.onFrame(VideoTextureViewRenderer.kt:133)

I have also tried with a plain VideoTextureViewRenderer to no avail, this time the screen being black with the following log:

EglRenderer: Dropping frame - Not initialized or already released.

I believe it is an issue with my code, because the official example works no problem. Unfortunately, I cannot pinpoint it and, being on it for a few days, would like to ask if there is perhaps some prerequisite for VideoRenderer to work and where might these issues come from?

Thanks for your attention!

Add option to change aspect of Image [VideoTextureViewRenderer]

I'm having trouble changing the aspect of the image with the VideoTextureViewRenderer, previously I was using setScalingType with fit and fill on the SurfaceViewRenderer, but now I'm not sure how to do it, maybe an option could be added to VideoTextureViewRenderer?

How to get rtt information

peerConnection?.getStats(RTCStatsCollectorCallback { rtcStatsReport ->
            val statsMap = rtcStatsReport.statsMap
            for ((key, value) in statsMap) {
                Log.d("==rtcStatsReport==","Key: $key, Value: $value")
            }
        })

========================
The output is as follows

Key: P, Value: { timestampUs: 1711951786002673, type: peer-connection, id: P, dataChannelsOpened: 0, dataChannelsClosed: 0 }

I can't find any information related to RTT

Crashes when trying to establish a peer to peer connection with a Firefox browser

I have an android application using this library (1.1.1) which successfully can establish a peer connection with Chrome, Microsoft Edge, and Safari but crashes when trying to set the remote description using an offer from Mozilla Firefox. The specific error refers to a failure somewhere in this library.

03-26 19:39:54.494 4223 4223 F DEBUG : Abort message: '../../../home/kanat/webrtc/src/buildtools/third_party/libc++/trunk/include/vector:571: assertion !empty() failed: front() called on an empty vector'

No ability to capture remote audio samples?

Hello, thank you for this awesome library! I'm trying to implement recording of a remote WebRTC stream. I'm able to capture raw video frames using the addSink method on VideoTrack and write them to a file. There doesn't appear a similar method on AudioTrack. Does this library provide the ability to process remote audio sample? Thanks in advance.

YUV/NV21 drawing

I have some rendering issue in fact isolated from WebRTC protocol, but related to this lib... I want to draw raw YUV frames, which are live video Full HD 30 fps obtained from some 3rd party

I've placed SurfaceViewRenderer in my Fragment and in onCreateView

    mSurfaceViewRenderer = rootView.findViewById(R.id.surfaceViewRenderer);
    mSurfaceViewRenderer.init(EglBase.create().getEglBaseContext(), null);

and I'm trying to feed this surface with prepared frames (callback called in own Thread, not main) like below:

@Override
public void onYuvDataByteBuffer(MediaFormat mediaFormat, ByteBuffer data, int dataSize, int width, int height) {
	int rowStrideY = width;
	int rowStrideU = width / 2;
	int rowStrideV = width / 2;

	int basicOffset = data.remaining() / 6;
	int offsetY = 0;
	int offsetU = basicOffset * 4;
	int offsetV = basicOffset * 5;

	ByteBuffer i420ByteBuffer = data;
	i420ByteBuffer.position(offsetY);
	final ByteBuffer dataY = i420ByteBuffer.slice();
	i420ByteBuffer.position(offsetU);
	final ByteBuffer dataU = i420ByteBuffer.slice();
	i420ByteBuffer.position(offsetV);
	final ByteBuffer dataV = i420ByteBuffer.slice();

	JavaI420Buffer javaI420Buffer = JavaI420Buffer.wrap(width, height,
			dataY, rowStrideY,
			dataU, rowStrideU,
			dataV, rowStrideV,
			null
			/*() -> {
					JniCommon.nativeFreeByteBuffer(i420ByteBuffer);
	}*/);

	VideoFrame frame = new VideoFrame(javaI420Buffer, 0, System.currentTimeMillis());
	mSurfaceViewRenderer.onFrame(frame);
}

yes, some ugly hardcodes for now, but it works, kind of... first frame is rendered properly, no issues. but second call will further cause a throw

FATAL EXCEPTION: SurfaceViewRendererEglRenderer
   Process: thats.my.package, PID: 12970
   java.lang.IllegalStateException: buffer is inaccessible
	at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:159)
	at org.webrtc.JavaI420Buffer.getDataY(JavaI420Buffer.java:118)
	at org.webrtc.VideoFrameDrawer$YuvUploader.uploadFromBuffer(VideoFrameDrawer.java:114)
	at org.webrtc.VideoFrameDrawer.drawFrame(VideoFrameDrawer.java:221)
	at org.webrtc.EglRenderer.renderFrameOnRenderThread(EglRenderer.java:664)
	at org.webrtc.EglRenderer.lambda$im8Sa54i366ODPy-soB9Bg4O-w4(Unknown Source:0)
	at org.webrtc.-$$Lambda$EglRenderer$im8Sa54i366ODPy-soB9Bg4O-w4.run(Unknown Source:2)
	at android.os.Handler.handleCallback(Handler.java:883)
	at android.os.Handler.dispatchMessage(Handler.java:100)
	at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:103)
	at android.os.Looper.loop(Looper.java:214)
	at android.os.HandlerThread.run(HandlerThread.java:67

that happens with your dependency as well as with official current release (1.0.+), but I'm just glad I've found a space where I can post my problem and I do believe someone with knowledge will read this in here :)

so: is there a way to use WebRTC lib just for drawing YUV, as it contains all needed breadcrumbs for sure? Currently it looks to me like rendering part of code is tightened to protocol and API methods/ways for obtaining video (from stream or file, not from some custom callback). Maybe there is some sample somewhere showing built-in Camera preview rendered "manually", not by setPreviewSurface?

btw. more details and tries in my SO question, currently with bounty :)

AudioManager.setMicrophoneMute(mute) can cause SecurityException

If there are other user profiles on the same device, AudioManager.setMicrophoneMute() method can throw SecurityException with message: requires android.permission.INTERACT_ACROSS_USERS_FULL or android.permission.INTERACT_ACROSS_USERS to access user 0.

I don't understand this very well, but it seems to me that the microphone can't be released this way, and WebRTC doesn't have a better way to free the microphone resource. Some other video conferencing apps keep using the microphone even when muted, like Google Meet.

Do you have any thoughts on this?

CapturerObserver.onFrame leaks native memory in libjingle_peerconnection_so.so

We created a capturer class very similar to ScreenCapturerAndroid, that uses DisplayManager over the media projection api. However, I have found that calling CapturerObserver.onFrameCaptured creates a memory leak that Android studio profiler blames on libjingle_peerconnection_so.so.

If I simply comment out the CapturerObserver.onFrameCaptured step, my video track is now broken but the memory leak is gone.

I can provide screenshots, more info, etc, upon request. tried with version 1.1.1 and 1.1.0

ScreenCapturerAndroid.changeCaptureFormat sometimes causes cast to stop working.

Hello, we are currently using this lib to peer with a web client using ScreenCapturerAndroid to stream the user screen.

Every time the user rotates the phone we are calling changeCaptureFormat with new screen dimensions but sometimes (we are not able to find a pattern) this causes casting to stop working.

I attach some of the logs we see when this error occurs.

[SurfaceTexture-1-15488-0](id:3c8000000003,api:1,p:361,c:15488) connect: already connected (cur=1 req=1)
[SurfaceTexture-1-15488-0](id:3c8000000003,api:0,p:-1,c:15488) dequeueBuffer: BufferQueue has no connected producer
[SurfaceTexture-1-15488-0](id:3c8000000003,api:0,p:-1,c:15488) disconnect: not connected (req=1)

Any clue of what can be the issue or if this is an already known bug?

Thanks in advance.

Torch

Hello, I have been looking into this prebuilt library hoping we can get rid of building webRtc. But I'm unable to find how to control torch. In our current built version, there is a method void torch(boolean var1, TorchHandler var2); in CameraVideoCapturer. I'm unable to control it directly through CameraManager setTorchMode as camera is in use by then.

Crash when re-enabling camera

Hi
I used io.getstream:stream-webrtc-android:1.0.4 and io.getstream:stream-webrtc-android-compose:1.0.4 dependencies for my app. There is a scenario which causes a crash. In a call, when the device A disables camera and then enables it back, a crash happens on the device B due to throwing an exception with the following stack trace.
The crash happens for specific devices like Galaxy J7 Prime with Android 6.0.1.

# Fatal error in: gen/sdk/android/generated_metrics_jni/../../../../../../sdk/android/src/jni/jni_generator_helper.h, line 94
# last system error: 0
# Check failed: !env->ExceptionCheck()
#
Fatal signal 6 (SIGABRT), code -6 in tid 806 (IncomingVideoSt)
pid: 31918, tid: 806, name: IncomingVideoSt  >>> [package name] <<<
    #05 pc 0022445b  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #06 pc 002243f1  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #07 pc 0021327f  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #08 pc 00334c39  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #09 pc 002a863f  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #10 pc 0043a5eb  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #11 pc 00494413  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #12 pc 002373b7  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #13 pc 00237485  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #14 pc 0040dde5  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #15 pc 0040ec71  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #16 pc 0040e1ab  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #17 pc 0023df29  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so
    #18 pc 0023de9f  /data/app/[package name]-2/lib/arm/libjingle_peerconnection_so.so

[Question] Connecting android with web client

Hi, I am currently developing an android client(android native app) and a web client(js), which the android client will share its screen to the web client via webrtc and I am using this library.

In the web client I am using https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection
I encountered issue where after the android client calls .addTrack(), the web client does not trigger ontrack event, I am not sure what I did wrong and I have no idea how to debug/troubleshoot.

Asking if anyone has a working project example where a connection is established between an Android client and a web client. Recommendations on how to troubleshoot/debug are greatly appreciated.

Play audio in speaker like a video

We're adapting WebRTC from one-to-one to one-to-many for a broadcast. I've got some issues with playing audio through the speaker. Is it possible to adapt the audio for Android to recognize it as video rather than a call?

[QUESTION] Merge strategy from latest Google libwebrtc

I have a question, we're looking for a pre-built version of libwebrtc for Android and want to know how to find out what the latest version of Google's libwebrtc is in this repo? Suppose there is a security vulnerability discovered in libwebrtc that's fixed in another commit, I would like to know if the vulnerability itself is present here and if the fix made it here. How can I know this information? Is there an official merging strategy here?

STUN binding request timed out

Hi,

We are encountering an unusual issue related to the STUN server.

Everything works correctly with the TURN server, but problems arise with the STUN server.

For our tests, we used the WebRTC in Jetpack Compose demo application without any modifications, including the default Google's STUN server URL.

In the same Wi-Fi network, everything works correctly. However, when the devices are in different Wi-Fi networks or using mobile network traffic, the issue shows up.

Here are the relevant logs:

2023-10-24 20:16:14.661 26592-26630 Call:WebRTC             io.getstream.webrtc.sample.compose   I  (network_thread - 26630:970) [onLogMessage] label: connection.cc, message: (line 1441): Conn[9f734db0:0:Net[wlan0:10.0.2.x/24:Wifi:id=5]:XKPDYGP8:1:0:local:udp:10.0.2.x:53538->XbDw1mmJ:1:1686052607:stun:udp:5.44.34.x:36306|C--I|-|0|0|7241540810645061119|-]: Sent STUN BINDING request, id=4c4d616c4351786f52676755, use_candidate=0, nomination=0
2023-10-24 20:16:14.713 26592-26630 Call:WebRTC             io.getstream.webrtc.sample.compose   I  (network_thread - 26630:970) [onLogMessage] label: connection.cc, message: (line 1441): Conn[9f734db0:0:Net[wlan0:10.0.2.x/24:Wifi:id=5]:XKPDYGP8:1:0:local:udp:10.0.2.x:53538->XbDw1mmJ:1:1686052607:stun:udp:5.44.34.x:36306|C--I|-|0|0|7241540810645061119|-]: Sent STUN BINDING request, id=6b31444d313446516b335866, use_candidate=0, nomination=0
2023-10-24 20:16:14.768 26592-26630 Call:WebRTC             io.getstream.webrtc.sample.compose   I  (network_thread - 26630:970) [onLogMessage] label: connection.cc, message: (line 1441): Conn[9f734db0:0:Net[wlan0:10.0.2.x/24:Wifi:id=5]:XKPDYGP8:1:0:local:udp:10.0.2.x:53538->XbDw1mmJ:1:1686052607:stun:udp:5.44.34.x:36306|C--I|-|0|0|7241540810645061119|-]: Sent STUN BINDING request, id=47437a425662305a4b386741, use_candidate=0, nomination=0
2023-10-24 20:16:14.818 26592-26630 Call:WebRTC             io.getstream.webrtc.sample.compose   I  (network_thread - 26630:970) [onLogMessage] label: connection.cc, message: (line 909): Conn[9f734db0:0:Net[wlan0:10.0.2.x/24:Wifi:id=5]:XKPDYGP8:1:0:local:udp:10.0.2.x:53538->XbDw1mmJ:1:1686052607:stun:udp:5.44.34.x:36306|C--I|-|0|0|7241540810645061119|-]: Timed out after 15017 ms without a response, rtt=6000
2023-10-24 20:16:38.538 26592-26630 Call:WebRTC             io.getstream.webrtc.sample.compose   E  (network_thread - 26630:970) [onLogMessage] label: stun_port.cc, message: (line 99): Binding request timed out from [0:0:0:x:x:x:x:x]:42822 (wlan0)
2023-10-24 20:16:38.539 26592-26632 Call:PeerConnection     io.getstream.webrtc.sample.compose   E  (signaling_threa - 26632:971) [onIceCandidateError] #sfu; #subscriber; event: IceCandidateErrorEvent(errorCode=701, STUN binding request timed out., address=[0:0:0:x:x:x:x:x], port=42822, url=stun:stun.l.google.com:19302)

To further investigate this issue, we decided to test the STUN server with Google's Trickle ICE tool. The results are as follows:

  1. On a computer, everything works correctly.
  2. On different Android devices, we encountered the same issue. Each ended up with the STUN binding timeout error. This was tested with different Wi-Fi networks and mobile network traffic.
  3. On iOS devices, everything works correctly.

I would greatly appreciate any advice or guidance.

Missing Changelog

Possible to include change logs for the releases and also if you could add which version or commit of the webRTC for Android from Google was compiled. Would be great to know what we're working with as we integrate these pre-builds 🙏

DataChannel example

Looks like PeerConnection.createDataChannel does not trigger PeerConnection.Observer.onDataChannel
Maybe I'm doing something wrong
So could you provide small example?

Voice

How to change the voice with Web RTC, let's say I want to increase the pitch for example.

Android 14: Pixel 7 Pro Black Screen

I use webrtc sdk for screen sharing, prior to Android 14, it works well. I provide some logs about webrtc.

NativeLibrary: Loading native library: jingle_peerconnection_so 2024-03-03 17:40:02.781 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I NativeLibrary: Loading library: jingle_peerconnection_so 2024-03-03 17:40:02.783 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I PeerConnectionFactory: PeerConnectionFactory was initialized without an injected Loggable. Any existing Loggable will be deleted. 2024-03-03 17:40:02.784 20868-20868 libEGL com.mobven.rtcbroadcasting D loaded /vendor/lib64/egl/libGLES_mali.so 2024-03-03 17:40:02.796 20868-21000 TrafficStats com.mobven.rtcbroadcasting D tagSocket(122) with statsTag=0xffffffff, statsUid=-1 2024-03-03 17:40:02.805 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I EglBase14Impl: Using OpenGL ES version 2 2024-03-03 17:40:02.814 20868-20868 Compatibil...geReporter com.mobven.rtcbroadcasting D Compat change id reported: 263076149; UID 10642; state: ENABLED 2024-03-03 17:40:02.826 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioManagerExternal: Sample rate is set to 48000 Hz 2024-03-03 17:40:02.826 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioManagerExternal: Sample rate is set to 48000 Hz 2024-03-03 17:40:02.826 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I JavaAudioDeviceModule: createAudioDeviceModule 2024-03-03 17:40:02.826 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I JavaAudioDeviceModule: HW NS will be used. 2024-03-03 17:40:02.826 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I JavaAudioDeviceModule: HW AEC will be used. 2024-03-03 17:40:02.826 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioEffectsExternal: ctor@[name=main, id=2] 2024-03-03 17:40:02.826 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioRecordExternal: ctor@[name=main, id=2] 2024-03-03 17:40:02.827 20868-20868 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioTrackExternal: ctor@[name=main, id=2] 2024-03-03 17:40:02.892 20868-21017 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioRecordExternal: enableBuiltInAEC(true) 2024-03-03 17:40:02.892 20868-21017 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioEffectsExternal: setAEC(true) 2024-03-03 17:40:02.892 20868-21017 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioRecordExternal: enableBuiltInNS(true) 2024-03-03 17:40:02.892 20868-21017 org.webrtc.Logging com.mobven.rtcbroadcasting I WebRtcAudioEffectsExternal: setNS(true) 2024-03-03 17:40:02.893 20868-20868 MLive_Socket com.mobven.rtcbroadcasting I Original Size: 1440 x 3120 2024-03-03 17:40:02.893 20868-20868 MLive_Socket com.mobven.rtcbroadcasting I Scaled Size: 411 x 891 2024-03-03 17:40:02.893 20868-21018 org.webrtc.Logging com.mobven.rtcbroadcasting I PeerConnectionFactory: onSignalingThreadReady 2024-03-03 17:40:02.893 20868-20868 MLive_Socket com.mobven.rtcbroadcasting I Original Size: 1440 x 3120 2024-03-03 17:40:02.893 20868-20868 MLive_Socket com.mobven.rtcbroadcasting I Scaled Size: 411 x 891 2024-03-03 17:40:02.893 20868-21017 org.webrtc.Logging com.mobven.rtcbroadcasting I PeerConnectionFactory: onWorkerThreadReady 2024-03-03 17:40:02.893 20868-21013 org.webrtc.Logging com.mobven.rtcbroadcasting I PeerConnectionFactory: onNetworkThreadReady 2024-03-03 17:40:02.894 20868-21023 org.webrtc.Logging com.mobven.rtcbroadcasting I EglBase14Impl: Using OpenGL ES version 2 2024-03-03 17:40:02.909 20868-20868 Compatibil...geReporter com.mobven.rtcbroadcasting D Compat 
change id reported: 269849258; UID 10642; state: ENABLED 2024-03-03 17:40:02.960 20868-21023 org.webrtc.Logging com.mobven.rtcbroadcasting I SurfaceTextureHelper: Setting listener to org.webrtc.ScreenCapturerAndroid@95595c5 2024-03-03 17:40:02.977 20868-21033 TrafficStats com.mobven.rtcbroadcasting D tagSocket(154) with statsTag=0xffffffff, statsUid=-1 2024-03-03 17:40:02.990 20868-21037 TrafficStats com.mobven.rtcbroadcasting D tagSocket(161) with statsTag=0xffffffff, statsUid=-1 2024-03-03 17:40:03.171 20868-21068 MLive_Socket com.mobven.rtcbroadcasting D broadcaster: 48c559227c7283c6 - 1440x3120 2024-03-03 17:40:12.150 20868-20883 rtcbroadcasting com.mobven.rtcbroadcasting I Background concurrent mark compact GC freed 87811(24MB) AllocSpace objects, 26(712KB) LOS objects, 84% free, 4557KB/28MB, paused 331us,6.245ms total 63.582ms 2024-03-03 17:40:12.263 20868-21109 MLive_Socket com.mobven.rtcbroadcasting D watcher call() called with: args = [[D_Q5xK933DMWAgW1AAD7]] 2024-03-03 17:40:12.323 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onRenegotiationNeeded() called 2024-03-03 17:40:12.328 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onCreateSuccess() called with: sessionDescription = [org.webrtc.SessionDescription@d9a7b58] 2024-03-03 17:40:12.346 20868-21018 ContentValues com.mobven.rtcbroadcasting D Change media description from: m=video 9 UDP/TLS/RTP/SAVPF 96 97 39 40 98 99 127 103 104 105 106 107 108 to m=video 9 UDP/TLS/RTP/SAVPF 96 97 39 40 98 99 127 103 104 105 106 107 108 2024-03-03 17:40:12.422 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onSignalingChange() called with: signalingState = [HAVE_LOCAL_OFFER] 2024-03-03 17:40:12.452 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onSetSuccess() called 2024-03-03 17:40:12.458 20868-21013 org.webrtc.Logging com.mobven.rtcbroadcasting I NetworkMonitor: Start monitoring with native observer -5476376641349339248 fieldTrialsString: 2024-03-03 17:40:12.514 20868-21013 org.webrtc.Logging com.mobven.rtcbroadcasting W NetworkMonitorAutoDetect: Unable to obtain permission to request a cellular network. 
2024-03-03 17:40:12.535 20868-21112 org.webrtc.Logging com.mobven.rtcbroadcasting I NetworkMonitorAutoDetect: Network handle: 1210291507213 becomes available: 281 2024-03-03 17:40:12.538 20868-21112 org.webrtc.Logging com.mobven.rtcbroadcasting I NetworkMonitorAutoDetect: handle: 1210291507213 capabilities changed: [ Transports: WIFI Capabilities: NOT_METERED&INTERNET&NOT_RESTRICTED&TRUSTED&NOT_VPN&VALIDATED&NOT_ROAMING&FOREGROUND&NOT_CONGESTED&NOT_SUSPENDED&NOT_VCN_MANAGED LinkUpBandwidth>=15648Kbps LinkDnBandwidth>=20989Kbps TransportInfo: <SSID: <unknown ssid>, BSSID: 02:00:00:00:00:00, MAC: 02:00:00:00:00:00, IP: /192.168.1.104, Security type: 2, Supplicant state: COMPLETED, Wi-Fi standard: 4, RSSI: -64, Link speed: 86Mbps, Tx Link speed: 86Mbps, Max Supported Tx Link speed: 300Mbps, Rx Link speed: 130Mbps, Max Supported Rx Link speed: 300Mbps, Frequency: 2427MHz, Net ID: -1, Metered hint: false, score: 60, isUsable: true, CarrierMerged: false, SubscriptionId: -1, IsPrimary: -1, Trusted: true, Restricted: false, Ephemeral: false, OEM paid: false, OEM private: false, OSU AP: false, FQDN: <none>, Provider friendly name: <none>, Requesting package name: <none><none>MLO Information: , Is TID-To-Link negotiation supported by the AP: false, AP MLD Address: <none>, AP MLO Link Id: <none>, AP MLO Affiliated links: <none>> SignalStrength: -64 UnderlyingNetworks: Null] 2024-03-03 17:40:12.539 20868-21112 org.webrtc.Logging com.mobven.rtcbroadcasting I NetworkMonitorAutoDetect: handle: 1210291507213 link properties changed 2024-03-03 17:40:12.579 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D emitOffer() called with: message = [{"type":"offer","sdp":"v=0\r\no=- 3359927949129100775 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE 0\r\na=extmap-allow-mixed\r\na=msid-semantic: WMS ARDAMS\r\nm=video 9 UDP\/TLS\/RTP\/SAVPF 96 97 39 40 98 99 127 103 104 105 106 107 108\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:TMl8\r\na=ice-pwd:TXj5C1K8rU3vwc4PJg4I7VE6\r\na=ice-options:trickle renomination\r\na=fingerprint:sha-256 E8:AD:5D:1E:63:C8:DE:77:75:1A:6F:3A:7D:A6:C8:05:0A:D9:75:50:03:EE:D8:7B:6B:B8:80:36:0A:57:4A:93\r\na=setup:actpass\r\na=mid:0\r\na=extmap:1 urn:ietf:params:rtp-hdrext:toffset\r\na=extmap:2 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/abs-send-time\r\na=extmap:3 urn:3gpp:video-orientation\r\na=extmap:4 http:\/\/www.ietf.org\/id\/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:5 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/playout-delay\r\na=extmap:6 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/video-content-type\r\na=extmap:7 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/video-timing\r\na=extmap:8 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/color-space\r\na=extmap:9 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=extmap:10 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id\r\na=extmap:11 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id\r\na=sendrecv\r\na=msid:ARDAMS ARDAMSv0\r\na=rtcp-mux\r\na=rtcp-rsize\r\na=rtpmap:96 VP8\/90000\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtpmap:97 rtx\/90000\r\na=fmtp:97 apt=96\r\na=rtpmap:39 AV1\/90000\r\na=rtcp-fb:39 goog-remb\r\na=rtcp-fb:39 transport-cc\r\na=rtcp-fb:39 ccm fir\r\na=rtcp-fb:39 nack\r\na=rtcp-fb:39 nack pli\r\na=rtpmap:40 rtx\/90000\r\na=fmtp:40 apt=39\r\na=rtpmap:98 VP9\/90000\r\na=rtcp-fb:98 goog-remb\r\na=rtcp-fb:98 transport-cc\r\na=rtcp-fb:98 ccm fir\r\na=rtcp-fb:98 nack\r\na=rtcp-fb:98 
nack pli\r\na=fmtp:98 profile-id=0\r\na=rtpmap:99 rtx\/90000\r\na=fmtp:99 apt=98\r\na=rtpmap:127 H264\/90000\r\na=rtcp-fb:127 goog-remb\r\na=rtcp-fb:127 transport-cc\r\na=rtcp-fb:127 ccm fir\r\na=rtcp-fb:127 nack\r\na=rtcp-fb:127 nack pli\r\na=fmtp:127 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f\r\na=rtpmap:103 rtx\/90000\r\na=fmtp:103 apt=127\r\na=rtpmap:104 H265\/90000\r\na=rtcp-fb:104 goog-remb\r\na=rtcp-fb:104 transport-cc\r\na=rtcp-fb:104 ccm fir\r\na=rtcp-fb:104 nack\r\na=rtcp-fb:104 nack pli\r\na=rtpmap:105 rtx\/90000\r\na=fmtp:105 apt=104\r\na=rtpmap:106 red\/90000\r\na=rtpmap:107 rtx\/90000\r\na=fmtp:107 apt=106\r\na=rtpmap:108 ulpfec\/90000\r\na=ssrc-group:FID 922775563 3496131644\r\na=ssrc:922775563 cname:HH0Za2G5z7W+Vx56\r\na=ssrc:922775563 msid:ARDAMS ARDAMSv0\r\na=ssrc:3496131644 cname:HH0Za2G5z7W+Vx56\r\na=ssrc:3496131644 msid:ARDAMS ARDAMSv0\r\n"}] 2024-03-03 17:40:12.580 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onIceGatheringChange() called with: iceGatheringState = [GATHERING] 2024-03-03 17:40:12.676 20868-21167 MLive_Socket com.mobven.rtcbroadcasting D GetSizes Called 2024-03-03 17:40:12.744 20868-21168 MLive_Socket com.mobven.rtcbroadcasting D answer call() called with: args = [[D_Q5xK933DMWAgW1AAD7, {"type":"answer","sdp":"v=0\r\no=- 7009070077714946209 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE 0\r\na=extmap-allow-mixed\r\na=msid-semantic: WMS\r\nm=video 9 UDP\/TLS\/RTP\/SAVPF 96 97 39 40 98 99 127 103 106 107 108\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:cEKU\r\na=ice-pwd:VY7Lj9ip5lmJhhmUl\/+4wWi0\r\na=ice-options:trickle\r\na=fingerprint:sha-256 C4:09:C9:13:76:E1:34:4B:44:0A:A0:DC:F1:BF:D5:C9:C6:98:32:E1:6F:C4:20:24:EA:B9:B5:0D:D9:08:AB:49\r\na=setup:active\r\na=mid:0\r\na=extmap:1 urn:ietf:params:rtp-hdrext:toffset\r\na=extmap:2 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/abs-send-time\r\na=extmap:3 urn:3gpp:video-orientation\r\na=extmap:4 http:\/\/www.ietf.org\/id\/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:5 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/playout-delay\r\na=extmap:6 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/video-content-type\r\na=extmap:7 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/video-timing\r\na=extmap:8 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/color-space\r\na=extmap:9 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=extmap:10 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id\r\na=extmap:11 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id\r\na=recvonly\r\na=rtcp-mux\r\na=rtcp-rsize\r\na=rtpmap:96 VP8\/90000\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtpmap:97 rtx\/90000\r\na=fmtp:97 apt=96\r\na=rtpmap:39 AV1\/90000\r\na=rtcp-fb:39 goog-remb\r\na=rtcp-fb:39 transport-cc\r\na=rtcp-fb:39 ccm fir\r\na=rtcp-fb:39 nack\r\na=rtcp-fb:39 nack pli\r\na=rtpmap:40 rtx\/90000\r\na=fmtp:40 apt=39\r\na=rtpmap:98 VP9\/90000\r\na=rtcp-fb:98 goog-remb\r\na=rtcp-fb:98 transport-cc\r\na=rtcp-fb:98 ccm fir\r\na=rtcp-fb:98 nack\r\na=rtcp-fb:98 nack pli\r\na=fmtp:98 profile-id=0\r\na=rtpmap:99 rtx\/90000\r\na=fmtp:99 apt=98\r\na=rtpmap:127 H264\/90000\r\na=rtcp-fb:127 goog-remb\r\na=rtcp-fb:127 transport-cc\r\na=rtcp-fb:127 ccm fir\r\na=rtcp-fb:127 nack\r\na=rtcp-fb:127 nack pli\r\na=fmtp:127 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f\r\na=rtpmap:103 rtx\/90000\r\na=fmtp:103 apt=127\r\na=rtpmap:106 red\/90000\r\na=rtpmap:107 
rtx\/90000\r\na=fmtp:107 apt=106\r\na=rtpmap:108 ulpfec\/90000\r\n"}]] 2024-03-03 17:40:12.752 20868-21168 ContentValues com.mobven.rtcbroadcasting D Change media description from: m=video 9 UDP/TLS/RTP/SAVPF 96 97 39 40 98 99 127 103 106 107 108 to m=video 9 UDP/TLS/RTP/SAVPF 96 97 39 40 98 99 127 103 106 107 108 2024-03-03 17:40:12.759 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onSignalingChange() called with: signalingState = [STABLE] 2024-03-03 17:40:12.763 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onIceConnectionChange() called with: iceConnectionState = [CHECKING] 2024-03-03 17:40:12.763 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onSetSuccess() called 2024-03-03 17:40:12.860 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onIceCandidate() called with: iceCandidate = [0:0:candidate:4240047502 1 udp 41885695 37.139.6.247 32848 typ relay raddr 0.0.0.0 rport 0 generation 0 ufrag TMl8 network-id 4 network-cost 10:turn:momentumv2.mobven.com:3478?transport=udp:UNKNOWN] 2024-03-03 17:40:12.861 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D emitIceCandidate() called with: message = [{"sdpMLineIndex":0,"sdpMid":"0","candidate":"candidate:4240047502 1 udp 41885695 37.139.6.247 32848 typ relay raddr 0.0.0.0 rport 0 generation 0 ufrag TMl8 network-id 4 network-cost 10"}] 2024-03-03 17:40:12.939 20868-21172 MLive_Socket com.mobven.rtcbroadcasting D candidate call() called with: args = [[D_Q5xK933DMWAgW1AAD7, {"candidate":"candidate:1549432407 1 udp 33562623 37.139.6.247 32803 typ relay raddr 0.0.0.0 rport 0 generation 0 ufrag cEKU network-cost 999","sdpMid":"0","sdpMLineIndex":0,"usernameFragment":"cEKU"}]] 2024-03-03 17:40:13.230 20868-21018 MLive_Socket com.mobven.rtcbroadcasting D onIceConnectionChange() called with: iceConnectionState = [CONNECTED]
