stasel / webrtc
Unofficial distribution of up-to-date WebRTC framework binaries for iOS and macOS
License: Other
Does this library support H265?
Hello!
Is there any way to build an XCFramework for iOS with static linking? I tried to do it myself using build.sh, but it didn't help: the result was still dynamically linked.
Hi, @stasel
I'm trying to run my app using the M95 release on my MacBook with an M1 CPU, and I always get this:
Undefined symbols for architecture arm64:
"_OBJC_CLASS_$_RTCMTLVideoView"
Or this if I'm using Rosetta:
Undefined symbols for architecture x86_64:
"_OBJC_CLASS_$_RTCEAGLVideoView"
I'm using this code to create the renderer:
#if arch(arm64)
let renderer: RTCMTLVideoView = RTCMTLVideoView()
renderer.videoContentMode = contentMode
#else
let renderer: RTCEAGLVideoView = RTCEAGLVideoView()
#endif
My app works fine on iOS, in Simulators, and on my Mac when it runs as "Designed for iPad", but I have no luck running it with Mac Catalyst. What am I doing wrong? Or is it just an M1 issue?
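A hedged sketch of one thing worth checking: under Mac Catalyst, `arch(arm64)` is also true on Apple silicon, so the `#if arch(arm64)` branch picks `RTCMTLVideoView` even if the Catalyst slice of this binary does not export that class. Distinguishing Catalyst explicitly at compile time avoids that; the Catalyst class name below is an assumption, not confirmed for this build, and class names are returned as strings to keep the sketch self-contained:

```swift
// Sketch only: decide which renderer class to use at compile time.
// arch(arm64) alone cannot distinguish an M1 Mac Catalyst build from an
// arm64 iOS device, which may be why the Metal view symbol is missing
// from the Catalyst slice. In a real app you would instantiate the views
// instead of returning their names.
func preferredRendererClassName() -> String {
    #if targetEnvironment(macCatalyst)
    // Hypothetical: whichever renderer the Catalyst slice of the binary exports.
    return "RTCMTLNSVideoView"
    #elseif arch(arm64)
    return "RTCMTLVideoView"
    #else
    return "RTCEAGLVideoView"
    #endif
}
```

Whether the Catalyst slice ships any video view at all is exactly what the undefined-symbol error calls into question, so this only helps if some renderer class is actually present there.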
Hello. Our project has multiple targets. The Add Package UI allows adding to a single target, but it's straightforward to add packages to other targets later, except for this package. I'm not sure what's different.
Here's our project and packages:
When I initially add the package, it gets added to the app target. For the other targets, we just add the packages through Frameworks, Libraries, and Embedded Content in the General tab for each target:
When you click the Add button, all the other packages show up, but not WebRTC:
One possible clue: I looked through all the Package.swift files, and the only significant difference is that yours is the only one with .binaryTarget in the targets array.
Note that it seems possible to add the package through the Add Other... button, but when you choose Add Package Dependency... there, it's the original Add Package UI. I'm reluctant to go that way because I think we'd end up with another copy of WebRTC, which is not what we want.
Hi. I am trying to install a manually downloaded WebRTC. I create an empty project, add the xcframework to Frameworks, Libraries, and Embedded Content, and just run the empty project. I get these errors:
dyld[3164]: Library not loaded: @rpath/WebRTC.framework/WebRTC
Referenced from: /private/var/containers/Bundle/Application/D226FD5F-C2E7-432E-A62C-8E8B2EC6EA7E/WebRTCTest.app/WebRTC dynamic test
Reason: tried: '/private/var/containers/Bundle/Application/D226FD5F-C2E7-432E-A62C-8E8B2EC6EA7E/WebRTCTest.app/Frameworks/WebRTC.framework/WebRTC' (no such file), '/private/var/containers/Bundle/Application/D226FD5F-C2E7-432E-A62C-8E8B2EC6EA7E/WebRTCTest.app/Frameworks/WebRTC.framework/WebRTC' (no such file), '/System/Library/Frameworks/WebRTC.framework/WebRTC' (no such file)
Library not loaded: @rpath/WebRTC.framework/WebRTC
Referenced from: /private/var/containers/Bundle/Application/D226FD5F-C2E7-432E-A62C-8E8B2EC6EA7E/WebRTCTest.app/WebRTCTest
Reason: tried: '/private/var/containers/Bundle/Application/D226FD5F-C2E7-432E-A62C-8E8B2EC6EA7E/WebRTCTest.app/Frameworks/WebRTC.framework/WebRTC' (no such file), '/private/var/containers/Bundle/Application/D226FD5F-C2E7-432E-A62C-8E8B2EC6EA7E/WebRTCTest.app/Frameworks/WebRTC.framework/WebRTC' (no such file), '/System/Library/Frameworks/WebRTC.framework/WebRTC' (no such file)
Do I need to do something else that's not written in the instructions?
Update:
I changed the embedding setting to Embed & Sign and the project works. But when I try to add WebRTC.framework to my own framework, it does not work: same issue.
Would Swift's C++ interop make it possible to use the WebRTC source code directly in an Xcode project instead of a compiled binary? If this works, I imagine it would result in better type intelligence in Xcode.
Thanks for the great work. Could you add tvOS support to the build script?
Hey Stasel,
I am very thankful for this implementation of WebRTC on iOS. I am currently trying to embed your WebRTC iOS demo app as a view controller in my super-app. I have refactored the WebRTCClient initialization into my view controller, along with the SignalingClient and WebSocketProvider. I have literally copy-pasted your code and just renamed a few files and classes because they conflicted with my other WebSockets. Do you have any idea what I am doing wrong here?
It crashes because of Fatal error: init(coder:) has not been implemented
import UIKit
import WebRTC
class waitingViewController: UIViewController {
private let signalClient: SignalingClient
private var webRTCClient: WebRTCClient = WebRTCClient(iceServers: Config.default.webRTCIceServers)
let webSocketProvider: WebSocketProvider = WorkoutSocket(url: Config.default.signalingServerUrl)
@IBOutlet weak var videoView: UIView!
@IBAction func StartTalkButton(_ sender: Any) {
}
private var signalingConnected: Bool = false {
didSet {
DispatchQueue.main.async {
if self.signalingConnected {
print("Connected")
// self.signalingStatusLabel?.text = "Connected"
// self.signalingStatusLabel?.textColor = UIColor.green
}
else {
print("Disconnected")
// self.signalingStatusLabel?.text = "Not connected"
// self.signalingStatusLabel?.textColor = UIColor.red
}
}
}
}
private var hasLocalSdp: Bool = false {
didSet {
// DispatchQueue.main.async {
// self.localSdpStatusLabel?.text = self.hasLocalSdp ? "✅" : "❌"
// }
}
}
private var localCandidateCount: Int = 0 {
didSet {
// DispatchQueue.main.async {
// self.localCandidatesLabel?.text = "\(self.localCandidateCount)"
// }
}
}
private var hasRemoteSdp: Bool = false {
didSet {
// DispatchQueue.main.async {
// self.remoteSdpStatusLabel?.text = self.hasRemoteSdp ? "✅" : "❌"
// }
}
}
private var remoteCandidateCount: Int = 0 {
didSet {
// DispatchQueue.main.async {
// self.remoteCandidatesLabel?.text = "\(self.remoteCandidateCount)"
// }
}
}
private var speakerOn: Bool = false {
didSet {
// let title = "Speaker: \(self.speakerOn ? "On" : "Off" )"
// self.speakerButton?.setTitle(title, for: .normal)
}
}
private var mute: Bool = false {
didSet {
// let title = "Mute: \(self.mute ? "on" : "off")"
// self.muteButton?.setTitle(title, for: .normal)
}
}
init(signalClient: SignalingClient, webRTCClient: WebRTCClient) {
self.signalClient = signalClient
self.webRTCClient = webRTCClient
super.init(nibName: String(describing: waitingViewController.self), bundle: Bundle.main)
}
@available(*, unavailable)
required init?(coder aDecoder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}
override func viewDidLoad() {
super.viewDidLoad()
self.title = "WebRTC Demo"
self.signalingConnected = false
self.hasLocalSdp = false
self.hasRemoteSdp = false
self.localCandidateCount = 0
self.remoteCandidateCount = 0
self.speakerOn = false
self.webRTCClient.delegate = self
self.signalClient.delegate = self
self.signalClient.connect()
}
}
extension waitingViewController: SignalClientDelegate {
func signalClient(_ signalClient: SignalingClient, didReceiveRemoteSdp sdp: RTCSessionDescription) {
print("Received remote sdp")
self.webRTCClient.set(remoteSdp: sdp) { (error) in
self.hasRemoteSdp = true
}
}
func signalClientDidConnect(_ signalClient: SignalingClient) {
self.signalingConnected = true
}
func signalClientDidDisconnect(_ signalClient: SignalingClient) {
self.signalingConnected = false
}
func signalClient(_ signalClient: SignalingClient, didReceiveCandidate candidate: RTCIceCandidate) {
print("Received remote candidate")
self.remoteCandidateCount += 1
self.webRTCClient.set(remoteCandidate: candidate)
}
}
extension waitingViewController: WebRTCClientDelegate {
func webRTCClient(_ client: WebRTCClient, didDiscoverLocalCandidate candidate: RTCIceCandidate) {
print("discovered local candidate")
self.localCandidateCount += 1
self.signalClient.send(candidate: candidate)
}
func webRTCClient(_ client: WebRTCClient, didChangeConnectionState state: RTCIceConnectionState) {
let textColor: UIColor
switch state {
case .connected, .completed:
textColor = .green
case .disconnected:
textColor = .orange
case .failed, .closed:
textColor = .red
case .new, .checking, .count:
textColor = .black
@unknown default:
textColor = .black
}
DispatchQueue.main.async {
// self.webRTCStatusLabel?.text = state.description.capitalized
// self.webRTCStatusLabel?.textColor = textColor
}
}
func webRTCClient(_ client: WebRTCClient, didReceiveData data: Data) {
DispatchQueue.main.async {
let message = String(data: data, encoding: .utf8) ?? "(Binary: \(data.count) bytes)"
let alert = UIAlertController(title: "Message from WebRTC", message: message, preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "OK", style: .cancel, handler: nil))
self.present(alert, animated: true, completion: nil)
}
}
}
When I try to connect to a room with more than 4 participants, the whole application hangs in setLocalDescription. The more connections, the faster it hangs.
Hello! First of all thank you for maintaining this repo.
I have a question about the build script. I've tried to build it on my own, but unfortunately with no luck. This is what I ran:
IOS_64_BIT=true BUILD_VP9=true BITCODE=true sh build.sh
Can you provide some instructions on how to use the script to correctly build the iOS framework with bitcode enabled?
Thanks!
Hi,
Thank you very much for this work. But can you explain to me why M94 is the last version to support 32-bit?
Thx,
Jeremie.
Can anyone add watchOS support (for watchOS 9)? CallKit support as well.
👋
I think this might be an SPM issue, but still, I wonder if you know of a way to stop SPM from re-downloading the WebRTC binaries each time project.pbxproj changes (it seems to happen even when only its last-modified date changes), which is every time I merge/rebase branches. Maybe there is some kind of check that could be done at the SPM level without downloading the binaries?
I've tried pinning the package to a commit ref, but it's still re-downloaded after pulling updates for a branch that includes project.pbxproj changes.
I've also asked that question on the forum.
Library not loaded: @rpath/WebRTC.framework/WebRTC
Referenced from: <E867BA81-A8CB-3F44-8389-6186947C4F5A> /***/Library/Developer/Xcode/DerivedData/RTCTest-egwdzqwcibwomgdqnalvxbmtovoh/Build/Products/Debug-iphonesimulator/RTCTestTests.xctest/RTCTestTests
Reason: tried: '/***/Library/Developer/Xcode/DerivedData/RTCTest-egwdzqwcibwomgdqnalvxbmtovoh/Build/Products/Debug-iphonesimulator/WebRTC.framework/WebRTC' (code signature in <4C4C441D-5555-3144-A11C-D7D3A9440C26> '/***/Library/Developer/Xcode/DerivedData/RTCTest-egwdzqwcibwomgdqnalvxbmtovoh/Build/Products/Debug-iphonesimulator/WebRTC.framework/WebRTC' not valid for use in process: Trying to load an unsigned library)
Here is the sample project
https://github.com/bofeizhu/RTCTest
M1 Chip, Xcode 14.1, Run test with iPhone simulator
WebRTC-lib: 122.0.0
Xcode: 15.2
iPhone 15 Pro Max with iOS 17.3.1
Installed via CocoaPods. The app crashes on launch from Xcode on the iPhone.
The Xcode console prints:
dyld[5585]: Symbol not found: _OBJC_CLASS_$_RTCEAGLVideoView
Referenced from: <F1B19E73-4258-3DFD-9052-291E2C1D51ED> /private/var/containers/Bundle/Application/3772A116-EDB7-423A-AF20-0CD42A5B6ED4/XXX.app/Frameworks/MVWebRTCInterface.framework/MVWebRTCInterface
Expected in: <4C4C4465-5555-3144-A195-0511A1886791> /private/var/containers/Bundle/Application/3772A116-EDB7-423A-AF20-0CD42A5B6ED4/XXX.app/Frameworks/WebRTC.framework/WebRTC
Hi,
I'm trying to link with WebRTC when building a Unit Test Bundle. The build succeeds, but when the Unit Test Bundle launches, it immediately fails with this error on my M1 Mac.
NOTE: non-M1 Macs do not have this issue, so I'm hoping there is a build configuration/setting that fixes this.
Assertions: System: Failed to load the test bundle.
(0x0109): Library not loaded: @rpath/WebRTC.framework/WebRTC
Reason: tried: '/.../Debug-iphonesimulator/WebRTC.framework/WebRTC'
(code signature in <> '/.../Debug-iphonesimulator/WebRTC.framework/WebRTC' not valid for use in process: Trying to load an unsigned library),
'/.../Debug-iphonesimulator/PackageFrameworks/WebRTC.framework/WebRTC' (errno=2),
...
Hey there, I know this is just an unofficial distribution repository, but hopefully somebody might be able to help me.
I'm trying to set up a perfect negotiation technique for establishing a connection. To achieve that, I need one of the users to roll back a local offer and accept a remote offer. I use this API:
peerConnection.setLocalDescription(RTCSessionDescription(type: .rollback, sdp: RTCSessionDescription.string(for: .rollback))) {
self.peerConnection.setRemoteDescription(remoteSdp)...
}
After setting the remote SDP in the completion handler above, I get this error: Failed to set remote offer sdp: Called in wrong state: have-local-offer. In the rollback completion handler, the state of the peerConnection still remains have-local-offer. For some reason the rollback doesn't work, and users end up stuck in the have-local-offer state. Does anyone know how to roll back a local offer properly?
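For reference, a minimal pure-Swift model of the signaling-state transitions that rollback is supposed to perform. This models the W3C-specified behavior, not this binary's actual implementation; note that older WebRTC builds did not support rollback at all, which would explain the state never leaving have-local-offer:

```swift
// Simplified model of RTCPeerConnection signaling states. Per the spec,
// setLocalDescription(rollback) from have-local-offer must land back in
// stable, after which setRemoteDescription(offer) is legal again.
enum SignalingState { case stable, haveLocalOffer, haveRemoteOffer }
enum SdpType { case offer, answer, rollback }

func applyLocal(_ sdp: SdpType, in state: SignalingState) -> SignalingState? {
    switch (state, sdp) {
    case (.stable, .offer):            return .haveLocalOffer
    case (.haveLocalOffer, .rollback): return .stable   // the transition that is failing
    case (.haveRemoteOffer, .answer):  return .stable
    default:                           return nil       // invalid in this simplified model
    }
}

func applyRemote(_ sdp: SdpType, in state: SignalingState) -> SignalingState? {
    switch (state, sdp) {
    case (.stable, .offer):          return .haveRemoteOffer
    case (.haveLocalOffer, .answer): return .stable
    // A remote offer in have-local-offer yields "Called in wrong state".
    default:                         return nil
    }
}
```

If the real connection still reports have-local-offer after the rollback completion fires, it is worth checking whether the completion runs before the state change becomes observable, or whether the build in use predates rollback support.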
Hi, nice work.
Can you do a build with VP9 support?
Thanks
I don't think it should be far from building one for Apple Vision, as it should be similar to iOS. I am not asking for Apple Vision running iPad apps; I am asking for Apple Vision itself and its simulator. Please compile a framework for Apple Vision.
I recently switched from the old Google pod to a binary. I'm getting a rejection when trying to upload a build to the app store:
The app references non-public selectors in ... initWithURLStrings:, receiver, sdp, setIsAudioEnabled:, setIsEnabled:, videoSource
This seems absurd. They're all methods related to WebRTC in RTCAudioSession, RTCIceServer, RTCPeerConnectionFactory etc. I can't imagine why they would have worked previously but not now.
Has anyone else encountered this?
I am using a custom video capturer that creates RTCVideoFrames from a CGImage converted to a CVPixelBuffer (used to initialize the RTCVideoFrame):
func processSampleBuffer(_ pixelBuffer: CVPixelBuffer) {
    let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
    let timeStampSeconds = CACurrentMediaTime()
    let timeStampNs = lroundf(Float(timeStampSeconds * Double(NSEC_PER_SEC)))
    let videoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: RTCVideoRotation._0, timeStampNs: Int64(timeStampNs))
    if let videoCapturer = videoCapturer {
        videoCapturer.delegate?.capturer(videoCapturer, didCapture: videoFrame)
    }
}
After a few seconds (it might happen when the video resolution changes to 640x360, but I'm not sure), I get this crash, which is hard for me to debug.
Any idea?
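One hedged guess worth ruling out: `lroundf(Float(...))` routes a nanosecond-scale value through 32-bit Float, which carries only about 24 bits of precision, so the frame timestamps come out corrupted and can jump non-monotonically. A self-contained sketch of the precision loss and a Double/Int64 alternative:

```swift
import Foundation

let nsPerSec = 1_000_000_000.0

// Full-precision path: multiply in Double, round straight into Int64.
func timeStampNs(fromSeconds seconds: Double) -> Int64 {
    return Int64((seconds * nsPerSec).rounded())
}

// Example value standing in for a CACurrentMediaTime() reading.
let t = 1234.567891234
let precise = timeStampNs(fromSeconds: t)            // 1234567891234
// The original path: at this magnitude a Float step is ~131072 ns,
// so the rounded result lands on a different value.
let viaFloat = Int64(Float(t * nsPerSec).rounded())
```

Whether this is actually the cause of the crash is unconfirmed; it is simply the one numerical difference between this capturer and code that is known to work.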
By the way, very similar code works great in your iOS demo sample.
Do you have a macOS version of that sample?
Laurent
Hello, I'm trying to generate a framework using your build.sh script, but I keep getting the same error and I'm not sure how to solve it.
It seems that it manages to execute the gn gen command, but it then fails when building with Ninja. The error message ld64.lld: error: unable to find matching architecture in ../../third_party/llvm-build/Release+Asserts/lib/clang/14.0.0/lib/darwin/libclang_rt.iossim.a is the same one I got when using these commands to build the framework without the script.
This is the message I get:
Done. Made 1187 targets from 244 files in 1276ms
ninja: Entering directory `./out/ios-arm64-simulator'
[3437/3445] SOLINK obj/sdk/arm64/WebRTC obj/sdk/arm64/WebRTC.TOC
FAILED: obj/sdk/arm64/WebRTC obj/sdk/arm64/WebRTC.TOC
if [ ! -e "obj/sdk/arm64/WebRTC" -o ! -e "obj/sdk/arm64/WebRTC.TOC" ] || ../../../../../../../../Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/otool -l "obj/sdk/arm64/WebRTC" | grep -q LC_REEXPORT_DYLIB ; then TOOL_VERSION=1641494109 ../../build/toolchain/apple/linker_driver.py -Wcrl,strippath,../../../../../../../../Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/strip ../../third_party/llvm-build/Release+Asserts/bin/clang++ -shared -all_load -install_name @rpath/WebRTC.framework/WebRTC -Wl,-install_name,@rpath/WebRTC.framework/WebRTC -fuse-ld=lld -Wl,-fatal_warnings -Wl,--icf=all -Wl,--color-diagnostics -target arm64-apple-ios12.0-simulator -no-canonical-prefixes -Werror -Wl,-dead_strip -nostdlib++ -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk -Wl,-ObjC -Wl,-rpath,@executable_path/Frameworks -Wl,-rpath,@loader_path/Frameworks -L/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk/usr/lib/swift -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/swift/iphonesimulator -o "obj/sdk/arm64/WebRTC" "@obj/sdk/arm64/WebRTC.rsp" && { ../../../../../../../../Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/otool -l "obj/sdk/arm64/WebRTC" | grep LC_ID_DYLIB -A 5; ../../../../../../../../Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm -gPp "obj/sdk/arm64/WebRTC" | cut -f1-2 -d' ' | grep -v U$$; true; } > "obj/sdk/arm64/WebRTC.TOC"; else TOOL_VERSION=1641494109 ../../build/toolchain/apple/linker_driver.py -Wcrl,strippath,../../../../../../../../Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/strip ../../third_party/llvm-build/Release+Asserts/bin/clang++ -shared -all_load -install_name 
@rpath/WebRTC.framework/WebRTC -Wl,-install_name,@rpath/WebRTC.framework/WebRTC -fuse-ld=lld -Wl,-fatal_warnings -Wl,--icf=all -Wl,--color-diagnostics -target arm64-apple-ios12.0-simulator -no-canonical-prefixes -Werror -Wl,-dead_strip -nostdlib++ -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk -Wl,-ObjC -Wl,-rpath,@executable_path/Frameworks -Wl,-rpath,@loader_path/Frameworks -L/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk/usr/lib/swift -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/swift/iphonesimulator -o "obj/sdk/arm64/WebRTC" "@obj/sdk/arm64/WebRTC.rsp" && { ../../../../../../../../Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/otool -l "obj/sdk/arm64/WebRTC" | grep LC_ID_DYLIB -A 5; ../../../../../../../../Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm -gPp "obj/sdk/arm64/WebRTC" | cut -f1-2 -d' ' | grep -v U$$; true; } > "obj/sdk/arm64/WebRTC.tmp" && if ! cmp -s "obj/sdk/arm64/WebRTC.tmp" "obj/sdk/arm64/WebRTC.TOC"; then mv "obj/sdk/arm64/WebRTC.tmp" "obj/sdk/arm64/WebRTC.TOC" ; fi; fi
ld64.lld: error: unable to find matching architecture in ../../third_party/llvm-build/Release+Asserts/lib/clang/14.0.0/lib/darwin/libclang_rt.iossim.a
clang++: error: linker command failed with exit code 1 (use -v to see invocation)
Traceback (most recent call last):
File "../../build/toolchain/apple/linker_driver.py", line 291, in <module>
LinkerDriver(sys.argv).run()
File "../../build/toolchain/apple/linker_driver.py", line 116, in run
subprocess.check_call(compiler_driver_args, env=env)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 190, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['../../third_party/llvm-build/Release+Asserts/bin/clang++', '-shared', '-all_load', '-install_name', '@rpath/WebRTC.framework/WebRTC', '-Wl,-install_name,@rpath/WebRTC.framework/WebRTC', '-fuse-ld=lld', '-Wl,-fatal_warnings', '-Wl,--icf=all', '-Wl,--color-diagnostics', '-target', 'arm64-apple-ios12.0-simulator', '-no-canonical-prefixes', '-Werror', '-Wl,-dead_strip', '-nostdlib++', '-isysroot', '/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk', '-Wl,-ObjC', '-Wl,-rpath,@executable_path/Frameworks', '-Wl,-rpath,@loader_path/Frameworks', '-L/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk/usr/lib/swift', '-L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/swift/iphonesimulator', '-o', 'obj/sdk/arm64/WebRTC', '@obj/sdk/arm64/WebRTC.rsp']' returned non-zero exit status 1
ninja: build stopped: subcommand failed.
While the README states that Mac Catalyst is supported, it did not work for me.
The latest build script has the following comment on line 39:
"# Catalyst builds are not working properly yet."
https://github.com/stasel/WebRTC/blob/latest/scripts/build.sh
Before expanding on my issue, can someone please confirm whether Mac Catalyst is even supported?
I tried using Swift Package Manager to include the framework, and it turned out that this class (the one in the title) is not in scope.
Then I tried dragging in the precompiled framework that you built in another repository, and the result is the same.
Then I went to check the folder of header files, and I couldn't find "rtcaudiosession.h" there. Is this a mistake, or is it supposed to be like this?
iOS works fine.
Hi, thank you for your effort.
Currently, the Chromium dashboard milestone has hit M101. Any plan for adding support for it?
Also, is there any plan for adding automation for newer versions?
Best
Thanks for your great work!
Could you add support for tvOS?
Hi,
I'm using your repository to build an iOS WebRTC app that connects to FreeSWITCH mod_verto. Everything works great (audio and video calls with VP8), except for incoming calls with H264 in the invite, where I get the following error:
"Failed to set remote video description send parameters for m-section with mid='1'"
The received SDP:
v=0
o=FreeSWITCH 1647442427 1647442428 IN IP4 192.168.10.1
s=FreeSWITCH
c=IN IP4 192.168.10.1
t=0 0
a=msid-semantic: WMS L76yjFgAt4mYXs69IVG853RNRIrttj84
m=audio 10070 RTP/SAVPF 0 8 102 9
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:102 opus/48000/2
a=fmtp:102 useinbandfec=1; cbr=1; maxaveragebitrate=30000; maxplaybackrate=48000; ptime=20; minptime=10; maxptime=40
a=rtpmap:9 G722/8000
a=fingerprint:sha-256 35:47:ED:5C:EE:2F:BC:0F:6B:D0:56:DA:86:B8:C7:80:0B:97:02:51:3B:B3:BC:08:26:2D:80:DB:4B:F2:7B:08
a=setup:actpass
a=rtcp-mux
a=rtcp:10070 IN IP4 192.168.10.1
a=ssrc:1049788873 cname:g5tqAzRUu3JimFhV
a=ssrc:1049788873 msid:L76yjFgAt4mYXs69IVG853RNRIrttj84 a0
a=ssrc:1049788873 mslabel:L76yjFgAt4mYXs69IVG853RNRIrttj84
a=ssrc:1049788873 label:L76yjFgAt4mYXs69IVG853RNRIrttj84a0
a=ice-ufrag:qOTlHBZp9gIsqHnw
a=ice-pwd:CGN36uNqomyBRGm8k6jSx97w
a=candidate:3360592939 1 udp 2130706431 192.168.10.1 10070 typ host generation 0
a=candidate:3360592939 2 udp 2130706431 192.168.10.1 10070 typ host generation 0
a=silenceSupp:off - - - -
a=ptime:20
a=sendrecv
m=video 10072 RTP/SAVPF 103
a=rtpmap:103 H264/90000
a=fmtp:103 profile-level-id=640c1f
a=sendrecv
a=fingerprint:sha-256 35:47:ED:5C:EE:2F:BC:0F:6B:D0:56:DA:86:B8:C7:80:0B:97:02:51:3B:B3:BC:08:26:2D:80:DB:4B:F2:7B:08
a=setup:actpass
a=rtcp-mux
a=rtcp:10072 IN IP4 192.168.10.1
a=rtcp-fb:103 ccm fir
a=rtcp-fb:103 ccm tmmbr
a=rtcp-fb:103 nack
a=rtcp-fb:103 nack pli
a=ssrc:226084376 cname:g5tqAzRUu3JimFhV
a=ssrc:226084376 msid:L76yjFgAt4mYXs69IVG853RNRIrttj84 v0
a=ssrc:226084376 mslabel:L76yjFgAt4mYXs69IVG853RNRIrttj84
a=ssrc:226084376 label:L76yjFgAt4mYXs69IVG853RNRIrttj84v0
a=ice-ufrag:K0Dr2dnhdKdIj7oO
a=ice-pwd:cNHf9Vo2PUYA8qJIhc0agdhj
a=candidate:2575442282 1 udp 2130706431 192.168.10.1 10072 typ host generation 0
a=candidate:2575442282 2 udp 2130706430 192.168.10.1 10072 typ host generation 0
a=end-of-candidates
One could think that H264 is not supported... but outgoing calls work fine and I have H264 both ways. The SDP generated on the device is:
v=0
o=- 7773074587810951361 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE 0 1
a=extmap-allow-mixed
a=msid-semantic: WMS stream
m=audio 9 UDP/TLS/RTP/SAVPF 111 63 103 104 9 102 0 8 106 105 13 110 112 113 126
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:BOEf
a=ice-pwd:0vSMJJx/JsQtAQ216aD1DM2Q
a=ice-options:trickle renomination
a=fingerprint:sha-256 70:F1:18:56:E9:CD:09:CA:E6:25:06:98:60:4F:98:B4:0C:A0:A4:07:82:22:73:0E:22:26:1E:8D:15:A3:EC:2E
a=setup:actpass
a=mid:0
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=extmap:2 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:3 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01
a=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid
a=extmap:5 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id
a=extmap:6 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id
a=sendrecv
a=msid:stream audio0
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=rtcp-fb:111 transport-cc
a=fmtp:111 minptime=10;useinbandfec=1
a=rtpmap:63 red/48000/2
a=fmtp:63 111/111
a=rtpmap:103 ISAC/16000
a=rtpmap:104 ISAC/32000
a=rtpmap:9 G722/8000
a=rtpmap:102 ILBC/8000
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:106 CN/32000
a=rtpmap:105 CN/16000
a=rtpmap:13 CN/8000
a=rtpmap:110 telephone-event/48000
a=rtpmap:112 telephone-event/32000
a=rtpmap:113 telephone-event/16000
a=rtpmap:126 telephone-event/8000
a=ssrc:1128105847 cname:CpXfrIKQ8/xdoTeC
a=ssrc:1128105847 msid:stream audio0
a=ssrc:1128105847 mslabel:stream
a=ssrc:1128105847 label:audio0
m=video 9 UDP/TLS/RTP/SAVPF 96 97 98 99 100 101 127 124 35 36 123 122 125
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:BOEf
a=ice-pwd:0vSMJJx/JsQtAQ216aD1DM2Q
a=ice-options:trickle renomination
a=fingerprint:sha-256 70:F1:18:56:E9:CD:09:CA:E6:25:06:98:60:4F:98:B4:0C:A0:A4:07:82:22:73:0E:22:26:1E:8D:15:A3:EC:2E
a=setup:actpass
a=mid:1
a=extmap:14 urn:ietf:params:rtp-hdrext:toffset
a=extmap:2 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:13 urn:3gpp:video-orientation
a=extmap:3 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01
a=extmap:12 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay
a=extmap:11 http://www.webrtc.org/experiments/rtp-hdrext/video-content-type
a=extmap:7 http://www.webrtc.org/experiments/rtp-hdrext/video-timing
a=extmap:8 http://www.webrtc.org/experiments/rtp-hdrext/color-space
a=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid
a=extmap:5 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id
a=extmap:6 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id
a=sendrecv
a=msid:stream video0
a=rtcp-mux
a=rtcp-rsize
a=rtpmap:96 H264/90000
a=rtcp-fb:96 goog-remb
a=rtcp-fb:96 transport-cc
a=rtcp-fb:96 ccm fir
a=rtcp-fb:96 nack
a=rtcp-fb:96 nack pli
a=fmtp:96 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=640c1f
a=rtpmap:97 rtx/90000
a=fmtp:97 apt=96
a=rtpmap:98 H264/90000
a=rtcp-fb:98 goog-remb
a=rtcp-fb:98 transport-cc
a=rtcp-fb:98 ccm fir
a=rtcp-fb:98 nack
a=rtcp-fb:98 nack pli
a=fmtp:98 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f
a=rtpmap:99 rtx/90000
a=fmtp:99 apt=98
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 goog-remb
a=rtcp-fb:100 transport-cc
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 nack pli
a=rtpmap:101 rtx/90000
a=fmtp:101 apt=100
a=rtpmap:127 VP9/90000
a=rtcp-fb:127 goog-remb
a=rtcp-fb:127 transport-cc
a=rtcp-fb:127 ccm fir
a=rtcp-fb:127 nack
a=rtcp-fb:127 nack pli
a=rtpmap:124 rtx/90000
a=fmtp:124 apt=127
a=rtpmap:35 AV1/90000
a=rtcp-fb:35 goog-remb
a=rtcp-fb:35 transport-cc
a=rtcp-fb:35 ccm fir
a=rtcp-fb:35 nack
a=rtcp-fb:35 nack pli
a=rtpmap:36 rtx/90000
a=fmtp:36 apt=35
a=rtpmap:123 red/90000
a=rtpmap:122 rtx/90000
a=fmtp:122 apt=123
a=rtpmap:125 ulpfec/90000
a=ssrc-group:FID 510134134 3309272499
a=ssrc:510134134 cname:CpXfrIKQ8/xdoTeC
a=ssrc:510134134 msid:stream video0
a=ssrc:510134134 mslabel:stream
a=ssrc:510134134 label:video0
a=ssrc:3309272499 cname:CpXfrIKQ8/xdoTeC
a=ssrc:3309272499 msid:stream video0
a=ssrc:3309272499 mslabel:stream
a=ssrc:3309272499 label:video0
Do you know if there is any option to process the incoming sdp?
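There is no built-in hook in this binary to transform an incoming SDP, but since the SDP is only a string, one hedged workaround is to rewrite it before constructing the RTCSessionDescription. A sketch with a hypothetical helper; the exact fmtp parameters to add are an assumption, based on WebRTC's own H264 entries carrying packetization-mode and level-asymmetry-allowed, which the FreeSWITCH offer above lacks:

```swift
import Foundation

// Hypothetical helper: append the H264 fmtp parameters WebRTC advertises
// to any fmtp line that carries only profile-level-id. Lines are split on
// CRLF, the SDP line terminator.
func patchH264Fmtp(in sdp: String) -> String {
    return sdp
        .components(separatedBy: "\r\n")
        .map { line -> String in
            guard line.hasPrefix("a=fmtp:"),
                  line.contains("profile-level-id"),
                  !line.contains("packetization-mode") else { return line }
            return line + ";level-asymmetry-allowed=1;packetization-mode=1"
        }
        .joined(separator: "\r\n")
}
```

Whether this fixes the FreeSWITCH interop is untested; comparing the failing m-section against a working outgoing one is the safer first step.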
Edited:
I checked the supported video codecs in the WebRTCClient:
let defaultRTC = RTCDefaultVideoDecoderFactory()
let defaultEncodeRTC = RTCDefaultVideoEncoderFactory()
for codec in defaultRTC.supportedCodecs() {
print("defaultRTC -> \(codec.name) => \(codec.parameters)")
}
for codec in defaultEncodeRTC.supportedCodecs() {
print("defaultEncodeRTC -> \(codec.name) => \(codec.parameters)")
}
and h264 is supported:
defaultRTC -> H264 => ["packetization-mode": "1", "profile-level-id": "640c1f", "level-asymmetry-allowed": "1"]
defaultRTC -> H264 => ["packetization-mode": "1", "profile-level-id": "42e01f", "level-asymmetry-allowed": "1"]
defaultRTC -> VP8 => [:]
defaultRTC -> VP9 => [:]
defaultRTC -> AV1 => [:]
defaultEncodeRTC -> H264 => ["packetization-mode": "1", "profile-level-id": "640c1f", "level-asymmetry-allowed": "1"]
defaultEncodeRTC -> H264 => ["packetization-mode": "1", "profile-level-id": "42e01f", "level-asymmetry-allowed": "1"]
defaultEncodeRTC -> VP8 => [:]
defaultEncodeRTC -> VP9 => [:]
defaultEncodeRTC -> AV1 => [:]
Thanks,
António
Could not resolve package graph. Cannot continue.
failed validating archive from 'https://github.com/stasel/WebRTC/releases/download/114.0.0/WebRTC-M114.xcframework.zip' which is required by binary target 'WebRTC': could not find executable for 'unzip'
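This particular failure is environmental rather than a package problem: SwiftPM shells out to the system unzip tool to expand binary-target archives, so it has to be on PATH (it can be missing in minimal CI images). A quick hedged check:

```shell
# Verify unzip is available to SwiftPM; print a hint if it is not.
if command -v unzip >/dev/null 2>&1; then
    echo "unzip found: $(command -v unzip)"
else
    echo "unzip not found; install it (e.g. apt-get install unzip) and retry the resolve"
fi
```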
Is Swift Playgrounds support possible in the future?
For adaptive streaming, many encoders default to the veryfast preset.
Is that the case here, and if so, how do I change it?
Hi @stasel
I added your framework as a dependency of my app using Xcode & SPM.
I'm using the latest 103 version and it works fine on iOS.
But when I tried to add Mac Catalyst support to my app and run the Mac Catalyst build on my Mac, I got the following error:
Do you have any idea why I get this error? 🤔
I tried to download the .xcframework and add it manually instead of using SPM, to be able to switch to Embed & Sign as mentioned in another issue on this repo, but it gave me the same error.
I also tried to take the .framework for Mac Catalyst from inside the .xcframework and add it manually, but no success there either.
If it doesn't work, do you know if there is any way to tell Xcode that I'd like to use the macOS framework when building for Catalyst? 🤔
Hi, I'm using the latest version of WebRTC. I can successfully make and receive calls, but while I'm in a call, sending DTMF to the other peer doesn't work. The other peer is a Janus gateway. I have the same app on Android and it works fine, but on the iOS side it doesn't.
This is my handler class that handles WebRTC events.
import Foundation
import WebRTC

class JanusPluginHandle {
    var description: String!
    var started = false
    var myStream: RTCMediaStream!
    var remoteStream: RTCMediaStream!
    var mySdp: RTCSessionDescription!
    var localSdp: RTCSessionDescription!
    var pc: RTCPeerConnection!
    var dataChannel: RTCDataChannel!
    var trickle = true
    var iceDone = false
    var sdpSent = false
    let VIDEO_TRACK_ID = "1929283"
    let AUDIO_TRACK_ID = "1928882"
    let LOCAL_MEDIA_ID = "1198181"
    var sessionFactory: RTCPeerConnectionFactory!
    let server: JanusServer!
    let plugin: JanusSupportedPluginPackages!
    var id: Int64!
    let callbacks: JanusPluginCallbacksDelegate!
    var isOffer = false
    let AUDIO_CODEC_PARAM_BITRATE = "maxaveragebitrate"
    let AUDIO_ECHO_CANCELLATION_CONSTRAINT = "googEchoCancellation"
    let AUDIO_AUTO_GAIN_CONTROL_CONSTRAINT = "googAutoGainControl"
    let AUDIO_HIGH_PASS_FILTER_CONSTRAINT = "googHighpassFilter"
    let AUDIO_NOISE_SUPPRESSION_CONSTRAINT = "googNoiseSuppression"
    let DTLS_SRTP_KEY_AGREEMENT_CONSTRAINT = "DtlsSrtpKeyAgreement"
    var audioConstraints: RTCMediaConstraints!
    var sdpMediaConstraints: RTCMediaConstraints!
    var audioSource: RTCAudioSource!
    var localAudioTrack: RTCAudioTrack!

    init(server: JanusServer, plugin: JanusSupportedPluginPackages, handle_id: Int64, callbacks: JanusPluginCallbacksDelegate) {
        self.server = server
        self.plugin = plugin
        self.id = handle_id
        self.callbacks = callbacks
        createMediaConstraintsInternal()
        sessionFactory = RTCPeerConnectionFactory()
    }

    private func createMediaConstraintsInternal() {
        let mandatory: [String: String] = [
            AUDIO_ECHO_CANCELLATION_CONSTRAINT: "false",
            AUDIO_AUTO_GAIN_CONTROL_CONSTRAINT: "false",
            AUDIO_HIGH_PASS_FILTER_CONSTRAINT: "false",
            AUDIO_NOISE_SUPPRESSION_CONSTRAINT: "false"
        ]
        audioConstraints = RTCMediaConstraints(mandatoryConstraints: mandatory, optionalConstraints: nil)
        let sdpMandatory: [String: String] = [
            "OfferToReceiveAudio": "true",
            "OfferToReceiveVideo": "false"
        ]
        let sdpOptional: [String: String] = [
            "DtlsSrtpKeyAgreement": "true"
        ]
        sdpMediaConstraints = RTCMediaConstraints(mandatoryConstraints: sdpMandatory, optionalConstraints: sdpOptional)
    }

    public func onAttached(obj: [String: Any]) {
        callbacks.onAttached(obj: obj)
    }

    public func onMessage(msg: String) {
        if let json = msg.toJson() {
            callbacks.onMessage(msg: json, jsep: nil)
        }
    }

    /**
     * EVENT messages coming from WebRTC.
     * @param msg
     * @param jsep
     */
    public func onMessage(msg: [String: Any], jsep: [String: Any]?) {
        callbacks.onMessage(msg: msg, jsep: jsep)
    }

    private func onLocalStream(stream: RTCMediaStream) {
        stream.audioTracks.first?.isEnabled = true
        localAudioTrack = stream.audioTracks.first
        callbacks.onLocalStream(stream: stream)
    }

    private func onRemoteStream(stream: RTCMediaStream) {
        callbacks.onRemoteStream(stream: stream)
    }

    public func onDataOpen(data: Data) {
        callbacks.onDataOpen(data: data)
    }

    public func onData(data: Data) {
        callbacks.onData(data: data)
    }

    public func onCleanup() {
        callbacks.onCleanup()
    }

    public func onDetached() {
        callbacks.onDetached()
    }

    public func onMedia() {
        callbacks.onMedia()
    }

    public func sendMessage(obj: PluginHandleSendMessageCallbacksDelegate) {
        server.sendMessage(type: TransactionType.plugin_handle_message, handle: id, callbacks: obj, plugin: plugin)
    }

    public func createOffer(webrtcCallbacks: PluginHandleWebRTCCallbacksDelegate) {
        isOffer = true
        prepareWebRtc(callbacks: webrtcCallbacks)
    }

    public func createAnswer(webrtcCallbacks: PluginHandleWebRTCCallbacksDelegate) {
        isOffer = false
        prepareWebRtc(callbacks: webrtcCallbacks)
    }

    private func prepareWebRtc(callbacks: PluginHandleWebRTCCallbacksDelegate) {
        if pc != nil {
            if let jsep = callbacks.getJsep() {
                let sdpString = jsep["sdp"] as! String
                let type = RTCSessionDescription.type(for: jsep["type"] as! String)
                let sdp = RTCSessionDescription(type: type, sdp: sdpString)
                pc.setRemoteDescription(sdp) { _ in }
            } else {
                createSdpInternal(callbacks: callbacks, isOffer: isOffer)
            }
        } else {
            trickle = callbacks.getTrickle() ?? false
            streamsDone(webRTCCallbacks: callbacks)
        }
    }

    private func streamsDone(webRTCCallbacks: PluginHandleWebRTCCallbacksDelegate) {
        let rtcConfig = RTCConfiguration()
        rtcConfig.iceServers = server.iceServers
        rtcConfig.bundlePolicy = .maxBundle
        rtcConfig.rtcpMuxPolicy = .require
        rtcConfig.continualGatheringPolicy = .gatherContinually
        // rtcConfig.sdpSemantics = .planB
        let source = sessionFactory.audioSource(with: audioConstraints)
        let audioTrack = sessionFactory.audioTrack(with: source, trackId: AUDIO_TRACK_ID)
        let stream = sessionFactory.mediaStream(withStreamId: LOCAL_MEDIA_ID)
        stream.addAudioTrack(audioTrack)
        myStream = stream
        onLocalStream(stream: stream)
        pc = sessionFactory.peerConnection(with: rtcConfig, constraints: audioConstraints, delegate: nil)
        if myStream != nil {
            pc.add(myStream)
        }
        if let obj = webRTCCallbacks.getJsep() {
            let sdp = obj["sdp"] as! String
            let type = RTCSessionDescription.type(for: obj["type"] as! String)
            let sessionDescription = RTCSessionDescription(type: type, sdp: sdp)
            print("STREAMS DONE: JSEP is not nil")
            pc.setRemoteDescription(sessionDescription) { _ in }
        } else {
            createSdpInternal(callbacks: webRTCCallbacks, isOffer: isOffer)
            print("STREAMS DONE: JSEP is nil")
        }
    }

    private func createSdpInternal(callbacks: PluginHandleWebRTCCallbacksDelegate, isOffer: Bool) {
        if isOffer {
            print("CREATE SDP OFFER")
            // pc.createOffer(WebRtcObserver(callbacks), sdpMediaConstraints);
            pc.offer(for: sdpMediaConstraints) { sdp, err in
                if err == nil, let sdp = sdp {
                    self.pc.setLocalDescription(sdp) { err in
                        if err == nil {
                            var obj: [String: Any] = [:]
                            obj["type"] = "offer"
                            obj["sdp"] = sdp.sdp
                            callbacks.onSuccess(obj: obj)
                        }
                    }
                }
            }
        } else {
            print("CREATE SDP ANSWER")
            pc.answer(for: sdpMediaConstraints) { sdp, err in
                if err == nil, let sdp = sdp {
                    self.pc.setLocalDescription(sdp) { err in
                        if err == nil {
                            var obj: [String: Any] = [:]
                            obj["type"] = "answer"
                            obj["sdp"] = sdp.sdp
                            callbacks.onSuccess(obj: obj)
                        }
                    }
                }
            }
        }
    }

    public func muteMicrophone(_ mute: Bool) {
    }

    public func handleRemoteJsep(webrtcCallbacks: PluginHandleWebRTCCallbacksDelegate) {
        if sessionFactory == nil {
            webrtcCallbacks.onCallbackError(error: "WebRtc PeerFactory is not initialized. Please call initializeMediaContext")
        }
        if let jsep = webrtcCallbacks.getJsep() {
            if pc == nil {
                callbacks.onCallbackError(error: "No peerconnection created, if this is an answer please use createAnswer")
            }
            let sdpString = jsep["sdp"] as? String
            let type: RTCSdpType = jsep["type"] as? String == "answer" ? .answer : .offer
            let sdp = RTCSessionDescription(type: type, sdp: sdpString!)
            pc.setRemoteDescription(sdp) { err in
                if let err = err {
                    print(err)
                }
            }
            print("HANDLE REMOTE JSEP")
        }
    }

    private func prepareRemoteSDP(webrtcCallbacks: PluginHandleWebRTCCallbacksDelegate, jsep: [String: Any]) {
        let sdpString = jsep["sdp"] as! String
        let type = RTCSessionDescription.type(for: jsep["type"] as! String)
        let sdp = RTCSessionDescription(type: type, sdp: sdpString)
        pc.setRemoteDescription(sdp) { err in
            if err == nil {
                //
            }
        }
    }

    public func hangUp() {
        remoteStream = nil
        myStream = nil
        if pc != nil && pc.signalingState != .closed {
            pc.close()
        }
        pc = nil
        started = false
        localSdp = nil
        dataChannel?.close()
        dataChannel = nil
        trickle = true
        iceDone = false
        sdpSent = false
        isOffer = false
        callbacks.onHangup()
    }

    public func disconnect() {
    }

    public func detach() {
        hangUp()
        let obj: [String: Any] = [:]
        server.sendMessage(msg: obj, type: JanusMessageType.detach, handle: id)
    }

    private func onLocalSdp(sdp: RTCSessionDescription, callbacks: PluginHandleWebRTCCallbacksDelegate) {
        print("ON LOCAL SDP")
        if pc != nil {
            if localSdp == nil {
                localSdp = sdp
                pc.setLocalDescription(sdp) { err in
                    if err == nil {
                        //
                    }
                }
            }
            if !iceDone && !trickle {
                print("ON LOCAL SDP: ICE not done and trickle disabled, waiting")
                return
            }
            print("ON LOCAL SDP: sdp success")
            sdpSent = true
            var obj: [String: Any] = [:]
            obj["sdp"] = localSdp.description
            obj["type"] = localSdp.type.rawValue
            callbacks.onSuccess(obj: obj)
        } else {
            print("ON LOCAL SDP: preparing WebRTC (I added this)")
            prepareWebRtc(callbacks: callbacks)
        }
    }

    private func sendTrickleCandidate(candidate: RTCIceCandidate?) {
        var message: [String: Any] = [:]
        var cand: [String: Any] = [:]
        if let candidate = candidate {
            cand["candidate"] = candidate.sdp
            cand["sdpMid"] = candidate.sdpMid
            cand["sdpMLineIndex"] = candidate.sdpMLineIndex
        } else {
            cand["completed"] = true
        }
        message["candidate"] = cand
        server.sendMessage(msg: message, type: JanusMessageType.trickle, handle: id)
    }

    private func sendSdp(callbacks: PluginHandleWebRTCCallbacksDelegate) {
        if localSdp != nil {
            localSdp = pc.localDescription
            if !sdpSent {
                sdpSent = true
                var obj: [String: Any] = [:]
                obj["sdp"] = localSdp.description
                obj["type"] = localSdp.type.rawValue
                callbacks.onSuccess(obj: obj)
            }
        }
    }

    public func insertDTMF(_ tone: String) {
        guard pc != nil else { return }
        if let dtmfSender = pc.senders.first?.dtmfSender {
            print(dtmfSender.canInsertDtmf)
            // Note: in the iOS API, duration and interToneGap are TimeInterval
            // values in seconds, not milliseconds as on Android; 200/70 here may
            // fall outside the allowed range and cause insertDtmf to fail.
            dtmfSender.insertDtmf(tone, duration: 200, interToneGap: 70)
        }
    }
}
My app uses the iOS side of this package for call events:
https://github.com/jonataslaw/flutter-incall-manager/tree/master/ios
and CallKit for background handling.
Any ideas will be appreciated.
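One thing worth checking for DTMF problems like the above: `insertDtmf` can only work if the negotiated SDP actually advertises an RFC 4733 `telephone-event` payload on the audio m-section, which Janus must include in its answer. A minimal, hypothetical pure-Swift helper to inspect the remote SDP while debugging might look like this (the function name and sample SDP are illustrative, not part of any API):

```swift
import Foundation

/// Returns true if an SDP blob negotiated an RFC 4733 DTMF payload
/// ("telephone-event") in one of its rtpmap lines. Hypothetical debugging
/// helper for when RTCDtmfSender.canInsertDtmf reports false.
func sdpSupportsTelephoneEvent(_ sdp: String) -> Bool {
    // Look for an rtpmap attribute advertising telephone-event, e.g.
    // "a=rtpmap:126 telephone-event/8000"
    return sdp
        .components(separatedBy: .newlines)
        .contains { line in
            line.hasPrefix("a=rtpmap:") && line.lowercased().contains("telephone-event")
        }
}

// Example answer fragment (illustrative only):
let answer = """
v=0
m=audio 9 UDP/TLS/RTP/SAVPF 111 126
a=rtpmap:111 opus/48000/2
a=rtpmap:126 telephone-event/8000
"""
print(sdpSupportsTelephoneEvent(answer)) // true
```

If this returns false for the answer Janus sends back, the problem is in the negotiation rather than in the sender, and forcing the tones in-band or adjusting the Janus configuration would be the place to look.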
Hi! This week I tried to wrap a custom audio engine and audio unit (AVAudioEngine and AVAudioUnit) to be compatible with LiveKit. Unfortunately, a single file in that wrapper, RTCAudioDevice.h, causes a conflict with the legacy WebRTC pod (the Specs for the Swift language, to be specific) and prevents using this repo together with LiveKit. It also turned out that rebuilding the framework under a custom name is very slow and more of a headache than tailoring RTCAudioDevice and the corresponding Objective-C file in WebRTC. Please look into this issue; if it wouldn't break the many repos that depend on this one, it might be worth changing the naming, or at least providing a moduleAlias in the package.
I want to implement a peer connection with audio only, but audio only starts working after video capture has started.
Does anyone know how to do this?
Can support be added for setting enable_dsyms=true
via an environment variable, and for adding the generated dSYMs to the generated frameworks?
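For reference, in a local build this is controlled through gn args; a sketch of the relevant args.gn fragment (arg names as used by the upstream WebRTC/Chromium build, worth verifying against the current tree):

```
# args.gn fragment (sketch, not the repo's official build config)
enable_dsyms = true       # emit dSYM bundles alongside the frameworks
enable_stripping = true   # keep release binaries stripped; symbols live in the dSYMs
```

Wiring an environment variable through build.sh would then just mean appending these args conditionally when invoking gn.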
Hello,
First of all, thank you for this distribution and work you've done.
According to the Xcode 14 beta release notes, bitcode will soon be deprecated and the App Store will no longer accept such submissions:
So I believe there won't be any need for bitcode support in the near future.
Since I updated my devices to iOS 17, I've had problems viewing video over WiFi via a STUN server. The video connection works well over a cell connection and TURN server, but the connection often drops after a few seconds over WiFi and STUN. I'm having trouble isolating the problem, and I can't reproduce it with the example app, but I'd be interested to hear if anyone else is experiencing this issue, since I've seen a few similar reports about networking issues related to WebRTC video and iOS 17.
ld: Undefined symbols:
rtc::GetIntFromJsonObject(Json::Value const&, std::__1::basic_string_view<char, std::__1::char_traits<char>>, int*), referenced from:
Conductor::OnMessageFromPeer(int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) in conductor.cc.o
rtc::GetStringFromJsonObject(Json::Value const&, std::__1::basic_string_view<char, std::__1::char_traits<char>>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>*), referenced from:
Conductor::OnMessageFromPeer(int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) in conductor.cc.o
Conductor::OnMessageFromPeer(int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) in conductor.cc.o
Conductor::OnMessageFromPeer(int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) in conductor.cc.o
Conductor::OnMessageFromPeer(int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) in conductor.cc.o
webrtc::test::VcmCapturer::Create(unsigned long, unsigned long, unsigned long, unsigned long), referenced from:
(anonymous namespace)::CapturerTrackSource::Create() in conductor.cc.o
is_debug = true
target_os = "mac"
target_cpu = "arm64"
is_clang = true
treat_warnings_as_errors = false
rtc_include_tests = false
rtc_use_h264 = true
is_component_build = false
use_custom_libcxx = false
rtc_enable_protobuf = false
use_rtti = true
proprietary_codecs = true
ffmpeg_branding = "Chrome"
use_custom_libcxx_for_host = false
When using M117 or M118, the project fails with this error:
ld: warning: Could not find or use auto-linked framework 'AudioUnit': framework 'AudioUnit' not found
ld: warning: Could not find or use auto-linked framework 'CoreAudioTypes': framework 'CoreAudioTypes' not found
ld: Undefined symbols:
_OBJC_CLASS_$_RTCEAGLVideoView, referenced from:
in SipCoreCenter.o
clang: error: linker command failed with exit code 1 (use -v to see invocation)
👋
I wonder if it would be possible to add an arm64 slice for the Mac target, to be able to use WebRTC in simulators on M1 Macs?
Thanks!
Hey there, is it possible to modify the build configs to expose VAD to Swift, e.g. from modules/audio_processing/vad/standalone_vad.h?
Do you have any plans to add a privacy manifest to the WebRTC framework?
WebRTC framework which supports audio buffers from a Broadcast Upload Extension
The problem and solution
A broadcast extension allows you to broadcast your screen from iOS. However, an extension is a restricted environment compared to a normal application, and it lacks the Audio Unit framework, which WebRTC's audio device module for iOS requires.
A broadcast extension can still receive application audio and microphone recordings via an RPBroadcastSampleHandler object. This fork fixes the audio device module for environments without the Audio Unit framework and adds interfaces to deliver data received via RPBroadcastSampleHandler: https://github.com/pixiv/webrtc/releases/tag/87.0.4280.142-pixiv0
Is there a way the @stasel WebRTC package could include this feature from the fork https://github.com/pixiv/webrtc/releases/tag/87.0.4280.142-pixiv0?
Hello @stasel, did you see this issue I opened in your other repo? stasel/WebRTC-iOS#142
Hi, I'm trying to use your builds of WebRTC for iOS, but every one of them crashes with the same issue. I have installed WebRTC from other GitHub repos and they work fine, but they include bitcode and don't seem to be as well maintained (still on M95). Any help appreciated.
Hello again,
I believe this is not related to this repository, but I'm opening this issue for reference. I was using version 96 in my application with no problems at all. After updating via SPM to version 102, I started receiving this error before the application even loads. I also tried it without SPM by downloading the zip file from https://github.com/stasel/WebRTC/releases/download/102.0.0/WebRTC-M102.xcframework.zip; the result was the same.
I don't have time to dig deeper, so I went back to version 96.
Here is the full error message in Xcode:
WebRTC
___lldb_unnamed_symbol12674$$WebRTC:
0x1070d5df8 <+0>: stp x20, x19, [sp, #-0x20]!
0x1070d5dfc <+4>: stp x29, x30, [sp, #0x10]
0x1070d5e00 <+8>: add x29, sp, #0x10
0x1070d5e04 <+12>: bl 0x1075bf71c ; symbol stub for: objc_autoreleasePoolPush
0x1070d5e08 <+16>: mov x19, x0
0x1070d5e0c <+20>: adrp x8, 1646
0x1070d5e10 <+24>: ldr x0, [x8, #0x418]
0x1070d5e14 <+28>: adrp x8, 1645
0x1070d5e18 <+32>: ldr x1, [x8, #0x28]
0x1070d5e1c <+36>: adrp x2, 1289
0x1070d5e20 <+40>: add x2, x2, #0xd7d ; kRTCAudioSessionLowComplexityIOBufferDuration + 86893
0x1070d5e24 <+44>: bl 0x1075bf7ac ; symbol stub for: objc_msgSend
0x1070d5e28 <+48>: mov x29, x29
0x1070d5e2c <+52>: bl 0x1075bf7f4 ; symbol stub for: objc_retainAutoreleasedReturnValue
0x1070d5e30 <+56>: adrp x8, 1736
0x1070d5e34 <+60>: str x0, [x8, #0x620]
0x1070d5e38 <+64>: mov w0, #0x3
0x1070d5e3c <+68>: bl 0x1070d5d24 ; ___lldb_unnamed_symbol12671$$WebRTC
0x1070d5e40 <+72>: mov x29, x29
0x1070d5e44 <+76>: bl 0x1075bf7f4 ; symbol stub for: objc_retainAutoreleasedReturnValue
0x1070d5e48 <+80>: adrp x8, 1622
0x1070d5e4c <+84>: add x8, x8, #0x820 ; @"640c1f"
0x1070d5e50 <+88>: cmp x0, #0x0
0x1070d5e54 <+92>: csel x8, x8, x0, eq
0x1070d5e58 <+96>: adrp x9, 1736
0x1070d5e5c <+100>: str x8, [x9, #0x628]
0x1070d5e60 <+104>: mov w0, #0x0
0x1070d5e64 <+108>: bl 0x1070d5d24 ; ___lldb_unnamed_symbol12671$$WebRTC
0x1070d5e68 <+112>: mov x29, x29
0x1070d5e6c <+116>: bl 0x1075bf7f4 ; symbol stub for: objc_retainAutoreleasedReturnValue
0x1070d5e70 <+120>: adrp x8, 1622
0x1070d5e74 <+124>: add x8, x8, #0x840 ; @"42e01f"
0x1070d5e78 <+128>: cmp x0, #0x0
0x1070d5e7c <+132>: csel x8, x8, x0, eq
0x1070d5e80 <+136>: nop
-> 0x1070d5e84 <+140>: str x8, [x9, #0x630]
0x1070d5e88 <+144>: mov x0, x19
0x1070d5e8c <+148>: ldp x29, x30, [sp, #0x10]
0x1070d5e90 <+152>: ldp x20, x19, [sp], #0x20
0x1070d5e94 <+156>: b 0x1075bf710 ; symbol stub for: objc_autoreleasePoolPop
0x106be9adc <+0>: pacibsp
0x106be9ae0 <+4>: sub sp, sp, #0x150
0x106be9ae4 <+8>: stp x22, x21, [sp, #0x120]
0x106be9ae8 <+12>: stp x20, x19, [sp, #0x130]
0x106be9aec <+16>: stp x29, x30, [sp, #0x140]
0x106be9af0 <+20>: add x29, sp, #0x140
0x106be9af4 <+24>: mov x20, x0
0x106be9af8 <+28>: stp xzr, xzr, [sp]
0x106be9afc <+32>: mov w0, #0x34
0x106be9b00 <+36>: movk w0, #0x1f07, lsl #16
0x106be9b04 <+40>: mov x1, #0x0
0x106be9b08 <+44>: mov x2, #0x0
0x106be9b0c <+48>: mov x3, #0x0
0x106be9b10 <+52>: mov x4, #0x0
0x106be9b14 <+56>: mov x5, #0x0
0x106be9b18 <+60>: mov x6, #0x0
0x106be9b1c <+64>: bl 0x106bd3614 ; dyld3::kdebug_trace_dyld_marker(unsigned int, dyld3::kt_arg, dyld3::kt_arg, dyld3::kt_arg, dyld3::kt_arg)
0x106be9b20 <+68>: adrp x19, -25
0x106be9b24 <+72>: add x19, x19, #0x0
0x106be9b28 <+76>: mov x0, x19
0x106be9b2c <+80>: bl 0x106bd3200 ; dyld3::MachOFile::hasChainedFixups() const
0x106be9b30 <+84>: tbz w0, #0x0, 0x106be9d10 ; <+564>
0x106be9b34 <+88>: sub x21, x29, #0x50
0x106be9b38 <+92>: stp xzr, x21, [x29, #-0x50]
0x106be9b3c <+96>: adrp x8, 53
0x106be9b40 <+100>: ldr d0, [x8, #0x418]
0x106be9b44 <+104>: stur d0, [x29, #-0x40]
0x106be9b48 <+108>: add x8, x21, #0x18
0x106be9b4c <+112>: adrp x16, 1
0x106be9b50 <+116>: add x16, x16, #0xaf4 ; __Block_byref_object_copy_
0x106be9b54 <+120>: pacia x16, x8
0x106be9b58 <+124>: stur x16, [x29, #-0x38]
0x106be9b5c <+128>: add x8, x21, #0x20
0x106be9b60 <+132>: adrp x16, 1
0x106be9b64 <+136>: add x16, x16, #0xb00 ; __Block_byref_object_dispose_
0x106be9b68 <+140>: pacia x16, x8
0x106be9b6c <+144>: stur x16, [x29, #-0x30]
0x106be9b70 <+148>: add x0, x21, #0x28
0x106be9b74 <+152>: mov w1, #0x0
0x106be9b78 <+156>: bl 0x106bd19b0 ; Diagnostics::Diagnostics(bool)
0x106be9b7c <+160>: ldur x8, [x29, #-0x48]
0x106be9b80 <+164>: add x1, x8, #0x28
0x106be9b84 <+168>: sub x8, x29, #0x88
0x106be9b88 <+172>: adrp x16, 83
0x106be9b8c <+176>: add x16, x16, #0x310 ; _NSConcreteStackBlock
0x106be9b90 <+180>: mov x17, x8
0x106be9b94 <+184>: movk x17, #0x6ae1, lsl #48
0x106be9b98 <+188>: pacda x16, x17
0x106be9b9c <+192>: stur x16, [x29, #-0x88]
0x106be9ba0 <+196>: adrp x9, 53
0x106be9ba4 <+200>: ldr d0, [x9, #0x420]
0x106be9ba8 <+204>: stur d0, [x29, #-0x80]
0x106be9bac <+208>: add x8, x8, #0x10
0x106be9bb0 <+212>: adrp x16, 1
0x106be9bb4 <+216>: add x16, x16, #0xb08 ; __�start_block_invoke
0x106be9bb8 <+220>: pacia x16, x8
0x106be9bbc <+224>: adrp x8, 63
0x106be9bc0 <+228>: add x8, x8, #0x1d0 ; __block_descriptor_tmp
0x106be9bc4 <+232>: stp x16, x8, [x29, #-0x78]
0x106be9bc8 <+236>: stp x21, x19, [x29, #-0x68]
0x106be9bcc <+240>: stur x19, [x29, #-0x58]
0x106be9bd0 <+244>: sub x3, x29, #0x88
0x106be9bd4 <+248>: mov x0, x19
0x106be9bd8 <+252>: mov x2, #0x0
0x106be9bdc <+256>: bl 0x106bd32b4 ; dyld3::MachOAnalyzer::withChainStarts(Diagnostics&, unsigned long long, void (dyld_chained_starts_in_image const*) block_pointer) const
0x106be9be0 <+260>: ldur x8, [x29, #-0x48]
0x106be9be4 <+264>: add x0, x8, #0x28
0x106be9be8 <+268>: bl 0x106bd7220 ; Diagnostics::assertNoError() const
0x106be9bec <+272>: bl 0x106be2990 ; mach_init
0x106be9bf0 <+276>: mov x0, x20
0x106be9bf4 <+280>: bl 0x106bec074 ; dyld4::KernelArgs::findApple() const
0x106be9bf8 <+284>: bl 0x106c052d4 ; __guard_setup
0x106be9bfc <+288>: mov x0, x20
0x106be9c00 <+292>: bl 0x106bec074 ; dyld4::KernelArgs::findApple() const
0x106be9c04 <+296>: bl 0x106c12604 ; _subsystem_init
0x106be9c08 <+300>: adrp x21, 63
0x106be9c0c <+304>: add x21, x21, #0x70 ; dyld4::sConfigBuffer
0x106be9c10 <+308>: adrp x2, 83
0x106be9c14 <+312>: add x2, x2, #0x300 ; dyld4::sSyscallDelegate
0x106be9c18 <+316>: mov x0, x21
0x106be9c1c <+320>: mov x1, x20
0x106be9c20 <+324>: bl 0x106bec130 ; dyld4::ProcessConfig::ProcessConfig(dyld4::KernelArgs const*, dyld4::SyscallDelegate&)
0x106be9c24 <+328>: add x8, sp, #0x90
0x106be9c28 <+332>: adrp x16, 83
0x106be9c2c <+336>: add x16, x16, #0x310 ; _NSConcreteStackBlock
0x106be9c30 <+340>: mov x17, x8
0x106be9c34 <+344>: movk x17, #0x6ae1, lsl #48
0x106be9c38 <+348>: pacda x16, x17
0x106be9c3c <+352>: str x16, [sp, #0x90]
0x106be9c40 <+356>: adrp x9, 53
0x106be9c44 <+360>: ldr d0, [x9, #0x428]
0x106be9c48 <+364>: str d0, [sp, #0x98]
0x106be9c4c <+368>: add x8, x8, #0x10
0x106be9c50 <+372>: adrp x16, 1
0x106be9c54 <+376>: add x16, x16, #0xb50 ; __�start_block_invoke.3
0x106be9c58 <+380>: pacia x16, x8
0x106be9c5c <+384>: adrp x8, 63
0x106be9c60 <+388>: add x8, x8, #0x200 ; __block_descriptor_tmp.5
0x106be9c64 <+392>: stp x16, x8, [sp, #0xa0]
0x106be9c68 <+396>: str x19, [sp, #0xb0]
0x106be9c6c <+400>: add x1, sp, #0x90
0x106be9c70 <+404>: mov x0, x19
0x106be9c74 <+408>: bl 0x106bd9ea0 ; dyld3::MachOFile::forEachSegment(void (dyld3::MachOFile::SegmentInfo const&, bool&) block_pointer) const
0x106be9c78 <+412>: movi.2d v0, #0000000000000000
0x106be9c7c <+416>: stp q0, q0, [sp, #0x10]
0x106be9c80 <+420>: adrp x8, 66
0x106be9c84 <+424>: add x8, x8, #0xbb8 ; _os_lock_type_unfair
0x106be9c88 <+428>: str x8, [sp, #0x30]
0x106be9c8c <+432>: adrp x8, 53
0x106be9c90 <+436>: ldr q1, [x8, #0x440]
0x106be9c94 <+440>: stur q1, [sp, #0x38]
0x106be9c98 <+444>: stur q0, [sp, #0x48]
0x106be9c9c <+448>: stur q0, [sp, #0x58]
0x106be9ca0 <+452>: stur q0, [sp, #0x68]
0x106be9ca4 <+456>: str xzr, [sp, #0x78]
0x106be9ca8 <+460>: mov w8, #0x1
0x106be9cac <+464>: str w8, [sp, #0x80]
0x106be9cb0 <+468>: add x1, sp, #0x10
0x106be9cb4 <+472>: mov x0, x21
0x106be9cb8 <+476>: bl 0x106be0878 ; dyld4::APIs::bootstrap(dyld4::ProcessConfig const&, dyld4::RuntimeLocks&)
0x106be9cbc <+480>: mov x19, x0
0x106be9cc0 <+484>: bl 0x106beab78 ; dyld4::prepare(dyld4::APIs&, dyld3::MachOAnalyzer const*)
Thread 1: EXC_BAD_ACCESS (code=1, address=0x631)
0x106be9cc4 <+488>: mov x20, x0
0x106be9cc8 <+492>: mov x0, x19
0x106be9ccc <+496>: bl 0x106be02c4 ; dyld4::RuntimeState::decWritable()
0x106be9cd0 <+500>: ldr x8, [x19, #0x8]
0x106be9cd4 <+504>: ldr w0, [x8, #0x40]
0x106be9cd8 <+508>: ldp x1, x2, [x8, #0x48]
0x106be9cdc <+512>: ldr x3, [x8, #0x58]
0x106be9ce0 <+516>: blraaz x20
0x106be9ce4 <+520>: mov x1, x0
0x106be9ce8 <+524>: ldr x0, [x19, #0x68]
0x106be9cec <+528>: ldr x16, [x0]
0x106be9cf0 <+532>: mov x17, x0
0x106be9cf4 <+536>: movk x17, #0x9abf, lsl #48
0x106be9cf8 <+540>: autda x16, x17
0x106be9cfc <+544>: ldr x8, [x16, #0x78]!
0x106be9d00 <+548>: mov x9, x16
0x106be9d04 <+552>: mov x17, x9
0x106be9d08 <+556>: movk x17, #0x3448, lsl #48
0x106be9d0c <+560>: blraa x8, x17
0x106be9d10 <+564>: adrp x0, 54
0x106be9d14 <+568>: add x0, x0, #0x320 ; "start"
0x106be9d18 <+572>: adrp x1, 54
0x106be9d1c <+576>: add x1, x1, #0x326 ; "dyldMain.cpp"
0x106be9d20 <+580>: adrp x3, 54
0x106be9d24 <+584>: add x3, x3, #0x333 ; "dyldMA->hasChainedFixups()"
0x106be9d28 <+588>: mov w2, #0x33a
0x106be9d2c <+592>: bl 0x106c1d85c ; __assert_rtn
Hi,
I tried to release a new version of the app I'm working on with the last build you made this weekend, which uses M107 of WebRTC and fixes the macOS Catalyst build following an issue I opened a few months ago.
Sadly, this build does not pass Apple's review with the following error:
ITMS-90338: Non-public API usage - The app references non-public selectors in Frameworks/WebRTC.framework/WebRTC: terminate. If method names in your source code match the private Apple APIs listed above, altering your method names will help prevent this app from being flagged in future submissions. In addition, note that one or more of the above APIs may be located in a static library that was included with your app. If so, they must be removed. For further information, visit the Technical Support Information at http://developer.apple.com/support/technical/
Hope this'll help :)
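For anyone hit by the same ITMS-90338 rejection, it can help to verify locally which selector literal App Review is matching before resubmitting. Selector names live as NUL-terminated C strings inside the binary, so a rough scan (a hypothetical helper, similar in spirit to running `strings` on the framework; the real check Apple performs inspects the `__objc_methname` section, so treat a hit here only as a hint) might look like this:

```swift
import Foundation

/// Rough check for a selector literal inside a framework binary.
/// Hypothetical debugging helper, not an App Review replica: selector
/// names are stored as NUL-terminated C strings, so we search for the
/// name followed by a NUL byte.
func binaryContainsSelector(at path: String, selector: String) -> Bool {
    guard let data = FileManager.default.contents(atPath: path),
          let needle = (selector + "\0").data(using: .utf8) else {
        return false
    }
    return data.range(of: needle) != nil
}

// Example (path is illustrative):
// binaryContainsSelector(at: "WebRTC.framework/WebRTC", selector: "terminate")
```

If the scan confirms the selector is present, renaming it in a local build of the framework is the workaround people usually report for this class of rejection.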
Hey, can you bundle the macOS version with the release too?
What is required to get this running with CocoaPods? I would love to use it in my app, but sadly I need it as a pod.
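Until official CocoaPods support exists, one common workaround is a small local podspec that wraps the released XCFramework zip via `vendored_frameworks`. A sketch (the pod name, version, deployment target, and license are placeholders to adjust; only the release URL pattern matches this repo's actual releases):

```
Pod::Spec.new do |s|
  s.name     = 'WebRTC-stasel'   # hypothetical local/private pod name
  s.version  = '102.0.0'         # pick the release you target
  s.summary  = 'Prebuilt WebRTC XCFramework'
  s.homepage = 'https://github.com/stasel/WebRTC'
  s.license  = { :type => 'BSD' }
  s.author   = 'stasel'
  s.source   = { :http => "https://github.com/stasel/WebRTC/releases/download/#{s.version}/WebRTC-M102.xcframework.zip" }
  s.ios.deployment_target = '13.0'
  s.vendored_frameworks = 'WebRTC.xcframework'
end
```

Pointing a `pod 'WebRTC-stasel', :podspec => '...'` entry at this file in your Podfile would then pull the binary in like any other vendored framework.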