WebRTC: Support FS MCU connects to SRS by WHIP for SIP clients. #3625
-
FS only supports unbundled audio and video, which means we have to transport audio and video on dedicated ports, for example, send audio on port 8000/UDP and video on port 8001/UDP.
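For illustration, this is roughly what the difference looks like at the SDP level (ports, payload types, and mid values below are made up). Unbundled, which is what FS expects, with audio and video on dedicated ports:

```sdp
m=audio 8000 UDP/TLS/RTP/SAVPF 111
a=mid:0
m=video 8001 UDP/TLS/RTP/SAVPF 96
a=mid:1
```

Bundled, which is what browsers and SRS normally negotiate, with both m-lines sharing one transport:

```sdp
a=group:BUNDLE 0 1
m=audio 9 UDP/TLS/RTP/SAVPF 111
a=mid:0
m=video 9 UDP/TLS/RTP/SAVPF 96
a=mid:1
```

In a real bundled offer the m-line port is a placeholder (often 9) and the actual transport comes from the ICE candidates.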
-
Thanks @winlinvip. Here are some updates: The original plan was to make it work in XSwitch first and then port it to open source FreeSWITCH. XSwitch is a commercial product implemented on top of FreeSWITCH; FreeSWITCH is an open source soft switch and MCU. The idea to implement it in XSwitch first is that XSwitch has a built-in web UI (named XUI), and to support that, a lot of patches and goodies were added, such as an embedded web server, which also powers the WHIP and WHEP APIs in XSwitch. Actually, XSwitch also "invented" a WHXP protocol. To make a long story short, XSwitch has more goodies than FreeSWITCH, so it is quicker and easier to start from it.

As mentioned by winlinvip, neither XSwitch nor FreeSWITCH supports bundled RTP originally, so we had to patch it to make it work. While it works, the patch is not ideal and probably not easy to get merged into the upstream FreeSWITCH repo, but we'll open source it later so everyone can try it. In addition to a lot of patches, a new module and a new endpoint were also added.

The latest stable release already has it, but a new dev version will be released soon which adds SRS hook handling: when someone pushes a new RTC stream into SRS, XSwitch is notified by the hook and automatically pulls that stream into an XSwitch video conference, or MCU. Read https://docs.xswitch.cn/dev/howto/xswitch-srs/ if you can't wait to try it; sorry, it's Chinese only at the moment. Leave a comment if you have questions, and XSwitch also has its own support pages: Chinese and International. We will probably have an open source branch and module next month, vote to make it sooner :). Also don't forget you need the latest SRS, see #3591 for more information.
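For readers who want to wire up something similar themselves, the SRS side of that hook is the standard HTTP callback configuration. A minimal sketch (the callback URL below is a placeholder; the actual XSwitch handler and endpoint are not public yet):

```nginx
# srs.conf: ask SRS to POST a callback when a stream is published/unpublished.
# XSwitch (or any service) can handle this callback and pull the new stream
# into a conference via WHIP.
vhost __defaultVhost__ {
    http_hooks {
        enabled         on;
        on_publish      http://xswitch.example.com:8080/hooks/srs;
        on_unpublish    http://xswitch.example.com:8080/hooks/srs;
    }
}
```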
-
Hope the following diagrams help.
-
If you have SIP clients joining a meeting, how do they communicate with WebRTC clients?
Because SIP clients only support 1 video and 1 audio stream (some might support 1 extra screen-share stream), you should use an MCU to merge all the streams in a room.
FS, or FreeSWITCH, is an MCU for SIP clients that also supports WebRTC clients, so you can use FS for this.
Sometimes the vast majority of rooms have no SIP clients and only a small group of rooms needs to support them. In this situation, an SFU is the better solution, because an MCU requires a huge amount of CPU for encoding.
If you have only one SIP client and the others are WebRTC clients such as Chrome browsers, you can also use FS as a SIP-to-WebRTC proxy that connects to SRS like a WebRTC client.
Below is the full architecture:
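The original post attaches a diagram here; as a rough text sketch of the proxy setup just described:

```
SIP clients <--SIP/RTP--> FS (MCU, SIP-to-WebRTC proxy) <--WHIP push/pull--> SRS <--WebRTC--> Chrome A/B/C
```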
To do this, FS should support pulling the WebRTC stream from SRS by the WHIP protocol; please see Unity: Player. I think the workflow should be this (a curl sketch of the WHIP exchange follows the list below):
Besides this solution, SRS could also forward or push the WebRTC stream to FS by WHIP; the workflow should be:
a. Chrome A pushes the stream to SRS and then FS receives the callback.
b. FS pulls the stream from SRS using recvonly WHIP for Chrome A.
c. Just like Chrome A, FS also pulls the streams for Chrome B and C.
1.1. FS should have a mixer that sends the stream to SRS using sendonly WHIP. This SRS need not be the same as the previous one; it can be another SRS.
1.2. Chrome A/B/C can pull this mixed stream, or they can pull streams from each other, depending on the user's strategy.
2.1 FS should also be able to send an RTMP stream, which is the live stream for interactive broadcasting.
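To make the WHIP legs of this workflow concrete, here is a minimal curl sketch against SRS's WebRTC HTTP API (default port 1985). The host, app, and stream names are placeholders, the endpoint paths are those used by recent SRS releases, and FS/XSwitch would do the equivalent signaling natively rather than via curl:

```bash
# Publish (sendonly), as in step 1.1: POST an SDP offer, SRS replies with the SDP answer.
curl -sS -X POST 'http://srs.example.com:1985/rtc/v1/whip/?app=live&stream=mixed' \
     -H 'Content-Type: application/sdp' \
     --data-binary @offer-sendonly.sdp

# Play (recvonly), as in steps a/b/c: pull Chrome A's stream from SRS.
# Recent SRS exposes this as /rtc/v1/whip-play/; later versions also expose /rtc/v1/whep/.
curl -sS -X POST 'http://srs.example.com:1985/rtc/v1/whip-play/?app=live&stream=chrome-a' \
     -H 'Content-Type: application/sdp' \
     --data-binary @offer-recvonly.sdp
```

Per the WHIP/WHEP drafts, the response body is the SDP answer, and the session can be torn down with an HTTP DELETE on the resource URL returned in the Location header.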