I’m looking for a way to implement a video encoder using the web browser. YouTube and Facebook already allow you to go live directly from the browser. I’m wondering: how do they do that?
There are a couple of solutions I’ve researched:
- Using WebSockets: encode the video in the web browser (using the MediaRecorder API) and push the encoded video to a server to be broadcast.
- Using WebRTC: the web browser acts as a WebRTC peer, and a server acts as the other end, receiving the stream and re-broadcasting (transcoding) it by other means (RTMP, HLS).
Is there any other technology that YouTube and Facebook use to implement this? Or do they also use one of these approaches?
Thanks
2 Answers
Correct, you’ve hit on two ways to do this. (Note that for the MediaRecorder method, you can use any other method to get the data to the server. Web Sockets is one way… so is a regular HTTP PUT of segments. Or, you could even use a data channel of a WebRTC connection to the server.)
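A minimal browser-side sketch of the MediaRecorder approach, to make the idea concrete. The `wss://example.com/ingest` endpoint, the candidate codec list, and the bitrate are placeholders, not anything YouTube or Facebook actually uses:

```javascript
// Pick the first container/codec string the browser's recorder supports.
function pickMimeType(candidates, isSupported) {
  return candidates.find(isSupported) || '';
}

const CANDIDATES = [
  'video/webm;codecs=vp9,opus',
  'video/webm;codecs=vp8,opus',
  'video/webm',
];

async function startStreaming() {
  // Capture camera + microphone.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  const mimeType = pickMimeType(CANDIDATES, (t) =>
    MediaRecorder.isTypeSupported(t)
  );

  // Hypothetical ingest endpoint on your own server.
  const ws = new WebSocket('wss://example.com/ingest');
  ws.binaryType = 'arraybuffer';

  const recorder = new MediaRecorder(stream, {
    mimeType,
    videoBitsPerSecond: 2_500_000, // placeholder target bitrate
  });

  // Ship each encoded chunk to the server as it becomes available.
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(e.data);
    }
  };

  ws.onopen = () => recorder.start(1000); // emit a chunk roughly every second
}
```

The same `ondataavailable` handler could just as easily do an HTTP PUT per segment, or write into a WebRTC data channel, as noted above.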
Pretty much everyone uses the WebRTC method, as there are some nice built-in benefits: the browser handles the encoding natively, and WebRTC’s congestion control adapts the bitrate to network conditions on the fly, keeping latency low.
The downsides of the WebRTC method: the encoder favors low latency over quality, so the stream can look worse than a MediaRecorder capture at the same bitrate, and the server side is considerably more complex to build.
If you go the WebRTC route, consider GStreamer. If you want to go the WebSocket route, I’ve written a proxy to receive the data and send it off to FFmpeg to be copied over to RTMP. You can find it here: https://github.com/fbsamples/Canvas-Streaming-Example
WebRTCHacks has a “how does YouTube use WebRTC” post here which examines some of the technical details of their implementation.
In addition, one of their engineers gave a talk at WebRTC Boston describing the system, which is available on YouTube.