Web Video Needs a Real-time Broadcast Solution

Blog | 4 min read | Aug 3, 2018 | JW Player


Perspectives on the digital video world from JW Player’s SVP of Technology

First, let’s clarify the difference between live and real-time video delivery.

Live video is streamed to viewers as an event happens, as opposed to on-demand video, which is prerecorded and streamed to viewers whenever they choose.

HLS and DASH, the two leading streaming protocols on the web, can operate in a “live” mode, but they are not real-time protocols. Both require a certain amount of video data to be buffered in the player to guard against rebuffering and other playback glitches, and those buffers create latency. In the case of Apple’s HLS, Apple’s own guidelines call for the player to buffer at least three segments of six seconds each, effectively putting each viewer 20 seconds or more behind “real” time.
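To put rough numbers on that buffering requirement, here is a quick sketch. The segment figures come from Apple’s guidance as described above; the two-second encoding-and-delivery overhead is my own illustrative assumption, not a published number.

```python
# Back-of-the-envelope HLS "live" latency estimate (illustrative only).
segment_duration = 6.0     # seconds per segment, per Apple's guidance
buffered_segments = 3      # minimum segments the player must buffer
encode_and_delivery = 2.0  # assumed overhead for encoding, packaging, CDN

player_buffer = segment_duration * buffered_segments  # 18 seconds of buffer
total_latency = player_buffer + encode_and_delivery   # ~20 seconds behind live

print(f"Player buffer: {player_buffer:.0f}s; viewer lag: ~{total_latency:.0f}s")
```

Compare that to the millisecond-scale latency a real-time protocol targets, and the gap is three orders of magnitude.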


Real-time live video protocols, by comparison, are designed for extremely low latency (the gap in time between when an event happens and when the viewer sees it). Their latency is so low that it is measured in milliseconds rather than seconds.


The Legacy of Flash and RTMP

For many years, if you wanted to deliver real-time video to web browsers, there was only one way to do it: Flash.


But Flash is a dying technology, and RTMP (the protocol Flash uses for real-time video) is not supported in any web browser. Moreover, it is unlikely ever to be supported in browsers, given its reputation as outdated technology, Adobe’s strange “open specification” RTMP license, and Adobe’s choice to exclude the encrypted variant of the protocol from the spec, which is a nonstarter as the web moves to encrypted transports by default.


This effectively means that there is no real-time video protocol for the web. That is a step backward for the platform, because it makes formerly viable use cases like (truly) live sports, breaking news, and gaming impossible. For example, imagine that you run a website that has been happily streaming horse races over Flash and RTMP for the past ten years. Very soon, when Flash is gone, you will no longer be able to stream those races in true real time, which has ramifications. Whatever your opinion of horse racing (or Flash), this kind of regression in functionality is bad for the web.


Current Proposed Solutions Can’t Get Us There

Several technical solutions for real-time video without Flash have been proposed, including HTTP chunked encoding, WebRTC, and WebSockets. Patrick Dubois at Zender.tv has written a great overview of them.


All of these approaches are very creative, but they are essentially hacks of existing technologies that were not designed for low-latency video streaming. As such, they run into problems of scalability and lack of interoperability among vendors.


The BBC is doing some very interesting work with HTTP Server Push, but it is still in the proof-of-concept phase and would require new web APIs to work natively in browsers.


Is a New Protocol the Answer?

The Secure Reliable Transport (SRT) protocol began as a collaboration between Wowza and Haivision and has grown into the SRT Alliance. SRT is a true real-time protocol and therefore a contender to replace RTMP.


The trouble is, SRT is not supported natively in any web browser. To watch an SRT stream, viewers must install dedicated SRT client software, which is far from ideal in today’s plugin-less web. And what if we pushed Apple, Google, and Mozilla to implement SRT? They are very unlikely to do so, because SRT is not a true standard, and it would take significant effort to get buy-in and deployment of SRT throughout the delivery ecosystem (servers, CDNs, and so on).


WebRTC to the Rescue (Maybe)

Of all the options, I think WebRTC is the most promising. It has been supported in all the major browsers for years, has throngs of active developers, and has been standardized (well, mostly) by the W3C and IETF.


Unfortunately, the current version of the WebRTC spec is not designed for one-to-many use cases. WebRTC as it exists today requires every peer (i.e., device) in a session to hold a network connection to every other peer. This makes perfect sense for WebRTC’s intended use case of real-time communications (video chat, conferencing, etc.), but for one-to-many, broadcast-style scenarios in which you need to deliver millions of streams, it simply doesn’t scale.
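To see why a full mesh can’t carry a broadcast, consider a back-of-the-envelope sketch (illustrative Python, not part of any WebRTC API): with every peer connected to every other peer, the number of links grows quadratically, and the broadcaster alone would need an uplink to every single viewer.

```python
def mesh_connections(peers: int) -> int:
    # In a full mesh, each of n peers connects to the other n - 1 peers;
    # counting each link once gives n * (n - 1) / 2 connections.
    return peers * (peers - 1) // 2

for n in (2, 10, 100, 1_000_000):
    print(f"{n:>9,} peers -> {mesh_connections(n):,} connections")
```

A four-person video chat needs only 6 links, but a million-viewer broadcast would need roughly half a trillion, which is why broadcast delivery relies on server-side fan-out rather than peer meshes.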


However, there may be an opportunity to add “broadcast” functionality to the next version of the WebRTC spec, which is under development at the W3C and IETF. I hope to have more to report on those efforts in the coming months.


John Luther is SVP of Technology at JW Player.

