HLS & MPEG-DASH Protocol Latency Explained
The world of live streaming has seen an ongoing debate about latency reduction. With the arrival of various low-latency streaming solutions, we want to take some time (ironically) to look at latency a bit more closely. While the discussion has revolved primarily around ultra-low-latency solutions employing mainly WebRTC and WebSocket technologies, the “traditional” HTTP-based streaming protocols are still used predominantly for live streaming use cases without any interactive (chat, betting etc.) or real-time (video calls) elements. This article discusses what latency is, where and why it is introduced and, most importantly, how CDN77 can help cut the latency of HLS and MPEG-DASH streams to under 10 seconds.
From a pure CDN perspective, we used to think of latency as the time it takes for a packet to travel from the origin server to its destination and back, the so-called round-trip time (RTT). Let's call this phenomenon network latency. Measured, ideally, in milliseconds, network latency depends on various factors such as the speed of light, physical distance, transmission rates, the network structures and peering policies of the autonomous systems involved and, last but not least, the network edge infrastructure, including our home and mobile internet connections.
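As a rough illustration, network latency can be approximated by timing a TCP handshake to a server. The sketch below is a minimal example for the sake of intuition (the hostname is a placeholder, not an actual CDN77 endpoint), not a replacement for proper measurement tools.

```python
import socket
import time

def tcp_handshake_rtt(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Approximate one round trip by timing a TCP connect (SYN -> SYN/ACK -> ACK)."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the connection is established once the handshake completes
    return (time.perf_counter() - start) * 1000  # milliseconds

# Placeholder hostname, for illustration only
print(f"approx. RTT: {tcp_handshake_rtt('example.com'):.1f} ms")
```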
Glass-to-glass latency
Despite its complexity, network latency is just one of the many components of the overall live streaming latency or, as it is often called, glass-to-glass latency. The latter term refers to the time from the moment an image is captured through the glass lens of a camera until it’s played back on the glass screen of your device.
To understand where and why latency is introduced, let's follow the video signal a bit more closely.
The glass-to-glass latency countdown begins the very moment the camera captures an image of our analog world and turns it into a continuous flow of digital data. At this initial stage, transferring the raw data requires large bandwidth capacities (often Gbit/s or more). The next natural step is therefore to compress the signal using codecs and scale the video bitrate down to values suitable for internet transfer. This process is known as encoding.
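For illustration, encoding and pushing a live source to an ingest point is commonly done with a tool such as ffmpeg. The sketch below is a hedged example only: the input file, bitrate values and RTMP ingest URL are placeholders, and real setups typically read from a capture device rather than a file.

```python
import subprocess

# Illustrative only: encode a local file with H.264/AAC and push it to a
# placeholder RTMP ingest URL. Bitrate and preset would be tuned to the
# content and the available uplink bandwidth.
cmd = [
    "ffmpeg",
    "-re", "-i", "input.mp4",           # read the input at its native frame rate
    "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
    "-b:v", "3000k",                     # scale the video bitrate down for internet transfer
    "-c:a", "aac", "-b:a", "128k",
    "-f", "flv",                         # RTMP carries FLV-packaged streams
    "rtmp://ingest.example.com/live/stream-key",
]
subprocess.run(cmd, check=True)
```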
Now that the stream can be transferred over IP, we need to think about how to deliver it to viewers effectively. Back in the day, the Flash-based RTMP protocol served this purpose. With Flash no longer supported by browsers and mobile devices, however, the HTTP-based workflow was introduced, and RTMP is now used primarily as a transport protocol between the encoder and the next unit of the live streaming process - the streaming server.
For HTTP-based protocols, the media server breaks the content into small file segments (also called chunks). Each of these segments contains a short interval of the content, typically ranging from 2 to 10 seconds. The segments are referenced by a manifest file, which contains the metadata mapping each time interval to a specific file segment.
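To make the segment/manifest relationship concrete, here is a minimal sketch that parses a simplified, made-up HLS media playlist and sums the advertised segment durations. Real manifests carry more tags, and MPEG-DASH uses an XML manifest (MPD) instead, but the mapping of time intervals to segment files is the same idea.

```python
# A simplified, made-up HLS media playlist: each #EXTINF line gives the
# duration (in seconds) of the segment file that follows it.
PLAYLIST = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:6.000,
segment120.ts
#EXTINF:6.000,
segment121.ts
#EXTINF:6.000,
segment122.ts
"""

def segment_durations(manifest: str) -> list[float]:
    """Extract the duration of each segment advertised in the playlist."""
    return [
        float(line.split(":", 1)[1].rstrip(","))
        for line in manifest.splitlines()
        if line.startswith("#EXTINF:")
    ]

durations = segment_durations(PLAYLIST)
print(f"{len(durations)} segments, {sum(durations):.1f} s of content in the playlist")
```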
At the same time, the server creates copies of the stream in different bitrates and resolutions (transcoding) so that the stream quality can adapt to each user's device and network performance. From there, the segmented stream travels to global CDN edges, where it gets cached and served to users through a player.
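The renditions produced by transcoding are typically advertised to the player through a master playlist. The sketch below generates one from a hypothetical bitrate ladder; the bandwidth figures, resolutions and URIs are illustrative assumptions, not values from any particular Streamflow setup.

```python
# Hypothetical ABR ladder: (bandwidth in bits/s, resolution, variant playlist URI)
LADDER = [
    (5_000_000, "1920x1080", "1080p/index.m3u8"),
    (3_000_000, "1280x720",  "720p/index.m3u8"),
    (1_500_000, "854x480",   "480p/index.m3u8"),
]

def master_playlist(ladder) -> str:
    """Build a minimal HLS master playlist listing each rendition."""
    lines = ["#EXTM3U"]
    for bandwidth, resolution, uri in ladder:
        lines.append(f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}")
        lines.append(uri)
    return "\n".join(lines) + "\n"

print(master_playlist(LADDER))
```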
Each of these units adds a certain amount of latency and widens the time gap between capture and playback. More specifically, it's a combination of the network latency (encoder > streaming server and CDN > player), the transcoding delay (~5 s) and the buffers that both the streaming server and the player build up in order to distribute and play back the stream seamlessly. The latter depend mainly on the chunk duration and may amount to as much as 10 seconds per unit. Altogether, the standard HTTP-based protocols offer a glass-to-glass latency of 25-40 seconds.
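To see how these contributions add up, here is a back-of-the-envelope latency budget in the spirit of the figures above. The individual values are illustrative assumptions, not measurements, and vary widely with encoder settings, segment duration and player configuration.

```python
# Illustrative glass-to-glass latency budget (seconds).
SEGMENT_DURATION = 6.0  # seconds per chunk

budget = {
    "capture + encoding":                1.0,
    "contribution (encoder -> server)":  0.5,
    "transcoding":                       5.0,
    "server-side segmentation buffer":   SEGMENT_DURATION,      # roughly one segment
    "CDN delivery (edge -> player)":     0.5,
    "player buffer":                     3 * SEGMENT_DURATION,  # players often hold ~3 segments
}

total = sum(budget.values())
for stage, seconds in budget.items():
    print(f"{stage:<36} {seconds:5.1f} s")
print(f"{'total (glass-to-glass)':<36} {total:5.1f} s")
```

With six-second chunks this toy budget lands at roughly 31 seconds, squarely within the 25-40 second range quoted above; shorter chunks and leaner player buffers are exactly the levers that low-latency variants pull.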
Low latency. High quality.
One might ask why HLS and MPEG-DASH have managed to stick around at a time when sub-second latency solutions are being introduced. The answer lies in their easy scalability, wide support across platforms and devices and playback availability without the need for proprietary applications (players etc.).
Simply put, the effort to bring a smooth and lag-free viewing experience to viewers has taken its toll on the overall latency. To challenge this paradigm, Streamflow by CDN77 has partnered with Bradmax Player. Together, we’ve gone through the entire process and created a live streaming solution that brings the glass-to-glass latency of HLS and MPEG-DASH streams below 10 seconds while maintaining uncompromised quality of video delivery.
If you’d like to test low-latency HTTP-based streaming powered by our 14 Tbps+ global CDN, do not hesitate to get in touch with us for your free trial.