Crowd-Sourced Subjective Assessment of Adaptive Bitrate Algorithms in Low-Latency MPEG-DASH Streaming
Abstract
Video-centric applications have seen significant growth in recent years, with HTTP Adaptive Streaming (HAS) becoming a widely adopted method for video delivery. Low-latency (LL) adaptive bitrate (ABR) algorithms have recently been proposed to reduce end-to-end delay in HTTP adaptive streaming. This study investigates whether low-latency ABR (LL-ABR) algorithms, in their effort to reduce delay, also compromise video quality. To this end, we present both objective and subjective evaluations of user experience with traditional DASH and low-latency ABR algorithms. The study employs crowdsourcing to evaluate user-perceived video quality in low-latency MPEG-DASH streaming, with a particular focus on the impact of short segment durations. We also investigate the extent to which quantitative QoE metrics correspond to the subjective evaluation results. Results show that the Dynamic algorithm outperforms the low-latency algorithms, achieving higher stability and perceptual quality. Among the low-latency methods, LoL+ demonstrates superior QoE compared to L2A-LL, which tends to sacrifice visual consistency for latency gains. The findings emphasize the importance of integrating subjective evaluation into the design of ABR algorithms and highlight the need for user-centric, perceptually aware optimization strategies in low-latency streaming systems. Our results also show that subjective scores do not always align with objective performance metrics: viewers are sensitive to complex or high-motion content, where maintaining a consistent user experience remains challenging despite favorable objective metrics.