Real-time and rate–distortion optimized video streaming with TCP

Authors:

Highlights:

Abstract

In this paper we explore the use of a new rate–distortion metric for optimizing real-time Internet video streaming with the transmission control protocol (TCP). We lay the groundwork by developing a simple model that characterizes the expected latency for packets sent with TCP-Reno. Subsequently, we develop an analytical model of the expected video distortion at the decoder with respect to the expected latency for TCP, the packetization mechanism, and the error-concealment method used at the decoder. By characterizing the protocol/channel pair more accurately, we obtain better estimates of the expected distortion and the available channel rate. This knowledge is exploited in the design of a new algorithm for rate–distortion optimized encoding mode selection for video streaming with TCP. Experimental results for real-time video streaming show PSNR improvements of about 2 dB over metrics that do not consider the behavior of the transport protocol.
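To make the rate–distortion trade-off described above concrete, the minimal sketch below combines a generic steady-state TCP-Reno throughput approximation (the widely used Padhye-style formula) with a Lagrangian mode-selection rule J = D + λ·R. The function names, candidate modes, and numeric values are illustrative assumptions only; the paper's actual latency and distortion models additionally account for packetization and decoder error concealment.

```python
import math

def tcp_reno_throughput(mss_bytes, rtt_s, loss_rate, rto_s=1.0):
    """Approximate steady-state TCP-Reno throughput (bytes/s) using the
    well-known Padhye-style formula. This is a stand-in for the paper's
    protocol/channel characterization, not the authors' exact model."""
    if loss_rate <= 0:
        return float("inf")
    p = loss_rate
    denom = (rtt_s * math.sqrt(2 * p / 3)
             + rto_s * min(1.0, 3 * math.sqrt(3 * p / 8)) * p * (1 + 32 * p ** 2))
    return mss_bytes / denom

def select_mode(candidates, lam):
    """Lagrangian R-D mode selection: pick the mode minimizing J = D + lam * R.
    Each candidate is (mode_name, expected_distortion, rate_bits); in the paper
    the expected distortion folds in the TCP latency and concealment models."""
    return min(candidates, key=lambda m: m[1] + lam * m[2])

# Hypothetical per-macroblock candidates: (mode, expected distortion, bits).
candidates = [("intra", 40.0, 1200), ("inter", 55.0, 400), ("skip", 90.0, 8)]
rate_bps = tcp_reno_throughput(mss_bytes=1460, rtt_s=0.08, loss_rate=0.01)
print("Estimated TCP-Reno rate: %.1f kB/s" % (rate_bps / 1e3))
print("Chosen mode:", select_mode(candidates, lam=0.05)[0])
```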

Keywords: Video streaming, TCP, Distortion model

Article history: Received 5 February 2006, Revised 30 December 2006, Accepted 8 January 2007, Available online 26 January 2007.

Article link: https://doi.org/10.1016/j.image.2007.01.001