Optimal rate and input format control for content and context adaptive video streaming

T. Ozcelebi, A.M. Tekalp, M.R. Civanlar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

3 Citations (Scopus)

Abstract

A novel dynamic programming based technique for optimal selection of input video format and compression rate for video streaming based on "relevancy" of the content and user context is presented. The technique uses context dependent content analysis to divide the input video into temporal segments. User selected relevance levels assigned to these segments are used in formulating a constrained optimization problem, which is solved using dynamic programming. The technique minimizes a weighted distortion measure and the initial waiting time for continuous playback under maximum acceptable distortion constraints. Spatial resolution and frame rate of input video and the DCT quantization parameters are used as optimization variables. The technique is applied to encoding of soccer videos using an H.264 (T. Wiegand et al., May 27, 2003) encoder. The improvements obtained over a standard H.264 implementation are demonstrated by experimental results.
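The abstract's core idea — choosing an encoding option per temporal segment so that relevance-weighted distortion is minimized under a resource constraint, solved by dynamic programming — can be illustrated with a minimal sketch. This is not the paper's algorithm: the segment data, option lists, and the single total-bit-budget constraint (standing in for the paper's joint distortion/initial-delay constraints over spatial resolution, frame rate, and DCT quantization) are all illustrative assumptions.

```python
def optimal_options(segments, budget):
    """Toy DP: pick one encoding option per segment to minimize
    relevance-weighted distortion within a total bit budget.

    segments: list of (relevance_weight, options), where each option
              is a (bits, distortion) pair.  Illustrative stand-ins for
              the paper's (resolution, frame rate, QP) operating points.
    budget:   total bit units available (integer).
    Returns (min_weighted_distortion, chosen_option_index_per_segment),
    or None if no feasible allocation exists.
    """
    INF = float("inf")
    dp = [INF] * (budget + 1)       # dp[b]: best cost using exactly b bits
    dp[0] = 0.0
    backptrs = []                   # per-segment backpointer tables
    for weight, options in segments:
        new_dp = [INF] * (budget + 1)
        back = [None] * (budget + 1)
        for b in range(budget + 1):
            if dp[b] == INF:
                continue
            for idx, (bits, dist) in enumerate(options):
                nb = b + bits
                cost = dp[b] + weight * dist
                if nb <= budget and cost < new_dp[nb]:
                    new_dp[nb] = cost
                    back[nb] = (b, idx)   # remember predecessor state
        dp = new_dp
        backptrs.append(back)
    best_b = min(range(budget + 1), key=lambda b: dp[b])
    if dp[best_b] == INF:
        return None
    picks, b = [], best_b           # backtrack through the DP trellis
    for back in reversed(backptrs):
        b, idx = back[b]
        picks.append(idx)
    picks.reverse()
    return dp[best_b], picks
```

For example, with two segments sharing the options `[(4 bits, distortion 10.0), (2 bits, distortion 25.0)]`, relevance weights 1.0 and 0.2, and a budget of 6 bit units, the DP spends the extra bits on the high-relevance segment (option 0) and gives the low-relevance segment the cheaper option 1 — the same qualitative behavior the abstract describes for "relevant" soccer segments.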
Original language: English
Title of host publication: Proceedings IEEE International Conference on Image Processing (ICIP 2004)
Place of publication: Piscataway
Publisher: Institute of Electrical and Electronics Engineers
Pages: 2043-2046
Number of pages: 4
ISBN (Print): 0-7803-8554-3
Publication status: Published - 2004
Externally published: Yes
Event: 11th IEEE International Conference on Image Processing (ICIP 2004) - Singapore, Singapore
Duration: 24 Oct 2004 - 27 Oct 2004
Conference number: 11

Conference

Conference: 11th IEEE International Conference on Image Processing (ICIP 2004)
Abbreviated title: ICIP 2004
Country/Territory: Singapore
City: Singapore
Period: 24/10/04 - 27/10/04

