Non-asymptotic Information Theory
Oct 31, 2011
from 01:00 PM to 02:00 PM
Where: 54-134 Engineering IV Building
In real-time voice and high-speed data applications, limited delay is a key design constraint; indeed, packet sizes as short as a few hundred bits are common in wireless systems. Traditional results on the fundamental limits of data compression and of data transmission through noisy channels apply to the asymptotic regime in which delay (or blocklength) goes to infinity. In this talk, I review our recent progress on analyzing these fundamental limits as a function of blocklength. Going beyond traditional refinements of the asymptotic information-theoretic limits, we investigate the backoff from capacity (in channel coding), the overhead over entropy (in lossless compression), and the overhead over the rate-distortion function (in lossy source coding) incurred by coding at a given blocklength. Our approach, which requires new proof techniques transcending traditional ones, has two components: computable upper and lower bounds tight enough to reduce the uncertainty about the non-asymptotic fundamental limit to a level that is negligible compared with the gap to the long-blocklength asymptotics; and analytical approximations to those bounds that remain accurate even at short blocklengths.
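To make the "backoff from capacity" concrete, a widely cited analytical approximation of the kind described above expresses the maximal achievable rate at blocklength n and error probability ε as roughly C − √(V/n)·Q⁻¹(ε), where C is the channel capacity and V is the channel dispersion. The sketch below, for the binary symmetric channel (a standard textbook example, not necessarily the one treated in the talk), evaluates this approximation; the function name and the inclusion of the (log₂ n)/(2n) correction term are illustrative choices.

```python
import math
from statistics import NormalDist

def bsc_normal_approx(p: float, n: int, eps: float) -> float:
    """Normal approximation to the maximal rate (bits/channel use) of a
    binary symmetric channel with crossover probability p, at blocklength n
    and block error probability eps."""
    # Binary entropy and capacity of the BSC
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    C = 1 - h
    # Channel dispersion of the BSC
    V = p * (1 - p) * math.log2((1 - p) / p) ** 2
    # Inverse Gaussian Q-function: Q^{-1}(eps) = Phi^{-1}(1 - eps)
    q_inv = NormalDist().inv_cdf(1 - eps)
    # Approximation including a (log2 n)/(2n) correction term
    return C - math.sqrt(V / n) * q_inv + math.log2(n) / (2 * n)

# Example: BSC(0.11), blocklength 1000, error probability 1e-3.
# The backoff from capacity (about 0.5 bit/use here) is substantial
# at this blocklength.
rate = bsc_normal_approx(0.11, 1000, 1e-3)
```

At short blocklengths the √(V/n) term dominates the gap to capacity, which is exactly the regime the talk addresses.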
For additional information, please contact Prof. Puneet Gupta (email@example.com) or Prof. Suhas Diggavi (firstname.lastname@example.org).