
Can one revisit the main concepts of information theory in a deterministic setting? Shortly after Shannon's work appeared, this was the program set out by the great Soviet mathematician Andrey Kolmogorov. In this talk, we review Kolmogorov's program and cast his results in the context of square-integrable, band-limited signals subject to perturbation. For this class of signals, spectral concentration properties are well known, and closed-form formulas can be obtained. We also introduce the (ε,δ)-capacity, which extends the Kolmogorov ε-capacity to packings of sets in high dimensions, and compare it to the Shannon capacity. The functional form of the results indicates that in both Kolmogorov's and Shannon's settings, capacity and entropy grow linearly with the number of degrees of freedom, but only logarithmically with the signal-to-noise ratio. This insight transcends the details of the stochastic or deterministic description of the information-theoretic model. For δ = 0, our results lead to new bounds on the Kolmogorov ε-capacity and a tight asymptotic expression for the Kolmogorov ε-entropy of band-limited signals. Finally, we spend a few words on the generalization to signals of multiple variables, suitable for modeling space-time fields.
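For intuition, a minimal sketch of the scaling claim, assuming the standard band-limited Gaussian-noise model (the notation W, T, N, and SNR is ours, not from the talk): a signal of bandwidth W observed over a duration T has roughly N ≈ 2WT degrees of freedom, and the familiar Shannon formula then reads

```latex
% Sketch under the standard band-limited AWGN model (an illustrative
% assumption, not a result stated in the talk): with N \approx 2WT
% degrees of freedom over duration T and bandwidth W,
\[
  C \;\approx\; \frac{N}{2}\,\log_2\!\bigl(1 + \mathrm{SNR}\bigr)
  \quad \text{bits},
\]
% i.e., capacity grows linearly in the number of degrees of freedom N
% and only logarithmically in the signal-to-noise ratio.
```

The deterministic (ε,δ)-capacity discussed in the talk exhibits the same functional form, with the role of the SNR played by a ratio of signal and perturbation sizes.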
This is joint work with Taehyung J. Lim.