Event Detail

Event Type: Department Colloquium
Date/Time: Monday, November 30, 2009 - 08:00
Location: Kidd 364

Speaker Info

Institution: Swansea University
Abstract: 

This talk addresses the proof of the entropy power
inequality, proposed by Shannon, an important tool in the
analysis of Gaussian channels of information transmission. The
inequality is connected to a variety of problems in geometry,
linear algebra, analysis, and other areas.

We analyze continuity properties of the mutual entropy
of the input and output signals in an additive memoryless channel
and show how these can be used to give a correct proof of the
entropy power inequality.
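
To see why the mutual entropy of an additive channel enters the
picture, recall a standard identity (stated here for orientation,
not taken from the talk): if $Y=X+Z$ with noise $Z$ independent of
the input $X$, then
$$I(X;Y)=h(Y)-h(Y\mid X)=h(X+Z)-h(Z),$$
so the entropy of a sum of independent variables, the quantity
bounded by the inequality below, is, up to the noise term, exactly
the mutual entropy of such a channel.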

To introduce the entropy power inequality,
consider two independent random variables $X_1$ and $X_2$
taking values in $\R^d$, with probability density functions
$f_{X_1}(x)$ and $f_{X_2}(x)$, respectively, where
$x\in\R^d$. Let $h(X_i)$, $i=1,2$, stand for the differential
entropies
$$h(X_i)=-\int_{\R^d}f_{X_i}(x)\ln\;f_{X_i}(x){\rm d}x:=
-\E\ln\;f_{X_i}(X_i),$$
and assume that $-\infty<h(X_i)<+\infty$, $i=1,2$. Then the
entropy power inequality states that
$$e^{\frac{2}{d}h(X_1+X_2)}\geq e^{\frac{2}{d}h(X_1)}+
e^{\frac{2}{d}h(X_2)}.$$
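
As a quick sanity check (a standard computation, not part of the
abstract): Gaussian variables achieve equality. If
$X_i\sim N(0,\sigma_i^2 I_d)$, then
$h(X_i)=\frac{d}{2}\ln(2\pi e\sigma_i^2)$, hence
$e^{\frac{2}{d}h(X_i)}=2\pi e\sigma_i^2$; and since
$X_1+X_2\sim N(0,(\sigma_1^2+\sigma_2^2)I_d)$, both sides of the
inequality equal $2\pi e(\sigma_1^2+\sigma_2^2)$.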