
Variational Form for H(X)

July 29, 2019 · Xingyu Li · DeepInfoFlow · information theory · math · variational form

For a continuous random variable $X$ with density $P(x)$, the differential entropy is defined as

$$H(X) = \int P(x)\log\dfrac{1}{P(x)}\,\text{d}x.\tag{1}$$

The function $-\log v$ is strictly convex; let $\phi$ be its convex conjugate, so that

$$-\log v = \underset{u\in\mathbb{R}}{\text{sup}}\left[uv - \phi(u)\right].$$

A direct computation (maximize $uv + \log v$ over $v > 0$; the maximum is attained at $v = -1/u$) gives $\phi(u) = -1 - \log(-u)$ for $u<0$ and $+\infty$ otherwise. Applying this with $v = 1/P(x)$, and promoting the pointwise supremum over $u$ to a supremum over functions $f$, we have

$$\begin{aligned} \int P(x)\log\frac{1}{P(x)}\,\text{d}x &= - \int P(x)\left(- \log\frac{1}{P(x)}\right)\,\text{d}x \\&= -\int P(x)\, \underset{f}{\text{sup}}\left[ \frac{f(x)}{P(x)} - \phi(f(x)) \right]\,\text{d}x \\&= -\underset{f}{\text{sup}}\left[ \int f(x)\,\text{d}x - \int \phi(f(x))P(x)\,\text{d}x\right] \\&= -\underset{f}{\text{sup}}\left[ \int f(x)\,\text{d}x - \mathbb{E}_{P}\left[\phi(f)\right]\right].\end{aligned}$$
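As a numerical sanity check of the identity, the sketch below plugs a standard Gaussian into the right-hand side. For $v = 1/P(x)$ the maximizing $u$ is $-P(x)$, so the supremum is attained at $f^\*(x) = -P(x)$ and recovers $H(X)$ exactly, while any other negative-valued $f$ yields only an upper bound on $H(X)$. The distribution, integration grid, and suboptimal $f$ here are my own illustrative choices, not part of the original derivation.

```python
# Numerical check of H(X) = -sup_f [ ∫ f(x) dx - E_P[phi(f(x))] ]
# for P = standard Gaussian (a sketch; grid and test functions are illustrative).
import math

def gaussian_pdf(x, sigma=1.0):
    return math.exp(-x * x / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def phi(u):
    # Convex conjugate of -log v: phi(u) = -1 - log(-u) for u < 0 (+inf otherwise).
    assert u < 0
    return -1.0 - math.log(-u)

def variational_value(f, pdf, lo=-10.0, hi=10.0, n=20001):
    # Trapezoid-rule approximation of ∫ f(x) dx - ∫ phi(f(x)) P(x) dx.
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        w = 0.5 if i in (0, n - 1) else 1.0
        total += w * h * (f(x) - pdf(x) * phi(f(x)))
    return total

pdf = gaussian_pdf
analytic_H = 0.5 * math.log(2 * math.pi * math.e)  # entropy of N(0, 1)

# At the optimizer f*(x) = -P(x), -variational_value recovers H exactly.
tight = -variational_value(lambda x: -pdf(x), pdf)

# Any other admissible (negative-valued) f gives an upper bound on H.
loose = -variational_value(lambda x: -0.5 * pdf(x) - 0.01, pdf)

print(tight, analytic_H, loose)
```

With the optimal $f^\*$, the integrand reduces to $P(x)\log P(x)$ and the two values agree to numerical precision; the suboptimal $f$ stays strictly above the analytic entropy, as the bound direction requires.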
