In the paper "A Deep Generative Approach to Conditional Sampling", the author writes in the proof of Theorem 4.1:
Since $$\Vert G^* - \bar{G}_\theta \Vert_{L^\infty(E_1)} \to 0, \quad \text{as } n \to \infty,$$
Let $\bar{D} = \log\frac{p_{X, \bar{G}_\theta(\eta,X)}(z)}{p_{X, Y}(z)}$, $D^* = \log\frac{p_{X, G^*(\eta,X)}(z)}{p_{X, Y}(z)}$. Then
$$ \Vert \bar{D} - D^*\Vert \to 0, \quad \text{as } n \to \infty $$ by the continuity.
To give more context: our goal is to find the generator $\bar{G}_\theta(\eta, X)$ such that $(X, \bar{G}_\theta(\eta, X)) \sim (X, Y)$, i.e., they have the same joint distribution. By matching the joint distribution, we also match the conditional distribution: $\bar{G}_\theta(\eta, x) \sim Y \mid X = x$, where $\eta$ is a random variable independent of $X$, by the noise-outsourcing lemma.
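For intuition, here is a minimal numerical sketch of the noise-outsourcing idea (my own toy construction, not from the paper): with $Y = X + \varepsilon$, $\varepsilon \sim N(0,1)$, taking $\eta \sim \mathrm{Unif}(0,1)$ independent of $X$ and $G(\eta, x) = x + \Phi^{-1}(\eta)$ gives $(X, G(\eta, X)) \overset{d}{=} (X, Y)$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 100_000

# Toy target: Y = X + eps with eps ~ N(0, 1) independent of X
X = rng.normal(size=n)
Y = X + rng.normal(size=n)

# Outsourced noise: eta ~ Unif(0, 1) independent of X, and
# G(eta, x) = x + Phi^{-1}(eta) pushes eta to the law of Y | X = x
eta = rng.uniform(size=n)
G = X + norm.ppf(eta)

# (X, G) and (X, Y) should agree in joint distribution;
# compare a few joint moments as a crude check
print(np.mean(X * Y), np.mean(X * G))  # both approx. 1 (= Var X)
print(np.var(Y), np.var(G))            # both approx. 2
```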
I am having trouble following how $\Vert \bar{D} - D^*\Vert \to 0$ "by the continuity". Unpacking the definitions, $\bar{D} - D^* = \log\frac{p_{X, \bar{G}_\theta(\eta,X)}(z)}{p_{X, G^*(\eta,X)}(z)}$, so the claim seems to be that the ratio of the two joint densities converges. Does it mean that, if $\Vert f_n - f\Vert_{L^\infty} \to 0$, then $\Vert p_{X, f_n(\eta, X)} - p_{X, f(\eta, X)} \Vert \to 0$?
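To make the question concrete, here is a toy sanity check (again my own construction, not from the paper): with $f(\eta, x) = x + \Phi^{-1}(\eta)$ and $f_n = f + 1/n$, we have $\Vert f_n - f\Vert_{L^\infty} = 1/n \to 0$, both joint densities are available in closed form, and the sup-norm gap between them does shrink. But this is one smooth example, and I do not see what continuity assumption makes the step work in general.

```python
import numpy as np
from scipy.stats import norm

# Toy instance of the question: f(eta, x) = x + Phi^{-1}(eta) and
# f_n = f + 1/n, so ||f_n - f||_{L^infty} = 1/n -> 0.  Here both joint
# densities are known in closed form:
#   p_{X, f_n(eta, X)}(x, y) = phi(x) * phi(y - x - 1/n)
xs = np.linspace(-5.0, 5.0, 201)
x, y = np.meshgrid(xs, xs)

def joint_density(shift):
    """Joint density of (X, X + N(0,1) + shift) with X ~ N(0,1)."""
    return norm.pdf(x) * norm.pdf(y - x - shift)

p = joint_density(0.0)  # density of (X, f(eta, X))
for n in (1, 10, 100):
    p_n = joint_density(1.0 / n)  # density of (X, f_n(eta, X))
    print(n, np.abs(p_n - p).max())  # sup-norm gap shrinks with n
```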