I am trying to prove $$X_n \xrightarrow{d} X, Y_n \xrightarrow{d} a \implies Y_n X_n \xrightarrow{d} aX$$where $a$ is a constant.
What I tried:
Let $g:\mathbb R\to \mathbb R$ be an arbitrary uniformly continuous, bounded function. It suffices to show $\mathbb E[g(Y_n X_n)] \to \mathbb E[g(aX)]$. We have $$\left \lvert \int g(Y_nX_n) - g(aX) \,dP \right \rvert \leq \left \lvert \int g(Y_n X_n) - g(aX_n) \, dP \right \rvert +\left \lvert \int g(aX_n) - g(aX) \, dP \right \rvert,$$where the second summand goes to $0$ because $x \mapsto g(ax)$ is again bounded and continuous and $X_n \xrightarrow{d} X$. Now I want to use the uniform continuity of $g$ to estimate the first summand: choose $\delta > 0$ such that $$\left \lvert g(Y_n X_n) - g(aX_n) \right \rvert < \epsilon$$whenever $|Y_n X_n - aX_n| < \delta$. Since convergence in distribution to a constant implies convergence in probability, we can control $P(|Y_n - a | > \delta)$.

But is it possible to get a bound on $|X_n|$? I assume that convergence in distribution does not imply some form of boundedness... Any help is appreciated.
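For reference, here is a sketch of one standard way such a bound might be obtained; the key assumed fact, not proved here, is that a weakly convergent sequence is tight, and the constant $M$ and event $A_n$ below are notation introduced for this sketch. Since $X_n \xrightarrow{d} X$, for every $\epsilon > 0$ one can pick $M > 0$ (a continuity point of the distribution of $|X|$) with $P(|X_n| > M) < \epsilon$ for all sufficiently large $n$. With $\delta$ chosen for $g$ and $\epsilon$ as above, splitting the integral over $A_n := \{|Y_n - a| < \delta/M\} \cap \{|X_n| \leq M\}$ and its complement gives $$\left \lvert \int g(Y_n X_n) - g(aX_n) \, dP \right \rvert \leq \epsilon + 2\lVert g \rVert_\infty \left( P(|Y_n - a| \geq \delta/M) + P(|X_n| > M) \right),$$because $|Y_n X_n - aX_n| = |Y_n - a|\,|X_n| < \delta$ on $A_n$, while off $A_n$ the integrand is bounded by $2\lVert g \rVert_\infty$ and the complement is covered by the union bound. The first probability tends to $0$ since $Y_n \to a$ in probability, the second is $< \epsilon$ by the tightness step, and $\epsilon$ was arbitrary.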