Channel: Active questions tagged real-analysis - Mathematics Stack Exchange

Uniform Convergence Given Monotone Sequence


I've been working on this problem for a few days and it is a bit frustrating, especially because I know the solution is probably right in front of me.

Problem. Suppose $f_n:[a,b]\to\mathbb{R}$ is a sequence of continuous functions converging pointwise to a continuous $f:[a,b]\to\mathbb{R}$. Suppose that $\forall{x}\in[a,b]$, the sequence $(|f_n(x)-f(x)|)_{n=1}^\infty$ is monotone. Prove $f_n$ converges uniformly to $f$.

If the sequence is increasing, it is not very hard to show uniform convergence. Fix $x\in[a,b]$. By pointwise convergence, $(\forall\varepsilon>0)(\exists{M\in\mathbb{N}})(\forall{n\geq{M}})(|f_n(x)-f(x)|<\varepsilon)$. Since $(|f_n(x)-f(x)|)_{n=1}^\infty$ is increasing, if $|f_k(x)-f(x)|\neq{0}$ for some $k$, then $(\forall{n\geq{k}})(|f_n(x)-f(x)|\geq|f_k(x)-f(x)|>0)$, which contradicts pointwise convergence. So $(\forall{x}\in[a,b])(\forall{n}\in\mathbb{N})(f_n(x)=f(x))$, which means $f_n$ converges uniformly to $f$.

The part I'm struggling with is the decreasing case. I can show that if $(|f_n(x)-f(x)|)_{n=1}^\infty$ is decreasing for every $x\in[a,b]$, then the sequence given by $$y_n=\max\{|f_n(x)-f(x)|:x\in[a,b]\}$$ (well defined, since $|f_n-f|$ is continuous on the compact set $[a,b]$) is decreasing and bounded below by $0$, so it converges by the monotone convergence theorem, but I don't know how to show it converges to $0$.
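To get a feel for why $y_n\to 0$ should hold, here is a small numerical sketch (not a proof). The family $f_n(x)=x^n(1-x)$ on $[0,1]$ with limit $f=0$ is my own illustrative choice: for it, $|f_n(x)-f(x)|$ is decreasing in $n$ at every $x$, and the grid-approximated maxima $y_n$ visibly shrink toward $0$, which is exactly the Dini-type behavior the problem asserts.

```python
import numpy as np

# Illustrative example (an assumption, not from the problem):
# f_n(x) = x^n (1 - x) on [0, 1], pointwise limit f = 0.
# Since 0 <= x <= 1, x^{n+1}(1-x) <= x^n(1-x), so |f_n - f| is
# decreasing in n at every x, matching the problem's hypothesis.
xs = np.linspace(0.0, 1.0, 10_001)

def y(n):
    """Grid approximation of y_n = max over [0,1] of |f_n(x) - f(x)|."""
    return float(np.max(xs**n * (1.0 - xs)))

ys = [y(n) for n in range(1, 51)]

# The maxima form a decreasing sequence tending toward 0.
assert all(a >= b for a, b in zip(ys, ys[1:]))
print(ys[0], ys[-1])
```

Of course a finite grid only suggests the limit; the actual argument has to rule out the maxima stalling at some positive value, which is where compactness of $[a,b]$ comes in.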

Any hints would be appreciated.

