Channel: Active questions tagged real-analysis - Mathematics Stack Exchange

$L^{\infty}$ (uniform) decay of Dirichlet heat equation $u_t=\Delta u$

It is known that the heat equation$$\frac{\partial u}{\partial t}(t,x)=\Delta u(t,x)$$ on a smooth bounded open set $\Omega\subset\mathbb{R}^N$, with the Dirichlet boundary condition $u=0$ on $\partial \Omega$, has a smooth solution $u(t,x)$.

Multiplying by $u$:$$uu_t -u\Delta u = 0.$$Integrating over $\Omega$:$$\int uu_t -\int u\Delta u = 0.$$Using Green's theorem (the boundary term vanishes since $u=0$ on $\partial\Omega$):$$\frac{1}{2}\frac{d}{dt}\int u^2+\int |\nabla u|^2= 0.$$
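Written out, the Green's theorem step is an integration by parts in which the boundary term drops because $u=0$ on $\partial\Omega$:$$\int_\Omega u\,\Delta u\,dx=\int_{\partial\Omega}u\,\frac{\partial u}{\partial n}\,dS-\int_\Omega |\nabla u|^2\,dx=-\int_\Omega |\nabla u|^2\,dx,$$and combining this with $\int uu_t=\frac{1}{2}\frac{d}{dt}\int u^2$ gives the displayed identity.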

Moving the second term to the right-hand side and using Poincaré's inequality,$$\frac{1}{2}\frac{d}{dt}\int u^2=-\int |\nabla u|^2 \leq -\lambda_1\int u^2,$$where $\lambda_1>0$ is the smallest eigenvalue of the Dirichlet Laplacian $-\Delta$ on $\Omega$. This can also be written as

$$\frac{d}{dt}|u(t)|^2_{L^2(\Omega)} \leq -2\lambda_1|u(t)|^2_{L^2(\Omega)}.$$Solving this differential inequality (for example, via Grönwall's lemma) gives$$|u(t)|^2_{L^2(\Omega)} \leq |u(0)|^2_{L^2(\Omega)}e^{-2\lambda_1 t},$$ that is,$$|u(t)|_{L^2(\Omega)} \leq |u(0)|_{L^2(\Omega)}e^{-\lambda_1 t}.$$ This implies $$\lim_{t\to \infty}|u(t)|_{L^2(\Omega)}=0,$$ so the solution decays to zero in the ${L^2(\Omega)}$ norm.
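As a quick numerical sanity check (not part of the proof), the $L^2$ decay bound can be verified with an explicit finite-difference scheme in one dimension, taking $\Omega=(0,1)$ so that $\lambda_1=\pi^2$; the grid size, time step, and initial data below are arbitrary illustrative choices.

```python
import numpy as np

# Explicit finite differences for u_t = u_xx on (0,1) with u(0,t) = u(1,t) = 0,
# checking the Gronwall bound |u(t)|_{L^2} <= |u(0)|_{L^2} e^{-lambda_1 t}.
n = 200                        # interior grid points (illustrative choice)
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
dt = 0.4 * h**2                # within the explicit-Euler stability bound dt <= h^2/2

# Smallest eigenvalue of the discrete Dirichlet Laplacian; tends to pi^2 as h -> 0
lam1 = (2.0 / h**2) * (1.0 - np.cos(np.pi * h))

u = np.sin(np.pi * x) + 0.5 * np.sin(3.0 * np.pi * x)   # initial data, zero at the boundary
l2 = lambda v: np.sqrt(h * np.sum(v**2))                # discrete L^2(0,1) norm
norm0 = l2(u)

steps = int(round(0.2 / dt))
up = np.zeros(n + 2)           # padded array: up[0] = up[-1] = 0 enforces the Dirichlet condition
for _ in range(steps):
    up[1:-1] = u
    u = u + dt * (up[2:] - 2.0 * up[1:-1] + up[:-2]) / h**2

t = steps * dt
print(l2(u), norm0 * np.exp(-lam1 * t))   # computed norm vs. the Gronwall bound
```

For this scheme the per-step amplification factor of each discrete eigenmode is at most $e^{-\lambda_1^h\,\Delta t}$ in absolute value, so the printed norm should indeed sit below the bound.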

My question is: how does one prove that$$\lim_{t\to \infty}\sup_{x\in \Omega}|u(t,x)|=0,$$ i.e. that the decay also holds in the uniform ($L^{\infty}$) norm?

