
Continuity of optimizers of the Hopf--Lax formula


Let $f:\mathbb{R}^d\to \mathbb{R}$ be smooth and convex with at most linear growth (you may assume $f$ is Lipschitz if that simplifies matters). Fix any $t>0$. Since the quadratic penalty dominates the linear growth of $f$, a maximizer of the following Hopf--Lax type formula exists:

$$\sup_{x\in\mathbb{R}^d}\left\{f(x) - \frac{|x|^2}{t}\right\}.$$

Fix an arbitrary maximizer $x_0$ of this formula.

Question: Does there exist a sequence $(x_\epsilon)_{\epsilon>0}$ such that

  1. for each $\epsilon>0$, $x_\epsilon$ is a maximizer of $$\sup_{x\in\mathbb{R}^d}\left\{f(x) - \frac{|x|^2}{t+\epsilon}\right\},$$

  2. and $x_\epsilon$ converges to $x_0$ as $\epsilon\to 0$?
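For concreteness, here is a small numerical sketch of the setup (an illustration only, with the hypothetical choices $d=1$, $t=4$, and $f(x)=\sqrt{1+x^2}$, which is smooth, convex, and of linear growth). It computes a maximizer at $t$ and at $t+\epsilon$ for several $\epsilon$, which is exactly the kind of convergence the question asks about; it is not intended as an answer.

```python
# Minimal numerical sketch (illustrative choices only): d = 1, f(x) = sqrt(1 + x^2),
# t = 4.  For this f the maximizers of f(x) - x^2/t are x = ±sqrt(t^2/4 - 1);
# we restrict to x >= 0 by symmetry and compare the maximizer at t + eps with
# the one at t as eps -> 0.
import numpy as np
from scipy.optimize import minimize_scalar

def f(x):
    return np.sqrt(1.0 + x**2)

def maximizer(t):
    # maximize f(x) - x^2/t over x >= 0  <=>  minimize the negative objective
    res = minimize_scalar(lambda x: -(f(x) - x**2 / t),
                          bounds=(0.0, 50.0), method="bounded")
    return res.x

t = 4.0
x0 = maximizer(t)  # should be close to sqrt(t**2/4 - 1) = sqrt(3)
for eps in [1.0, 0.1, 0.01, 0.001]:
    print(f"eps = {eps:6.3f}   x_eps - x_0 = {maximizer(t + eps) - x0:+.6f}")
```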

