I know that if $g: \mathbb{R}^d \to \mathbb{R}$ is Lipschitz, this means that there exists $L > 0$ such that
$$|g(x) - g(y)| \leq L |x - y|$$
for all $x, y \in \mathbb{R}^d$. Taking $y = 0 \in \mathbb{R}^d$, we get
$$|g(x) - g(0)| \leq L |x|,$$
and hence, by the triangle inequality,
$$|g(x)| \leq |g(x) - g(0)| + |g(0)| \leq L |x| + |g(0)|$$
for all $x \in \mathbb{R}^d$, meaning that $g$ has linear growth.

My question is: does this property still hold if $g$ is only **locally** Lipschitz?
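For concreteness, by locally Lipschitz I mean the standard definition, in which the constant may depend on the point: for every $x_0 \in \mathbb{R}^d$ there exist $r > 0$ and $L_{x_0} > 0$ such that
$$|g(x) - g(y)| \leq L_{x_0} |x - y|$$
for all $x, y \in B(x_0, r)$, where $B(x_0, r)$ denotes the open ball of radius $r$ centered at $x_0$. Note that the argument above breaks down here, since the estimate at $y = 0$ only holds on a ball around the origin.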