
Semidifferentiability at the extremum of an interval and continuous extension of derivative


Let $f:[a,b] \to \mathbb{R}$ be differentiable on $(a,b)$ with continuous derivative $f'$.

(i) Assuming that $f'$ can be continuously extended to $a$, is it true that $f$ is semidifferentiable at $a$ and that

$$\lim_{x\to a^+} f'(x) = f'_+(a)?$$

While this seems reasonable, it seems to me that one would need some sort of "uniform differentiability" for it to hold. By Taylor's expansion at $x$, for all $x$ close to $a$ we have

$$f(a) = f(x) + f'(x)(a-x) + o(a-x).$$

Therefore,

$$\lim_{x\to a^+} f'(x) = \lim_{x\to a^+}\left( \frac{f(a)-f(x)}{a-x} + o(1)\right)$$

though the $o(1)$ term depends on the choice of $x$, so the above might converge to $f'_+(a) + c$ for some nonzero constant $c$.
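As a sanity check only (not a proof), here is a minimal numerical sketch of the limit comparison in (i), using the concrete choice $f(x) = x^{3/2}$ on $[0,1]$ with $a = 0$, which is my own example: $f'(x) = \tfrac{3}{2}\sqrt{x}$ extends continuously to $0$, and both $\lim_{x\to 0^+} f'(x)$ and the right difference quotient should tend to $0$.

```python
# Numerical illustration (not a proof): compare
#   lim_{x->a+} f'(x)   and   f'_+(a) = lim_{x->a+} (f(x) - f(a)) / (x - a)
# for the concrete choice f(x) = x**1.5 on [0, 1], a = 0,
# where f'(x) = 1.5*sqrt(x) on (0, 1) extends continuously to 0.

import math

def f(x):
    return x ** 1.5

def f_prime(x):
    # derivative of f on (0, 1)
    return 1.5 * math.sqrt(x)

a = 0.0
for k in range(1, 9):
    x = 10.0 ** (-k)                      # points approaching a from the right
    diff_quot = (f(x) - f(a)) / (x - a)   # candidate for f'_+(a)
    print(f"x = {x:.0e}:  f'(x) = {f_prime(x):.6f},  "
          f"(f(x)-f(a))/(x-a) = {diff_quot:.6f}")
```

For this particular $f$ both columns shrink to $0$ together, which is consistent with (i) but of course says nothing about the general case.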

The question then is, what can we say about (i)? Is it true? If it is, how do I avoid the issue of the remainder depending on the choice of the point? If it is not, can you give me a counterexample?

