
Differentiation under the integral sign


Let $\delta > 0$, let $u \in L^1([-\delta,\delta])$, and let $k: \mathbb{R} \rightarrow \mathbb{R}$ be a function satisfying the following assumptions:

  1. $k \in L^1(\mathbb{R})$.
  2. $k$ is compactly supported, with support contained in $\left[ -\delta, \delta\right]$.
  3. The derivative $k'$ exists everywhere on $(-\delta, \delta) =: A$.
  4. The derivative $k'$ is bounded on $A$, that is $\left\| k' \right\|_{\infty,A}<+\infty$.

Take $x \in [0, \delta)$. The question is: under these assumptions, is it possible to prove the existence of

$$\frac{d}{dx}\left( \int_{0}^{x}u(y)\,k(y-x+\delta)\,dy \right)?$$

If not, what is a minimal set of assumptions on $k$ under which the above function can be differentiated? I can prove the existence of the derivative when assumption $(3.)$ is replaced by $(3'.)$: $k \in C^1(\mathbb{R})$; but, combined with $(2.)$, this forces $k(\delta) = 0$ by continuity of $k$, which is not what I want for my purposes.
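For reference, the formal Leibniz-rule computation I have in mind (valid, for instance, when $u$ is continuous and $k \in C^1(\mathbb{R})$) is

$$\frac{d}{dx}\left( \int_{0}^{x}u(y)\,k(y-x+\delta)\,dy \right) = u(x)\,k(\delta) - \int_{0}^{x}u(y)\,k'(y-x+\delta)\,dy,$$

which suggests that the boundary term $u(x)\,k(\delta)$ is the delicate one: since $u \in L^1$ is only defined almost everywhere, this term has no pointwise meaning unless $k(\delta) = 0$. Note, for instance, that the indicator $k = \mathbf{1}_{[-\delta,\delta]}$ satisfies assumptions $(1.)$–$(4.)$ with $k(\delta) = 1$.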

Thank you.

