
Proving $f \equiv 0$ on $[a,b]$ given $\left|\int_\alpha^\beta f(x)\,dx\right| \leq M|\beta-\alpha|^{s+1}$ for all subintervals


This problem is from my analysis exam. It states: assume that the real function $f$ is continuous on a finite interval $[a,b]$, and that for every subinterval $[\alpha,\beta]\subset[a,b]$ we have $$\left| \int_{\alpha}^{\beta} f(x)\, dx\right|\leq M|\beta-\alpha|^{s+1},$$ where $M$ and $s$ are positive constants. Prove that $f(x)\equiv 0$ for $x\in [a,b]$, i.e. that $f$ vanishes identically on $[a,b]$.

My initial attempt was a proof by contradiction: supposing $f$ does not vanish at some point, I tried to derive a contradiction from the continuity of $f$ on the interval, but this approach ended in vain. Could someone suggest an insightful idea?
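For what it's worth, here is a sketch of one standard direct approach (not the contradiction route attempted above, and not necessarily the intended exam solution): apply the hypothesis to shrinking intervals around a fixed point. Fix $x\in[a,b)$ and take $[\alpha,\beta]=[x,x+h]$ with $h>0$ small enough that $x+h\leq b$. Dividing the hypothesis by $h$ gives $$\left|\frac{1}{h}\int_x^{x+h} f(t)\,dt\right|\leq M h^{s}\longrightarrow 0 \quad (h\to 0^+),$$ while continuity of $f$ (equivalently, the fundamental theorem of calculus applied to $F(x)=\int_a^x f(t)\,dt$) gives $$\frac{1}{h}\int_x^{x+h} f(t)\,dt\longrightarrow f(x) \quad (h\to 0^+).$$ Hence $f(x)=0$ for every $x\in[a,b)$, and $f(b)=0$ then follows by continuity (or by using the intervals $[b-h,b]$ instead).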

