
An infinite interval cannot be covered by a sequence of intervals of finite total length: proving this without measure theory.

This is about the construction of the Lebesgue measure. The following is from a book I am reading; I have highlighted the relevant sentence in red:

[image: excerpt from the book, with the relevant sentence highlighted in red]

Since we have not yet constructed the Lebesgue measure at this point, how do we prove the sentence I have highlighted in red? If we had the Lebesgue measure, the statement would follow very easily. I think it should also follow easily without it, because the author just mentions it in passing, but I don't see how to actually prove it. Intuitively it is very clear. Is it trivial? If so, could you please show the proof?

I know it can't happen, but how do we know that even if the intervals get smaller and smaller very fast, they cannot cover an infinite interval? (After all, intervals of total length $\varepsilon$ can cover all of $\mathbb{Q}$, which is dense and unbounded, so some genuine argument is needed.)

Update:

I see that we can enlarge each interval $(a_n,b_n]$ to the open interval $(a_n,\,b_n+1/2^n)$, which adds at most $\sum_n 1/2^n = 1$ to the total length, and take a closed interval $[A,B]$ with $B-A$ bigger than the finite total length of the sequence plus $1$. Then compactness gives a finite subcover, and hence a contradiction. But it seems it should follow more easily.
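
For completeness, here is one way that sketch can be written out in full (a sketch using only Heine–Borel and induction; the choice of $1/2^n$ is just convenient bookkeeping). Suppose $\sum_{n=1}^\infty (b_n-a_n)=L<\infty$ and that the intervals $(a_n,b_n]$ cover an infinite interval $I$. Choose $[A,B]\subseteq I$ with $B-A>L+1$. Then
$$[A,B]\subseteq\bigcup_{n=1}^\infty (a_n,b_n]\subseteq\bigcup_{n=1}^\infty \left(a_n,\;b_n+\tfrac{1}{2^n}\right),$$
which is an open cover of the compact set $[A,B]$, so by Heine–Borel finitely many of these intervals already cover it:
$$[A,B]\subseteq\bigcup_{k=1}^N \left(a_{n_k},\;b_{n_k}+\tfrac{1}{2^{n_k}}\right).$$
By induction on $N$, any finite family of open intervals covering $[A,B]$ has total length greater than $B-A$: pick an interval $(c,d)$ containing $A$; if $d>B$ we are done, and otherwise the remaining intervals cover $[d,B]$, so the induction hypothesis applies and the lengths add up to more than $(d-A)+(B-d)=B-A$. Hence
$$B-A<\sum_{k=1}^N\left(b_{n_k}-a_{n_k}+\tfrac{1}{2^{n_k}}\right)\le L+1,$$
contradicting the choice of $[A,B]$.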

