Quantcast
Channel: Active questions tagged real-analysis - Mathematics Stack Exchange
Viewing all articles
Browse latest Browse all 9204

Proving that a "convolution" series converges

$
0
0

I am attempting to prove the following question

enter image description here

In my proof, it gets very technical, and I am unsure if I have manipulated inequalities correctly. Also, I am unsure if this proof is readable, as I have not taken formal training in proof writing (I am just reading analysis 1 by terence tao). Can anyone tell me how to improve the proof / if it is incorrect. What does a good proof look like? Where can I improve?

Define $A_1 := \sum_{n=0}^{\infty} |a_n|$, $B := \sum_{n=0}^{\infty} b_n$.

We will now prove by induction that $\sum_{n=0}^N c_n = \sum_{k=0}^N a_k \sum_{m=0}^{N - k} b_m$. Indeed, $\sum_{n=0}^0 c_n = c_0 = a_0b_0 = \sum_{k=0}^0 a_k \sum_{m=0}^{0-k} b_m$. So we will now assume the claim is true for some arbitrary $n \in \mathbb{N}$.\

Then we have $$\begin{align*}\sum_{n = 0}^{N+1} c_n &= c_{N + 1} + \sum_{n = 0}^{N} c_n\\ &= \sum_{k=0}^{N+1} a_kb_{N+1-k} + \sum_{k=0}^N a_k \sum_{m=0}^{N - k} b_m\\ &= (a_0b_{N+1} + a_1b_N + \ldots + a_{N+1}b_0) + a_0(b_0 + \ldots + b_N) + a_1(b_0 + \ldots + b_{N-1}) + \ldots + a_Nb_0\\ &= a_0(b_0 + \ldots + b_N + b_{N+1}) + a_1(b_0 + \ldots + b_N) + \ldots + a_N(b_0 + b_1) + a_{N+1}b_0\\ &= \sum_{k=0}^{N+1} a_k \sum_{m=0}^{N+1 - k} b_m\end{align*}$$, closing the induction.\

We now define two sequences $(x_n)$ and $(y_n)$ as follows; for any $n \in \mathbb{N}$, if $a_n > 0, x_n := a_n, y_n := 0$ otherwise $y_n := a_n, x_n := 0$ and so we can write $a_n = x_n + y_n$.\

Now we will show that the $x$-series and $y$-series converge. We have that for any $N \in \mathbb{N}, A_1 \geq \sum_{n=0}^N |a_n| \geq \sum_{n=0}^N x_n$ by the monotone convergence theorem on the the series $\sum_{n=0}^{\infty} |a_n|$, so we see that $A_1$ is an upper bound on the sequence of partial sums of $(x_n)$ and since $(x_n)$ is monotone increasing we see again by the monotone convergence theorem that the $x$-series converges.\

Hence, for any $N \in \mathbb{N}, \sum_{n=0}^N a_n = \sum_{n=0}^N x_n + y_n = \sum_{n=0}^N x_n + \sum_{n=0}^N y_n$ rearranging we see $\sum_{n=0}^N y_n = \sum_{n=0}^N a_n - \sum_{n=0}^N x_n$ and taking the limit of both sides we see the $y$-series converges.\

Now define $A := \sum_{n=0}^{\infty} a_n, X := \sum_{n=0}^{\infty} x_n, Y := \sum_{n=0}^{\infty} y_n$.

Since the sequence of partial sums $B_N := \sum_{n=0}^{N} b_n$ converges to $B$, we know that this sequence is bounded, say by $L \in \mathbb{R}$ i.e. for all $n \in \mathbb{N}$ we have $|B_n| < L$.

Now take $\epsilon > 0$, and define $K := max(A, B, X, -Y, L)$. Also define $\delta := \frac{\epsilon}{4K}$.

If we define $(X_n)$ and $(Y_n)$ to be the sequences of partial sums of $(x_n)$ and $(y_n)$ respectively, then for some $N_1 \in \mathbb{N}$ we have for all $N \geq N_1$ that $|X_N - X| \leq \delta$. Furthermore by the Cauchy Criterion, we deduce that there is an $N_2 \in \mathbb{N}$ such that for all $N \geq N_2$ we have that $|\sum_{n=N_2}^N x_n| \leq \delta$ The Cauchy Criterion follows from the fact that the sequence of partial sums of $(x_n)$ converges and is hence a Cauchy sequence.\

For some $N_3 \in \mathbb{N}$ we have for all $N \geq N_3$ that $|Y_N - Y| \leq \delta$. Furthermore, by the Cauchy Criterion, we deduce that there is an $N_4 \in \mathbb{N}$ such that for all $N \geq N_4$ we have that $|\sum_{n=N_4}^N y_n| \leq \delta$\

Now we can find an $N_5 \in \mathbb{N}$ such that for all $n \geq N_5$ we have $|B_n - B| \leq \delta$.

Define $M := max(N_1, \ldots, N_4)$, then if we choose an $N \geq M$ such that $N - M \geq N_5$ we have $$\begin{align*}\sum_{n=0}^{N} c_n &= \sum_{k=0}^{N} a_k \sum_{m=0}^{N-k} b_m\\ &= \sum_{k=0}^{N} x_k + y_k \sum_{m=0}^{N-k} b_m\\ &= \sum_{k=0}^{N} x_k \sum_{m=0}^{N-k} b_m + \sum_{k=0}^{N} y_k \sum_{m=0}^{N-k} b_m\\ &= \sum_{k=0}^{M} x_k \sum_{m=0}^{N-k} b_m + \sum_{k=M}^{N} x_k \sum_{m=0}^{N-k} b_m + \sum_{k=0}^{M} y_k \sum_{m=0}^{N-k} b_m + \sum_{k=M}^{N} y_k \sum_{m=0}^{N-k} b_m\end{align*}$$

Since we have from the previous paragraph that $|X_M - X| \leq \delta$ it follows $X_M \leq X + \delta$, also, $N$ was chosen so that $N - M \geq N_5$, and so for all $k \leq M$, we have $|B_{N - k} - B| \leq \delta$ and therefore $B_{N - k} \leq B + \delta$. So we deduce that $\sum_{k=0}^{M} x_k \sum_{m=0}^{N-k} b_m \leq (X + \delta)(B + \delta)$. Also we have we have $\sum_{k=M}^{N} x_k \sum_{m=0}^{N-k} b_m \leq L\delta$, since we have that each $B_n \leq L$, and $|\sum_{k=M}^{N} y_k| \leq \delta$.\

We know that since for all $k \leq M$, we have $|B_{N - k} - B| \leq \delta$ we have $B_{N-k} \geq B - \delta$. Also $|Y_m - Y| \leq \delta$, and so $\sum_{k=0}^{M} y_k \sum_{m=0}^{N-k} b_m \leq \sum_{k=0}^{M} y_k (B - \delta) \leq (Y + \delta)(B - \delta)$. We had to decrease the $\sum_{m=0}^{N-k} b_m$ terms because they are being multiplied with negative numbers, decreasing the amount we are subtracting by making the overall sum bigger. And finally, we have $\sum_{k=M}^{N} y_k \sum_{m=0}^{N-k} b_m \leq -L\delta$, since we have that each $B_n \geq -L$ (decrease part we subtract by), and $|\sum_{k=M}^{N} y_k| \leq \delta$.\

Putting this all together, we deduce that for $N \geq M$, we have $$\begin{align*}\sum_{n=0}^{N} c_n &= \sum_{k=0}^{M} x_k \sum_{m=0}^{N-k} b_m + \sum_{k=M}^{N} x_k \sum_{m=0}^{N-k} b_m + \sum_{k=0}^{M} y_k \sum_{m=0}^{N-k} b_m + \sum_{k=M}^{N} y_k \sum_{m=0}^{N-k} b_m\\&\leq (X + \delta)(B + \delta) + L\delta + (Y + \delta)(B - \delta) -L\delta\\ &= XB + X\delta + B\delta + \delta^2 + L\delta + YB - Y\delta + B\delta -\delta^2 -L\delta\\ &= XB + YB + 2B\delta + X\delta - Y\delta \leq B(X + Y) + \frac{2B\epsilon}{4K} + \frac{X\epsilon}{4K} + \frac{-Y\epsilon}{4K}\\&\leq AB + \epsilon\end{align*}$$. Hence, we have shown that $\sum_{n=0}^{N} c_n - AB \leq \epsilon$.\

Now, aiming for the lower bound, we have that $|X_M - X| \leq \delta$ it follows $X_M \geq X - \delta$, also, $M$ was chosen so that $N - M \geq N_5$, and so for all $k \leq M$, we have $|B_{N - k} - B| \leq \delta$ and therefore $B_{N - k} \geq B - \delta$. So we deduce that $\sum_{k=0}^{M} x_k \sum_{m=0}^{N-k} b_m \geq (X - \delta)(B - \delta)$. We also see that $\sum_{k=M}^{N} x_k \sum_{m=0}^{N-k} b_m \geq -L\sum_{k=M}^{N} x_k \geq L\delta$\

We know that since for all $k \leq M$, we have $|B_{N - k} - B| \leq \delta$ we have $B_{N-k} \leq B + \delta$. Also $|Y_m - Y| \leq \delta$, and so $\sum_{k=0}^{M} y_k \sum_{m=0}^{N-k} b_m \geq \sum_{k=0}^{M} y_k (B + \delta) \geq (Y - \delta)(B + \delta)$. We had to increase the $\sum_{m=0}^{N-k} b_m$ terms because they are being multiplied with negative numbers to decrease the overall sum. And finally, we have $\sum_{k=M}^{N} y_k \sum_{m=0}^{N-k} b_m \geq -L\delta$.\

Putting this all together, we deduce that for $N \geq M$, we have $$\begin{align*}\sum_{n=0}^{N} c_n& = \sum_{k=0}^{M} x_k \sum_{m=0}^{N-k} b_m + \sum_{k=M}^{N} x_k \sum_{m=0}^{N-k} b_m + \sum_{k=0}^{M} y_k \sum_{m=0}^{N-k} b_m + \sum_{k=M}^{N} y_k \sum_{m=0}^{N-k} b_m\\ &\geq (X - \delta)(B - \delta) + L\delta + (Y - \delta)(B + \delta) - L\delta \\&= XB - X\delta - B\delta + \delta^2 + L\delta + YB - Y\delta - B\delta -\delta^2 -L\delta\\ &= XB + YB + X\delta - Y\delta - 2B\delta \geq B(X + Y) - \frac{X\epsilon}{4K} - \frac{Y\epsilon}{4K} - \frac{2B\epsilon}{4K} \geq AB - \frac{2\epsilon}{3}\\& \geq AB -\epsilon\end{align*}$$. Hence, we have shown that $\sum_{n=0}^{N} c_n - AB \geq -\epsilon$ and hence that $|\sum_{n=0}^N c_n - AB| \leq \epsilon$ for all $N \geq M$.


Viewing all articles
Browse latest Browse all 9204

Trending Articles



<script src="https://jsc.adskeeper.com/r/s/rssing.com.1596347.js" async> </script>