
Find $d_k$ that minimize $\sum_{k=1}^\infty d_k\cdot\exp(-(d_1+\cdots+d_{k-1}))$

N.B.: The question has been edited after Sangchul Lee's answer to include an extra condition that is required for the original problem to make sense.

Consider a strictly increasing sequence of strictly positive real numbers $d_k$ such that $\lim_{k\rightarrow\infty}d_k=+\infty$. I would like to find the values that minimize the series $$\sum_{k=1}^\infty d_k\cdot e^{-(d_1+\cdots+d_{k-1})}.$$
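To make the objective concrete, here is a rough numerical sketch of the quantity being minimized, evaluated on a finite prefix of the sequence (the helper name `series_value` and the truncation are illustrative choices only, not part of the problem):

```python
import math

def series_value(d):
    """Truncated value of sum_k d_k * exp(-(d_1 + ... + d_{k-1}))
    for a finite prefix d = [d_1, d_2, ..., d_n] of the sequence."""
    total = 0.0
    prefix = 0.0  # running sum d_1 + ... + d_{k-1}; empty (= 0) for k = 1
    for dk in d:
        total += dk * math.exp(-prefix)
        prefix += dk
    return total
```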

I got to this while trying to solve a problem from Project Euler (so following the rules of that site, I cannot disclose more about it).

I tested some possible values out of intuition, but I'm not able to see a clear path that would help me find what the minimum should be.

Example 1: $d_k=\log(k+1)$. Then $d_1+\cdots+d_{k-1}=\log(k!)$, so the series becomes $$\sum_{k=1}^\infty\frac{\log(k+1)}{k!}\simeq 1.55867.$$
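A quick numerical check of this value (the cutoff at 50 terms is arbitrary but more than sufficient, given the factorial decay):

```python
import math

# d_k = log(k+1) gives exp(-(d_1+...+d_{k-1})) = 1/k!, so the series
# is sum_{k>=1} log(k+1)/k!; 50 terms already reproduce the quoted value.
print(sum(math.log(k + 1) / math.factorial(k) for k in range(1, 51)))
# ~1.55867
```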

Example 2: $d_k=k\cdot\log(1+\varepsilon)$ with $\varepsilon>0$. Then $d_1+\cdots+d_{k-1}=\frac{k(k-1)}{2}\log(1+\varepsilon)$, so the series takes the form $$\log(1+\varepsilon)\sum_{k=1}^\infty\frac{k}{(1+\varepsilon)^{k(k-1)/2}}.$$ Testing some values of $\varepsilon$ suggests that the series decreases as $\varepsilon\rightarrow 0$, but this would contradict the expected existence of a minimum for the total value of the series. In this particular example I'm not able to find a closed form either, but I suspect it may involve Jacobi theta functions.
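The behaviour for small $\varepsilon$ can be probed numerically with a sketch like the one below (the grid of $\varepsilon$ values and the truncation length are arbitrary choices; writing the powers of $1+\varepsilon$ via `exp` just avoids large intermediate powers):

```python
import math

def example2_value(eps, n_terms=10_000):
    """Truncation of log(1+eps) * sum_{k>=1} k / (1+eps)^(k(k-1)/2)."""
    a = math.log(1.0 + eps)
    return a * sum(k * math.exp(-a * k * (k - 1) / 2) for k in range(1, n_terms + 1))

for eps in (1.0, 0.5, 0.1, 0.01, 0.001):
    print(eps, example2_value(eps))
# the printed values keep decreasing as eps shrinks, as described above
```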

Any hint on what strategy could be followed would be very welcome.

