Let $(a_k)_{k>0}$ and $(b_k)_{k>0}$ be positive sequences bounded by one, and for some fixed $x\in[0,1]$ define $f(y) = x(1-x) + y(1-2x)^2$. Given weights $\lambda_l \geq 0$ with $\sum_{l>0} \lambda_l = 1$, I want to show that
$$\sum_{k>0}\frac{a_k}{f(b_k)} \;\geq\; \sum_{k>0} \frac{a_k}{f\!\left(\sum_{l>0} \lambda_l b_l\right)}.$$
Numerically this appears to hold. However, it is not true that $b_k \leq \sum_{l>0} \lambda_l b_l$ for all $k$, which (since $f$ is increasing) would have been a sufficient condition.

Note that $f$ is monotonically increasing and affine in its argument, so for convex weights it satisfies $f\left(\sum_l \lambda_l b_l\right) = \sum_l \lambda_l f(b_l)$.

Concretely, in my application $a_i = \frac{(p_i - p_{-i})^2}{p_i + p_{-i}}$ and $b_i = \frac{p_i p_{-i}}{p_i + p_{-i}}$, where $(p_i)_{i\in I}$ is a probability distribution on $I = \{-m,\dots, m\}$ for some integer $m$ (I write $m$ here to avoid clashing with the summation index $l$ above; the index $k$ in the sums runs over $i \in I$).
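For reference, here is the kind of numerical check I mean: a minimal NumPy sketch that draws a random distribution $p$, a random $x$, and random convex weights $\lambda$ (the random choice of $\lambda$ is my assumption; in the actual application $\lambda$ may be coupled to the sequences), verifies the affinity identity $f(\sum_l \lambda_l b_l) = \sum_l \lambda_l f(b_l)$, and compares the two sides of the inequality.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(y, x):
    # f(y) = x(1-x) + y(1-2x)^2, affine and increasing in y for fixed x
    return x * (1 - x) + y * (1 - 2 * x) ** 2

m = 4                                  # I = {-m, ..., m}, hypothetical size
p = rng.dirichlet(np.ones(2 * m + 1))  # random probability distribution on I
x = rng.uniform(0, 1)

p_i = p
p_mi = p[::-1]                         # p_{-i}: reverse the array over I
a = (p_i - p_mi) ** 2 / (p_i + p_mi)
b = p_i * p_mi / (p_i + p_mi)

lam = rng.dirichlet(np.ones(len(b)))   # random convex weights, sum to 1

# affinity of f plus sum(lam) = 1 gives f(sum lam*b) = sum lam*f(b)
assert np.isclose(f(lam @ b, x), lam @ f(b, x))

lhs = np.sum(a / f(b, x))
rhs = np.sum(a) / f(lam @ b, x)
print(lhs, rhs, lhs >= rhs)
```

Looping this over many draws is how I tested the claim; note that a script like this can only falsify the inequality on sampled instances, not prove it.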