Let $\{x_i\}_{i=1}^N$ and $\{y_i\}_{i=1}^N$ be real numbers in the interval $(0,1)$. Define, for each $i$,$$\alpha_i = x_i (1-x_i) \log^2 \frac{x_i (1-y_i)}{y_i (1-x_i)}$$and$$\beta_i = x_i \log \frac{x_i}{y_i} + (1-x_i) \log \frac{1-x_i}{1-y_i}.$$You may recognize $\beta_i$ as the Kullback-Leibler divergence between the Bernoulli distributions with parameters $x_i$ and $y_i$, though it's not clear to me that this observation is useful.
I conjecture that$$\frac{\sum_{i=1}^N \alpha_i}{\left(\sum_{i=1}^N \beta_i\right)^2} \le \left(\sum_{i=1}^N (x_i-y_i)^2\right)^{-1}. $$I can prove it when $N=1$ (though not in a particularly elegant or enlightening way), and I have numerical evidence for its truth when $N=2$. Can anyone prove or disprove it for general $N$?
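For anyone who wants to experiment, here is a quick randomized search (a sketch of my own, not part of the question; the helper names `alpha` and `beta` are just labels for the quantities defined above). It samples random instances and tracks the largest observed value of $\left(\sum_i \alpha_i\right)\left(\sum_i (x_i-y_i)^2\right)\big/\left(\sum_i \beta_i\right)^2$, which the conjecture predicts never exceeds $1$:

```python
import math
import random

def alpha(x, y):
    # alpha_i = x(1-x) * log^2( x(1-y) / (y(1-x)) )
    return x * (1 - x) * math.log(x * (1 - y) / (y * (1 - x))) ** 2

def beta(x, y):
    # beta_i = KL( Bernoulli(x) || Bernoulli(y) )
    return x * math.log(x / y) + (1 - x) * math.log((1 - x) / (1 - y))

random.seed(0)
worst = 0.0
for _ in range(50_000):
    N = random.randint(1, 6)
    xs = [random.uniform(0.01, 0.99) for _ in range(N)]
    ys = [random.uniform(0.01, 0.99) for _ in range(N)]
    A = sum(alpha(x, y) for x, y in zip(xs, ys))
    B = sum(beta(x, y) for x, y in zip(xs, ys))
    S = sum((x - y) ** 2 for x, y in zip(xs, ys))
    if B > 1e-12:  # skip near-degenerate samples with x_i ~ y_i for all i
        worst = max(worst, A * S / B ** 2)

print(worst)  # the conjecture predicts this stays <= 1
```

Of course, random sampling rarely probes the extremal configurations, so this is only a sanity check, not evidence of tightness.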
Assuming it is true, can you derive a similar upper bound for the quantity$$\frac{\sum_{i=1}^N \alpha_i \gamma_i}{\left(\sum_{i=1}^N \beta_i\right)^2}$$for positive real numbers $\{\gamma_i\}_{i=1}^N$?
Edit: To be clear, my proof of the $N=1$ case is simply to define$$f_x(y) = (x-y) \sqrt{x(1-x)} \log \frac{x(1-y)}{y(1-x)} - x\log \frac{x}{y} - (1-x) \log \frac{1-x}{1-y},$$and differentiate twice to show that $f_x(y) \le f_x(x) = 0.$ I do not think this argument generalizes easily.
Edit 2: I now suspect$$\require{enclose}\enclose{horizontalstrike}{\frac{\sum_{i=1}^N \alpha_i \gamma_i}{\left(\sum_{i=1}^N \beta_i\right)^2} \le \sqrt{\frac{1}{N}\sum_{i=1}^N \gamma_i^2} \left(\sum_{i=1}^N (x_i-y_i)^2\right)^{-1}}$$Is it true? Can it be tightened? This is false: Leonbloy provided a counterexample.
Edit 3: A new conjecture, which for my purposes would be the most useful to prove: let $p_1, \dots, p_N \in [0,1]$ with $\sum_i p_i = 1$, and let $\alpha_i, \beta_i, \gamma_i$ be as above. Then$$ \frac{\sum_i p_i \alpha_i \gamma_i}{\left(\sum_i p_i \beta_i\right)^2} \le \frac{\sum_i p_i \gamma_i}{ \sum_i p_i (x_i-y_i)^2}.$$
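A randomized sanity check of this weighted form can be run the same way (again a sketch of mine, with `alpha` and `beta` simply naming the quantities above; the weights $p_i$ are drawn from the simplex by normalizing uniform samples, and the $\gamma_i$ are arbitrary positives). It tracks the largest observed ratio of the left side to the right side, which the conjecture predicts never exceeds $1$:

```python
import math
import random

def alpha(x, y):
    # alpha_i = x(1-x) * log^2( x(1-y) / (y(1-x)) )
    return x * (1 - x) * math.log(x * (1 - y) / (y * (1 - x))) ** 2

def beta(x, y):
    # beta_i = KL( Bernoulli(x) || Bernoulli(y) )
    return x * math.log(x / y) + (1 - x) * math.log((1 - x) / (1 - y))

random.seed(1)
worst = 0.0
for _ in range(50_000):
    N = random.randint(1, 6)
    xs = [random.uniform(0.01, 0.99) for _ in range(N)]
    ys = [random.uniform(0.01, 0.99) for _ in range(N)]
    gs = [random.uniform(0.1, 10.0) for _ in range(N)]   # positive gamma_i
    ws = [random.random() for _ in range(N)]
    ps = [w / sum(ws) for w in ws]                        # random simplex point
    lhs_num = sum(p * alpha(x, y) * g for p, x, y, g in zip(ps, xs, ys, gs))
    lhs_den = sum(p * beta(x, y) for p, x, y in zip(ps, xs, ys)) ** 2
    rhs = (sum(p * g for p, g in zip(ps, gs))
           / sum(p * (x - y) ** 2 for p, x, y in zip(ps, xs, ys)))
    if lhs_den > 1e-12:  # skip near-degenerate x_i ~ y_i samples
        worst = max(worst, (lhs_num / lhs_den) / rhs)

print(worst)  # the conjecture predicts this ratio stays <= 1
```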