For any probability vector $\boldsymbol{p}=(p_1,\dots,p_n)\ge 0$ with $\sum_{i=1}^np_i=1$, I conjecture that the following majorization relation holds:
$$ -\boldsymbol{p}\log \boldsymbol{p} \prec H(\boldsymbol{p})\boldsymbol{p}$$
where $H(\boldsymbol{p})=-\sum_{i=1}^n p_i\log p_i $ denotes the Shannon entropy of probability vector $\boldsymbol{p}$.
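For concreteness, here is a minimal numerical sketch (my own, not part of the claim) that tests the conjectured majorization on random probability vectors by comparing sorted partial sums; it assumes NumPy, and `majorizes` is a helper name I introduce for illustration:

```python
import numpy as np

def majorizes(z, y, tol=1e-12):
    """True if y ≺ z, assuming sum(y) == sum(z): every partial sum of
    descending-sorted y is at most the corresponding partial sum of z."""
    zs, ys = np.sort(z)[::-1], np.sort(y)[::-1]
    return np.all(np.cumsum(ys) <= np.cumsum(zs) + tol)

rng = np.random.default_rng(0)
for _ in range(10_000):
    n = int(rng.integers(2, 6))
    p = rng.dirichlet(np.ones(n))       # random probability vector
    h = -np.sum(p * np.log(p))          # Shannon entropy H(p)
    assert majorizes(h * p, -p * np.log(p))
print("no counterexample found")
```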
In part of my answer to this question, you can find a proof for the case where all probabilities are less than $e^{-1}$; there I equivalently showed that $\boldsymbol{p}$ majorizes the normalized vector $\small \frac{-\boldsymbol{p}\log \boldsymbol{p}}{H(\boldsymbol{p})}$.
I verified the above conjecture for $\color{green}{n=2}$ by finding the doubly stochastic matrix that satisfies the following equation (an equivalent condition for the above majorization):
$$ \begin{bmatrix} -p_1\log p_1 \\ -p_2\log p_2 \end{bmatrix}= \begin{bmatrix} x & 1-x \\ 1-x & x \end{bmatrix} \begin{bmatrix} p_1\left (-p_1\log p_1-p_2\log p_2 \right) \\ p_2\left (-p_1\log p_1-p_2\log p_2 \right) \end{bmatrix} $$
where $x$ is obtained by solving the first row of the matrix equation (with $p_2=1-p_1$):
$$x=\frac{p_1\log p_1}{\left(2p_1-1\right)\left(p_1\log p_1+\left(1-p_1\right)\log\left(1-p_1\right)\right)}-\frac{1-p_1}{2p_1-1},$$
which is always in $[0,1]$ for any $p_1 \in (0,1)$ (source-2).
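As a sanity check on this closed form (a sketch of my own, independent of source-2), one can evaluate $x$ on a grid of $p_1$ values, confirm it stays in $[0,1]$, and verify the matrix equation numerically; `x_of` is a name I introduce here:

```python
import numpy as np

def x_of(p1):
    # Closed form above; note p1*log(p1) + (1-p1)*log(1-p1) = -H(p).
    s = p1 * np.log(p1) + (1 - p1) * np.log(1 - p1)
    return p1 * np.log(p1) / ((2 * p1 - 1) * s) - (1 - p1) / (2 * p1 - 1)

grid = np.linspace(0.001, 0.999, 999)
grid = grid[np.abs(grid - 0.5) > 1e-9]      # the formula is 0/0 at p1 = 1/2
for p1 in grid:
    x = x_of(p1)
    assert 0.0 <= x <= 1.0
    p = np.array([p1, 1 - p1])
    H = -np.sum(p * np.log(p))
    D = np.array([[x, 1 - x], [1 - x, x]])  # doubly stochastic when x in [0,1]
    assert np.allclose(D @ (H * p), -p * np.log(p))
print("x stays in [0, 1] and the matrix equation holds on the grid")
```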
Update 1:
In an answer below, I proved a related result, but I could not immediately use it to prove the conjecture.
To reach a proof or a counterexample, note that the claim already holds when all probabilities are less than $e^{-1}$. Since $3e^{-1}>1$, at most two probabilities can exceed $e^{-1}$, so two cases remain to be examined: exactly one, or exactly two, of the probabilities greater than $e^{-1}$.
I also verified the conjecture for $\color{green}{n=3,4}$ by checking the following condition (source-3, source-4):
$$\sum_{i=1}^n \max \left (p_i-C, 0 \right ) \ge \sum_{i=1}^n \max \left (\frac{-p_i \log p_i}{H(\boldsymbol{p})}-C, 0 \right ) \quad \text{for all } C \in \mathbb{R}.$$
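For completeness, here is a small test harness of my own (separate from source-3 and source-4) that checks this cut-sum criterion for random $n=3,4$ vectors; since both sides are piecewise linear in $C$ and agree outside the range of the entries, it suffices to test $C$ at the entries of the two vectors:

```python
import numpy as np

def cut_sum(v, C):
    return float(np.sum(np.maximum(v - C, 0.0)))

rng = np.random.default_rng(1)
for _ in range(10_000):
    n = int(rng.integers(3, 5))            # n = 3 or 4
    p = rng.dirichlet(np.ones(n))
    H = -np.sum(p * np.log(p))
    q = -p * np.log(p) / H                 # normalized vector, also sums to 1
    for C in np.concatenate([p, q]):       # kinks of both piecewise-linear sides
        assert cut_sum(p, C) >= cut_sum(q, C) - 1e-12
print("criterion holds on all sampled vectors")
```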