Let $f(x) = 0$ be a polynomial equation of degree $n$. WLOG we can assume that its coefficients lie in $[-1,1]$: dividing every coefficient by the coefficient of largest magnitude does not change the roots and maps each coefficient into $[-1,1]$.
Assume that the coefficients are uniformly random in $(-1,1)$. It is well known that most of the roots have absolute value close to one, i.e. the roots tend to cluster near the unit circle centered at the origin, as shown in the figure below for a polynomial of degree $n = 666$.
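A figure of this kind can be reproduced with a short script. This is a minimal sketch of my own (not the code that produced the figure), assuming NumPy's `numpy.roots` for root finding and Matplotlib for the plot; the seed and styling are arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

# Scatter the complex roots of one random polynomial of degree n = 666
# whose coefficients are drawn uniformly from (-1, 1).
rng = np.random.default_rng(0)        # arbitrary seed
n = 666
coeffs = rng.uniform(-1.0, 1.0, size=n + 1)  # coefficients from degree n down to 0
roots = np.roots(coeffs)                      # the n complex roots

theta = np.linspace(0.0, 2.0 * np.pi, 400)
plt.plot(np.cos(theta), np.sin(theta), 'r-', lw=0.8)  # unit circle for reference
plt.scatter(roots.real, roots.imag, s=4)
plt.gca().set_aspect('equal')
plt.xlabel('Re')
plt.ylabel('Im')
plt.show()
```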
It can be observed from the graph that while most of the roots are close to the unit circle, some of them lie outside it. I want to find the probability $P(n,x)$ that a root of a polynomial of degree $n$ lies at distance at least $x \ge 1$ from the origin. To do this, I ran a simulation that generates a random polynomial of degree $n$ in each trial, computes its $n$ roots, and counts the total number of roots across all trials whose absolute value is $\ge x$.
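For concreteness, here is a sketch of this kind of Monte Carlo experiment. It is my own reconstruction, not the actual code I used for the data below: the function name `estimate_P`, the number of trials, and the seed are arbitrary choices, and `numpy.roots` is assumed for root finding.

```python
import numpy as np

def estimate_P(n, x, trials=200, seed=0):
    """Estimate P(n, x): the fraction of roots of a random degree-n polynomial,
    with coefficients uniform in (-1, 1), whose modulus is >= x."""
    rng = np.random.default_rng(seed)
    outside = 0
    total = 0
    for _ in range(trials):
        coeffs = rng.uniform(-1.0, 1.0, size=n + 1)  # degree-n coefficients
        roots = np.roots(coeffs)
        outside += np.count_nonzero(np.abs(roots) >= x)
        total += roots.size
    return outside / total

if __name__ == "__main__":
    n, x = 200, 1.05
    p = estimate_P(n, x)
    # Compare the empirical value of n * P(n, x) with the conjectured limit 1/(2x).
    print(f"P({n}, {x}) ~ {p:.5f},  n*P = {n * p:.3f},  1/(2x) = {1 / (2 * x):.3f}")
```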
Question: Experimental data suggests that $P(n,1) = \frac{1}{2}$ and that for $x > 1$, $\displaystyle n P(n,x) \to \frac{1}{2x}$ as $n \to \infty$. Can this be proved or disproved?
Related: Is the root of a polynomial with the largest modulus more likely to be real?