I am new to approximation theory and do not know much about Sobolev spaces.
I aim to approximate a Sobolev function $f$ using linear combinations of ridge functions of the form $g(\mathbf{a} \cdot \mathbf{x})$. I was wondering whether requiring $f$ to be sufficiently smooth allows the approximation error (in approximating $f$ by such combinations) to be made exponentially small. I have generally encountered polynomial rates, but I am wondering whether exponential rates are attainable as well. To be specific, can I have a result of the following form, assuming analyticity of $f$?

Suppose $f \in C^\omega(\Omega)$, i.e., $f$ is real-analytic on the domain $\Omega \subset \mathbb{R}^d$, and it is approximated by ridge functions of the form $g(\mathbf{a} \cdot \mathbf{x})$ with $g$ smooth. Then the approximation error $E_N(f)$ (the error of the best approximation by a linear combination of $N$ ridge functions) decays exponentially:
$$E_N(f) \leq C e^{-\alpha N^\beta}$$
for some constants $C > 0$, $\alpha > 0$, and $\beta > 0$.
By this I mean that the error decreases at an exponential rate as the number of ridge functions $N$ increases. My claim might be wrong, but do you know of results of a similar flavour?
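In case it helps make the question concrete, here is a minimal numerical sketch (my own construction, not taken from any reference) of the setup I have in mind: it fits a linear combination of $N$ ridge functions, each with a randomly chosen direction $\mathbf{a}_i$ and a degree-5 polynomial profile $g_i$, to an analytic target by least squares, and prints the fitted sup-norm error on a grid as $N$ grows. The target $e^{x_1 + 2x_2}$, the polynomial degree, and the random directions are arbitrary choices, and a least-squares fit with random directions only gives an upper bound on the best-approximation error $E_N(f)$, so this is purely an illustration of the quantity I am asking about.

```python
import numpy as np

# Sketch: fit a sum of N ridge functions g_i(a_i . x) with random unit
# directions a_i and polynomial profiles g_i to an analytic target on [0,1]^2,
# and record the fitted sup-norm error on a grid as N increases.

rng = np.random.default_rng(0)
d = 2
deg = 5  # polynomial degree of each ridge profile g_i (arbitrary choice)

# Analytic target (arbitrary example of a real-analytic f)
def f(X):
    return np.exp(X[:, 0] + 2.0 * X[:, 1])

# Evaluation grid on [0,1]^2
xs = np.linspace(0.0, 1.0, 40)
XX, YY = np.meshgrid(xs, xs)
X = np.column_stack([XX.ravel(), YY.ravel()])
y = f(X)

for N in [1, 2, 4, 8, 16, 32]:
    # Random unit directions a_1, ..., a_N
    A = rng.normal(size=(N, d))
    A /= np.linalg.norm(A, axis=1, keepdims=True)

    # Design matrix: columns are (a_i . x)^k for k = 0..deg and i = 1..N
    T = X @ A.T                                  # projections a_i . x, shape (n_points, N)
    Phi = np.concatenate([T ** k for k in range(deg + 1)], axis=1)

    # Least-squares fit of all profile coefficients at once; this upper-bounds E_N(f)
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    err = np.max(np.abs(Phi @ coef - y))
    print(f"N = {N:3d}   sup-norm error on grid ~ {err:.3e}")
```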
Thanks!