The Kolmogorov-Arnold network (KAN) of Ziming Liu et al., "KAN: Kolmogorov-Arnold Networks", draws inspiration from the Kolmogorov-Arnold representation theorem (KA theorem). However, the architecture, as formulated in Theorem 2.1 of that paper, is a composition of sums of one-dimensional smooth functions. It significantly diverges from, and falls short of, the theorem's original intent and content: by confining its form to compositions of sums of single-variable smooth functions, it represents only a tiny subset of all smooth functions. This confinement eliminates, by design, the so-called curse of dimensionality. I seriously doubt that this subset is dense in the set of all smooth functions, though I have not yet come up with a counterexample. If it is indeed not dense, KAN cannot serve as a universal function approximator, unlike the multilayer perceptron.
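For reference, the KA theorem guarantees that every continuous $f:[0,1]^n\to\mathbb R$ admits the exact two-layer representation

$$f(\mathbf x)=\sum_{q=1}^{2n+1}\Phi_q\!\left(\sum_{p=1}^{n}\phi_{q,p}(x_p)\right),$$

with continuous, but in general badly behaved and non-smooth, univariate functions $\Phi_q$ and $\phi_{q,p}$. As I read the KAN paper, it instead requires these univariate functions to be smooth (spline-parametrized) and compensates by stacking arbitrarily many such layers; that restriction to smooth univariate building blocks is precisely what the question below is about.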
What is an example of a smooth function (continuous, or even differentiable to some finite order) that cannot be approximated by $(\Phi_{L-1}\circ\Phi_{L-2}\circ\cdots\circ\Phi_1\circ\Phi_0)(\mathbf x)$, $L\in \mathbb N$, where each $\Phi_l$ is a layer of one-dimensional smooth functions as characterized by Theorem 2.1 of the KAN paper?
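For concreteness, my reading of the paper's notation is that each layer $\Phi_l$ maps $\mathbb R^{n_l}\to\mathbb R^{n_{l+1}}$ by summing univariate smooth functions of its inputs,

$$(\Phi_l\,\mathbf x)_j=\sum_{i=1}^{n_l}\phi_{l,j,i}(x_i),\qquad j=1,\dots,n_{l+1},$$

so that the full network computes $(\Phi_{L-1}\circ\cdots\circ\Phi_0)(\mathbf x)$. The question is whether functions of this form are dense in, say, $C([0,1]^n)$.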