First, consider the multi-index notation.
Let $\{c_\alpha\}_{\alpha\in\mathbb{N}^n}\subseteq\mathbb{R}$ and $x_0:=(x_{01},\cdots,x_{0n})\in\mathbb{R}^n$.
Define $\rho :=\sup\big\{r\in [0,\infty ):\sum_{\alpha\in\mathbb{N}^n}|c_\alpha |r^{|\alpha|}<\infty \big\}$.
Suppose that $\rho >0$.
Let $r\in (0,\rho )$ and define $f:B_r(x_0)\to \mathbb{R}$ by $f(x):=\sum_{\alpha\in\mathbb{N}^n}c_\alpha(x-x_0)^\alpha $, where $B_r(x_0)$ is the open ball of radius $r$ centered at $x_0$ with respect to the maximum norm.
My question is: how can I show that for all $\beta\in\mathbb{N}^n$ the partial derivative $\partial ^\beta f(x)$ exists and $\partial ^\beta f(x)=\sum _{\alpha\in\mathbb{N}^n}c_{\alpha +\beta }\frac{(\alpha +\beta )!}{\alpha!}(x-x_0)^\alpha $ for all $x\in B_r(x_0)$?
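As a quick sanity check of the claimed identity (this concrete example is my own choice, not part of the original setup), take $n=2$, $x_0=0$, $c_\alpha =1$ for all $\alpha$ (so $\rho =1$) and $\beta =(1,0)$:
$$
\begin{aligned}
f(x) &= \sum_{\alpha\in\mathbb{N}^2} x_1^{\alpha_1}x_2^{\alpha_2} = \frac{1}{(1-x_1)(1-x_2)}, \qquad \|x\|_\infty <1,\\
\partial^{(1,0)} f(x) &= \frac{1}{(1-x_1)^2(1-x_2)} = \sum_{\alpha\in\mathbb{N}^2} (\alpha_1+1)\,x_1^{\alpha_1}x_2^{\alpha_2} = \sum_{\alpha\in\mathbb{N}^2} c_{\alpha+(1,0)}\,\frac{(\alpha+(1,0))!}{\alpha!}\,x^\alpha ,
\end{aligned}
$$
since $\frac{(\alpha +(1,0))!}{\alpha !}=\alpha _1+1$. So the formula checks out in this special case; the question is how to prove it in general.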
Here is my attempt so far. Let $\beta =(\beta _1,\cdots, \beta _n)\in \mathbb{N}^n$, $i\in \{1,\cdots,n\}$, and $k\in\mathbb{N}$ be arbitrary.
Define $\hat\beta_i:=(\beta _1,\cdots, \beta _{i-1},0,\beta _{i+1},\cdots ,\beta _n)$ and $(k)_i^n:=(0,\cdots,0,k,0,\cdots,0)\in\mathbb{N}^n$, where the $k$ appears in the $i$-th coordinate.
I was able to show that $\sum _{k=0}^\infty d_k(x_i-x_{0i})^k$ is absolutely convergent on $B_r(x_0)$ and that $\sum_{\alpha\in\mathbb{N}^n}c_\alpha(x-x_0)^\alpha =\sum _{k=0}^\infty d_k(x_i-x_{0i})^k$, where $d_k:=\sum _{\beta \in \mathbb{N}^{n},\ \beta _i=0}c_{\beta +(k)^n_i}(x-x_0)^{\beta}$ for all $k\in\mathbb{N}$, i.e. the inner sum runs over the multi-indices of the form $\hat \beta _i$ (so each $d_k$ depends on the coordinates $x_j$ with $j\neq i$, but not on $x_i$).
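In case it is useful, here is a sketch (under the assumptions that the classical one-variable theorem on term-by-term differentiation of power series applies with the coordinates $x_j$, $j\neq i$, held fixed, and that absolute convergence justifies rearranging the double sum) of how this reduction could settle the case $\beta =(1)_i^n$:
$$
\begin{aligned}
\partial _i f(x)
&= \frac{\partial}{\partial x_i}\sum_{k=0}^\infty d_k\,(x_i-x_{0i})^k
= \sum_{k=0}^\infty (k+1)\,d_{k+1}\,(x_i-x_{0i})^{k}\\
&= \sum_{k=0}^\infty\ \sum_{\substack{\beta\in\mathbb{N}^n\\ \beta_i=0}} (k+1)\,c_{\beta+(k+1)_i^n}\,(x-x_0)^{\beta+(k)_i^n}
= \sum_{\alpha\in\mathbb{N}^n} \frac{(\alpha+(1)_i^n)!}{\alpha!}\,c_{\alpha+(1)_i^n}\,(x-x_0)^{\alpha},
\end{aligned}
$$
where the last step substitutes $\alpha :=\beta +(k)_i^n$ and uses $\frac{(\alpha +(1)_i^n)!}{\alpha !}=\alpha _i+1$. If one also checks that the coefficient family $\big(\tfrac{(\alpha +(1)_i^n)!}{\alpha !}c_{\alpha +(1)_i^n}\big)_{\alpha\in\mathbb{N}^n}$ again has $\rho$ at least as large as the original one, an induction on $|\beta |$ seems to go through.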
So I tried to use induction, but I'm having trouble finishing the proof.
If you know a reference that studies this kind of series in the context of real analysis, please mention it!