Let $f = (f_1, \ldots, f_n) :\mathbb R^m \to \mathbb R^n$. Let $(e_i)$ and $(e'_j)$ be the standard bases of $\mathbb R^m$ and $\mathbb R^n$ respectively. Let $\partial f (x)$ be the Fréchet derivative of $f$ at $x$, and $\partial f (x)[e_i]$ its value at $e_i$. Let $\partial_i f (x)$ be the partial derivative of $f$ at $x$ w.r.t. $e_i$.
Could you verify whether my understanding is correct?
We have $$ \begin{align} \partial f (x)[e_i] &=\partial_i f (x) \\ &= \partial_i \bigg (\sum_j f_je'_j \bigg) (x) \\ &= \sum_j \partial_i (f_je'_j ) (x)\\ &= \sum_j \partial_i (e'_j f_j) (x). \end{align} $$
We consider $e_j'$ as a function, i.e., $e_j':\mathbb R \to \mathbb R^n, t \mapsto t e_j'$. Then $e_j'$ is linear continuous and thus $\partial_i e_j' (r)[h]= e_j' (h) = h e'_j$ for all $r,h \in \mathbb R$. By chain rule, we have $$ \begin{align} \sum_j \partial_i (e'_j f_j) (x) &= \sum_j \partial_i e'_j (f_j (x)) \circ \partial_i f_j (x) \\ &= \sum_j \partial_i e'_j (f_j (x))[ \partial_i f_j (x) ]\\ &= \sum_j \partial_i f_j (x) e'_j. \end{align} $$
Unfortunately, no. It seems that you want to prove $$\partial f (x)[e_i] = \partial_i f_j (x)\, e'_j . \tag{1}$$
This cannot be correct. Without going into the details of your calculations, it is obvious that the LHS of $(1)$ depends on $f$ (i.e. on all component functions $f_j$), while the RHS depends only on the single component function $f_j$. In fact, if $(1)$ were correct, all the vectors $d_j = \partial_i f_j (x)\, e'_j \in \mathbb R^n$ would agree. But all coordinates of $d_j$ except possibly the $j$-th coordinate are $0$, and this would imply that all $d_j = 0$.
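To see the failure concretely (a minimal example of my own choosing): take $f : \mathbb R \to \mathbb R^2$, $f(x) = (x, x)$. Then
$$\partial f (x)[e_1] = (1,1), \qquad d_1 = \partial_1 f_1 (x)\, e'_1 = (1,0), \qquad d_2 = \partial_1 f_2 (x)\, e'_2 = (0,1),$$
so the $d_j$ agree neither with each other nor with $\partial f (x)[e_1]$.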
Let us have a closer look at your arguments.
To avoid confusion at which place $e'_j$ has which interpretation, I would not identify $e'_j$ with a function $\mathbb R \to \mathbb R^n$, but write e.g. $\bar e'_j : \mathbb R \to \mathbb R^n$. This gives
$$\sum_j \partial_i (f_je'_j ) (x) = \sum_j \partial_i (\bar e'_j f_j) (x) . \tag{2}$$ On the LHS we have the product $f_j \cdot e'_j$, on the RHS the composition $\bar e'_j \circ f_j$. Incidentally, it would be easier to use from the beginning that $f = \sum_j \bar e'_j \circ f_j$ (which is obvious).
We have $\partial \bar e'_j(r) = \bar e'_j$ for all $r$ since $\bar e'_j$ is linear. But the expression $\partial_i e_j' (r)$ does not make sense, because a function of a single variable has no partial derivatives. Thus the equations in the last line of your question do not make sense. You can of course apply the chain rule to calculate $\partial_i (\bar e'_j \circ f_j) (x)$, but this gives nothing other than $$\partial_i (\bar e'_j \circ f_j) (x) = \bar e'_j(\partial_i f_j (x)) = \partial_i f_j (x) \cdot e'_j.$$ This produces $$\sum_j \partial_i (\bar e'_j f_j) (x) = \sum_j \partial_i f_j (x) \cdot e'_j = \partial_i f(x),$$ which is the equation you started with.
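As a numeric sanity check (the concrete $f$, the evaluation point, and the step size below are my own arbitrary choices, not part of the argument), a finite-difference computation confirms $\partial f (x)[e_i] = \sum_j \partial_i f_j (x)\, e'_j$:

```python
import math

# A concrete f : R^2 -> R^3 with component functions f_1, f_2, f_3.
def f(x):
    x1, x2 = x
    return [x1 * x2, math.sin(x1), x1 + x2 ** 2]

def partial(g, x, i, h=1e-6):
    """Central-difference approximation of the partial derivative of g at x
    in direction e_i; g returns a list (scalar components wrapped in a list)."""
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return [(a - b) / (2 * h) for a, b in zip(g(xp), g(xm))]

x = [0.7, -1.3]
i = 0  # the direction e_1

# LHS: the partial derivative of the vector-valued f, i.e. df(x)[e_i].
lhs = partial(f, x, i)

# RHS: sum_j (partial_i f_j)(x) e'_j, assembled one scalar component at a time.
n = len(f(x))
rhs = [partial(lambda y, j=j: [f(y)[j]], x, i)[0] for j in range(n)]

print(lhs)
print(rhs)
assert all(abs(a - b) < 1e-8 for a, b in zip(lhs, rhs))
```

Here the $j$-th entry of `rhs` is the $j$-th coordinate of $\sum_j \partial_i f_j(x)\, e'_j$, and it matches the corresponding entry of `lhs` up to discretization error.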
Update:
The above calculations rely on the following facts:
Let $\phi : U\to V$ be a function between open subsets $U \subset \mathbb R^m, V \subset \mathbb R^n$.
1. If $m = 1$ and $\phi$ is differentiable at $x \in U$, then $\phi'(x) = d\phi(x)(1)$.
2. If $\phi$ is differentiable at $x \in U$, then all partial derivatives $\partial_v \phi (x)$ with $v \in \mathbb R^m$ exist and we have $\partial_v \phi (x) = d\phi(x)(v)$.
The proof of 2. consists of applying the chain rule to $\phi \circ l_{x,v} : (-\epsilon,\epsilon)\to V$, where $l_{x,v}(t) = x + tv$ and $\epsilon$ is sufficiently small so that $l_{x,v}((-\epsilon,\epsilon)) \subset U$, and then using 1.
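Spelled out, this is the chain of equalities (using that $l_{x,v}(0) = x$ and $d\,l_{x,v}(0)(s) = sv$, since $l_{x,v}$ is affine):
$$\partial_v \phi (x) = (\phi \circ l_{x,v})'(0) \overset{1.}{=} d(\phi \circ l_{x,v})(0)(1) = \big(d\phi(x) \circ d\,l_{x,v}(0)\big)(1) = d\phi(x)(v).$$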
We can now prove the chain rule for partial derivatives: