Euler's Theorem for Homogeneous Functions states that for a homogeneous function of order $k$:
$$f\left ( \bar{x} \right )=\frac{1}{k}\sum_i x_i\frac{\partial f\left ( \bar{x} \right ) }{\partial x_i} $$
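Euler's identity is easy to sanity-check symbolically. Below is a quick sketch in sympy, using the hypothetical example $f(x,y)=x^2y+y^3$, which is homogeneous of order $k=3$:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2*y + y**3  # homogeneous of order 3: f(t*x, t*y) = t**3 * f(x, y)
k = 3

# Euler's identity: f = (1/k) * sum_i x_i * df/dx_i
euler_rhs = sp.Rational(1, k) * (x*sp.diff(f, x) + y*sp.diff(f, y))
assert sp.simplify(euler_rhs - f) == 0
```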
$f\left ( \bar{x} \right )$ could also be decomposed using Taylor's Series expansion to give
$$f\left ( \bar{x} \right )=f\left ( \bar{0} \right ) + \sum_i x_i \left.\frac{\partial f }{\partial x_i}\right|_{\bar{0}} + \frac{1}{2}\sum_i\sum_jx_ix_j \left.\frac{\partial^2 f}{\partial x_i\partial x_j}\right|_{\bar{0}} + \cdots $$
If we limit ourselves to homogeneous functions of order $1$ with $f\left ( \bar{0} \right )=0$, then the two equations above seem to imply that all the higher-order terms of such a function must sum to $0$. Is there any intuition for this?
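One concrete way to see the quadratic terms cancel: differentiating Euler's identity with $k=1$ once more shows each $\frac{\partial f}{\partial x_i}$ is homogeneous of order $0$, so $\sum_j x_j \frac{\partial^2 f}{\partial x_i \partial x_j} = 0$ for each $i$, and hence $\bar{x}^T H(\bar{x})\, \bar{x} = 0$ away from the origin. A sympy sketch, using the hypothetical order-$1$ homogeneous function $f = xy/\sqrt{x^2+y^2}$ (which is not smooth at the origin):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x*y/sp.sqrt(x**2 + y**2)  # homogeneous of order 1, not smooth at 0

# For order-1 homogeneity, the quadratic form x^T H(x) x vanishes
# identically away from the origin.
H = sp.hessian(f, (x, y))
quad = (sp.Matrix([x, y]).T * H * sp.Matrix([x, y]))[0]
assert sp.simplify(quad) == 0
```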
A function need not have a Taylor expansion at all: for that you must at least assume smoothness, so that all the derivatives exist.
However, if your homogeneous function is smooth, it is already a polynomial (see this question).
In the case of order $1$, differentiating $f(ax)=af(x)$ with respect to $x_i$ gives $a\frac{\partial f}{\partial x_i}(ax)=a\frac{\partial f}{\partial x_i}(x)$, i.e. $\frac{\partial f}{\partial x_i}(ax)=\frac{\partial f}{\partial x_i}(x)$. So $\frac{\partial f}{\partial x_i}$ is constant along each ray $l_y = \{ty\mid t> 0\}$, $y \in \mathbb R^n$.
If $\frac{\partial f}{\partial x_i}$ is moreover continuous, then its constant value along each ray extends continuously to the origin, so every ray carries the same value $\frac{\partial f}{\partial x_i}(\bar 0)$. Thus $\frac{\partial f}{\partial x_i}$ is constant everywhere, and all higher derivatives of $f$ vanish.
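The "constant along rays" claim can also be checked symbolically. A sympy sketch, again with the hypothetical order-$1$ homogeneous example $f = xy/\sqrt{x^2+y^2}$, verifying that $\frac{\partial f}{\partial x}$ is homogeneous of order $0$, i.e. unchanged under $(x,y)\mapsto(tx,ty)$ for $t>0$:

```python
import sympy as sp

x, y, t = sp.symbols('x y t', positive=True)
f = x*y/sp.sqrt(x**2 + y**2)  # homogeneous of order 1 (hypothetical example)

fx = sp.diff(f, x)
# Rescale the arguments: the partial derivative should be constant
# along each ray t -> (t*x, t*y).
on_ray = fx.subs({x: t*x, y: t*y}, simultaneous=True)
assert sp.simplify(on_ray - fx) == 0
```

Note that this example is not continuous up to the origin in the relevant sense (different rays carry different values of $\frac{\partial f}{\partial x}$), which is exactly why it escapes the "must be a polynomial" conclusion for smooth homogeneous functions.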