Let's define $f(x)$ as the number of sequences with sum $x$ and gcd (greatest common divisor) $1$, and $g(x)$ as the number of sequences with sum $x$. I know that $g(x) = 2^{x-1}$, but in one place I saw the relation $$g(x) = \sum_{y \mid x} f(y),$$ where $y$ ranges over the divisors of $x$.
How should I understand this relation? It seems very strange to me.
Take any given sequence that is counted in $g(x)$. It will have some $\gcd$, say $k$, which must divide the sum $x$. Divide each term by $k$ to get a sequence with $\gcd 1$; that sequence sums to $\frac xk$. Conversely, multiplying every term of a gcd-$1$ sequence summing to $\frac xk$ by $k$ recovers the original sequence, so this is a bijection.
So you just categorize all the sequences counted in $g(x)$ by their $\gcd$'s. Each category is counted by $f(y)$ for some divisor $y = \frac xk$ of $x$, and summing over the divisors gives $g(x)$.
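The identity above is easy to check by brute force. Here is a small sketch that interprets "sequences with sum $x$" as compositions of $x$ (ordered tuples of positive integers, consistent with $g(x) = 2^{x-1}$); the function names `f`, `g`, and `compositions` are just illustrative choices:

```python
from math import gcd
from functools import reduce

def compositions(n):
    """Yield every ordered tuple of positive integers summing to n."""
    if n == 0:
        yield ()
        return
    for first in range(1, n + 1):
        for rest in compositions(n - first):
            yield (first,) + rest

def f(n):
    """Count compositions of n whose gcd is 1."""
    return sum(1 for c in compositions(n) if reduce(gcd, c) == 1)

def g(n):
    """Count all compositions of n (should equal 2^(n-1))."""
    return sum(1 for _ in compositions(n))

for x in range(1, 11):
    divisor_sum = sum(f(y) for y in range(1, x + 1) if x % y == 0)
    # Both closed form and the divisor-sum relation should match g(x).
    assert g(x) == 2 ** (x - 1) == divisor_sum
```

For example, the four compositions of $3$ are $(3), (1,2), (2,1), (1,1,1)$; only $(3)$ has gcd $3 > 1$, so $f(3) = 3$, and $f(1) + f(3) = 1 + 3 = 4 = g(3)$.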