Let $\mathcal C$ be a (linear) $[n, k]_2$ code, $(A_i)$ the weight distribution of $\mathcal C$, and $(A_i')$ the weight distribution of $\mathcal C^\bot$. I now want to prove that:
$$\displaystyle \sum_{i=0}^n \pmatrix{i \\ r} \frac{A_i}{2^k} = \frac 1{2^r} \sum_{i=0}^r (-1)^i \pmatrix{n - i \\ r - i} A_i' \tag{1}$$
for $r = 0, 1, \dots, n$, and that
$$\displaystyle \sum_{i=0}^n \pmatrix{i \\ r} \frac{A_i}{2^k} = \frac{1}{2^r} \pmatrix{n \\ r} \tag{2} $$
for $r = 0, 1, \dots, d(\mathcal C^\bot) - 1$.
Now my first idea was: I already know the MacWilliams identity, and I can also use (without proof) the fact that
$$ A_s' = \frac{1}{|\mathcal C|} \sum_{i=0}^n \sum_{j=0}^s (-1)^j (q - 1)^{s - j} \pmatrix{i \\ j} \pmatrix{n - i \\ s - j} A_i \tag{3} $$
for $s = 0, \dots, n$ (writing $s$ for the index to avoid a clash with the dimension $k$), which already "looks similar" to the formulas I want to show. So what I tried next was writing down the sum over all the $A_s'$ using formula $(3)$, but that turned out quite messy with the resulting triple sum, and I couldn't spot an obvious way to simplify it (nor where the parameter $r$ would come into play). So I'm not sure whether this is the right path (and if so, how to continue), or whether the desired formulas can be derived from something else.
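(To at least convince myself that I'm reading $(3)$ correctly, here is a small brute-force check in Python for $q = 2$, using the $[3,1]$ repetition code, whose weight distributions are easy to list by hand; the variable names are mine:)

```python
from math import comb

# Sanity check of formula (3) for q = 2 on the [3,1] repetition code.
# C = {000, 111} and its dual C⊥ = {000, 011, 101, 110} (even-weight code),
# so both weight distributions can be listed by hand:
n, dim, q = 3, 1, 2
A  = [1, 0, 0, 1]   # (A_i)  for C
Ad = [1, 0, 3, 0]   # (A_i') for C⊥

# Formula (3), with |C| = q**dim and s as the outer index
# (math.comb(i, j) conveniently returns 0 whenever j > i):
for s in range(n + 1):
    total = sum((-1) ** j * (q - 1) ** (s - j) * comb(i, j) * comb(n - i, s - j) * A[i]
                for i in range(n + 1) for j in range(s + 1))
    assert total / q ** dim == Ad[s]   # matches the listed dual distribution
```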
Posting the solution I eventually arrived at here, in case anyone is faced with the same problem.
Regarding $(1)$. For an $[n, k]$ code, we have $|\mathcal{C}| = 2^k$ and $|\mathcal{C}^\bot| = 2^{n-k}$ (where $\mathcal{C}^\bot$ denotes the dual code). Now the MacWilliams identity gives us:
$W_\mathcal{C}(X, Y) = \frac 1{|\mathcal{C}^\bot|} W_{\mathcal{C}^\bot} (X + Y, X - Y)$
Setting $ X = 1$, we get:
$W_\mathcal{C}(1, Y) = \frac 1{|\mathcal{C}^\bot|} W_{\mathcal{C}^\bot} (1 + Y, 1 - Y)$
$\Rightarrow \sum_{i=0}^n A_i Y^i = \frac 1{2^{n - k}} \sum_{i=0}^n A_i' (1 + Y)^{n - i} (1 - Y)^{i} \tag{*} $
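(Before differentiating, it's worth spot-checking $(*)$ numerically. A minimal Python sketch, with the $[3,1]$ repetition code and its dual as a hand-computed example:)

```python
# Spot-check of (*) for the [3,1] repetition code:
# LHS: sum_i A_i Y^i,  RHS: 2^{-(n-k)} sum_i A_i' (1+Y)^{n-i} (1-Y)^i
n, k = 3, 1
A  = [1, 0, 0, 1]   # weight distribution of C = {000, 111}
Ad = [1, 0, 3, 0]   # weight distribution of the dual (even-weight) code

def lhs(Y):
    return sum(A[i] * Y ** i for i in range(n + 1))

def rhs(Y):
    return sum(Ad[i] * (1 + Y) ** (n - i) * (1 - Y) ** i
               for i in range(n + 1)) / 2 ** (n - k)

# Two polynomials of degree <= n agreeing on n+1 sample points are equal:
for Y in [-2.0, -0.5, 0.0, 0.7, 1.0, 3.0]:
    assert abs(lhs(Y) - rhs(Y)) < 1e-9
```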
Differentiating the RHS of $(*)$ $r$ times with respect to $Y$ leads us to:
$\frac 1{2^{n-k}} \left(\sum_{i=0}^n A_i' (1 + Y)^{n - i} (1 - Y)^i\right)^{(r)} $
$ = \frac 1{2^{n-k}} \sum_{i=0}^n A_i' \left((1 + Y)^{n - i} (1 - Y)^i\right)^{(r)} $
Applying the Leibniz rule, we get:
$ = \frac 1{2^{n-k}} \sum_{i=0}^n A_i' \sum_{l=0}^r \pmatrix{r \\ l} \left((1 + Y)^{n - i}\right)^{(l)} \left((1 - Y)^i\right)^{(r - l)} $
$ = \frac 1{2^{n-k}} \sum_{i=0}^n A_i' \sum_{l=0}^r \pmatrix{r \\ l} \frac{(n - i)!}{(n - i - l)!} (1 + Y)^{n - i - l} \cdot (-1)^{r - l} \frac{i!}{(i - r + l)!} (1 - Y)^{i - r + l} $
Setting $Y = 1$, every term containing a positive power of $(1 - Y)$ vanishes, so in each inner sum only the term with $l = r - i$ survives (which requires $i \le r$), and we get:
$ = \frac 1{2^{n-k}} \sum_{i=0}^r A_i' \pmatrix{r \\ r - i} \frac{(n - i)!}{(n - r)!} \, 2^{n - r} \, (-1)^{i} \, i! $
which simplifies to, using $\pmatrix{r \\ r - i} i! \, \frac{(n - i)!}{(n - r)!} = r! \pmatrix{n - i \\ r - i}$ and $\frac{2^{n-r}}{2^{n-k}} = \frac{2^k}{2^r}$:
$ = \frac{2^k}{2^r} \sum_{i=0}^r A_i' (-1)^i \pmatrix{n - i \\ r - i} r! \tag{4} $
which already "looks good" compared to the formula $(1)$ that we want to reach. Now if we look again at $(*)$ and differentiate the LHS $r$ times as well, we get:
$W_\mathcal{C}^{(r)}(1, Y) = \left(\sum_{i=0}^n A_i Y^i\right)^{(r)} = \sum_{i=0}^n A_i \frac{i!}{(i-r)!} Y^{i - r} $
Setting $Y = 1$ (and reading $\frac{i!}{(i-r)!}$ as the falling factorial $i(i-1)\cdots(i-r+1)$, which vanishes for $i < r$), we get:
$= \sum_{i=0}^n A_i \frac{i!}{(i - r)!} \tag{5}$
Now we have differentiated both sides of $(*)$ $r$ times and set $Y = 1$ in both, hence $(4)$ and $(5)$ are equal. Therefore we get:
$\sum_{i=0}^n A_i \frac{i!}{(i - r)!} = \frac{2^k}{2^r} \sum_{i=0}^r A_i' (-1)^i \pmatrix{n - i \\ r - i} r! $
$\iff \sum_{i=0}^n \frac{i!}{r! (i - r)!} A_i \frac{1}{2^k} = \frac 1{2^r} \sum_{i=0}^r (-1)^i \pmatrix{n - i \\ r - i} A_i'$
$\iff \sum_{i=0}^n \pmatrix{i \\ r} \frac{A_i}{2^k} = \frac 1{2^r} \sum_{i=0}^r (-1)^i \pmatrix{n - i \\ r - i} A_i'$
which is exactly the first equation we wanted to show.
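(As a numeric sanity check of $(1)$: the following Python sketch brute-forces the weight distributions of the $[3,1]$ repetition code and its dual from generator matrices; the helper `weight_dist` is mine, not part of the derivation:)

```python
from itertools import product
from math import comb

def weight_dist(gen, n):
    """Brute-force weight distribution of the binary code spanned by `gen`."""
    A = [0] * (n + 1)
    for coeffs in product([0, 1], repeat=len(gen)):
        word = [sum(c * row[j] for c, row in zip(coeffs, gen)) % 2 for j in range(n)]
        A[sum(word)] += 1
    return A

# [3,1] repetition code and its dual, the [3,2] even-weight code
n, k = 3, 1
A  = weight_dist([[1, 1, 1]], n)                # [1, 0, 0, 1]
Ad = weight_dist([[1, 1, 0], [0, 1, 1]], n)     # [1, 0, 3, 0]

for r in range(n + 1):                          # identity (1) for every r
    lhs = sum(comb(i, r) * A[i] for i in range(n + 1)) / 2 ** k
    rhs = sum((-1) ** i * comb(n - i, r - i) * Ad[i] for i in range(r + 1)) / 2 ** r
    assert lhs == rhs
```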
For $(2)$, let's take a look at the sum on the RHS of $(1)$. For $1 ≤ i ≤ d(\mathcal C^\bot) - 1$, the $A_i'$ are equal to $0$, and $A_0' = 1$. So if we restrict to $r ≤ d(\mathcal{C}^\bot) - 1$, the RHS of $(1)$ collapses to its $i = 0$ term:
$ \frac 1{2^r} (-1)^0 \pmatrix{n - 0 \\ r - 0} A_0' = \frac 1{2^r} \pmatrix{n \\ r}$
Hence we get $(2)$.
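(And a check of $(2)$ on a slightly bigger example, the $[7,4]$ Hamming code, whose dual is the $[7,3]$ simplex code with $d(\mathcal C^\bot) = 4$. The generator and parity-check matrices below are one standard choice, and the brute-force helper is mine:)

```python
from itertools import product
from math import comb

def weight_dist(gen, n):
    """Brute-force weight distribution of the binary code spanned by `gen`."""
    A = [0] * (n + 1)
    for coeffs in product([0, 1], repeat=len(gen)):
        word = [sum(c * row[j] for c, row in zip(coeffs, gen)) % 2 for j in range(n)]
        A[sum(word)] += 1
    return A

# [7,4] Hamming code G = [I | P]; its dual is generated by H = [P^T | I]
n, k = 7, 4
G = [[1,0,0,0,0,1,1], [0,1,0,0,1,0,1], [0,0,1,0,1,1,0], [0,0,0,1,1,1,1]]
H = [[0,1,1,1,1,0,0], [1,0,1,1,0,1,0], [1,1,0,1,0,0,1]]
A, Ad = weight_dist(G, n), weight_dist(H, n)
d_dual = min(i for i in range(1, n + 1) if Ad[i] > 0)   # minimum distance of C⊥

for r in range(d_dual):                                 # r = 0, ..., d(C⊥) - 1
    lhs = sum(comb(i, r) * A[i] for i in range(n + 1)) / 2 ** k
    assert lhs == comb(n, r) / 2 ** r                   # identity (2)

# ... and the restriction on r is really needed: r = d(C⊥) already fails
r = d_dual
lhs = sum(comb(i, r) * A[i] for i in range(n + 1)) / 2 ** k
assert lhs != comb(n, r) / 2 ** r
```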