Compare the number of multiplications used by each of the two algorithms below to evaluate $a^{2^n}$.
The first algorithm requires $2^n$ multiplications, while the second requires only $n$.
Question 1: I am trying to understand why the first algorithm does $2^n$ multiplications while the second does only $n$. When I try to count the multiplications myself, I find that the first algorithm does the following:
$$n = 1\longrightarrow (a\cdot 1),\quad 1 \text{ multiplication}$$ $$n = 2\longrightarrow a\cdot (a\cdot 1),\quad 2 \text{ multiplications}$$ $$\cdots$$ $$n = k\longrightarrow a\cdots(a\cdot 1),\quad k \text{ multiplications}$$
So how does the first algorithm end up doing $2^n$ multiplications?
Question 2: Similarly, for the second algorithm, $$n = 1\longrightarrow (a\cdot a),\quad 1 \text{ multiplication}$$ $$n = 2\longrightarrow (a\cdot a)^2=(a\cdot a)(a\cdot a),\quad 3 \text{ multiplications}$$ $$n = 3\longrightarrow ((a\cdot a)(a\cdot a))^2=(a\cdot a)(a\cdot a)(a\cdot a)(a\cdot a),\quad 7 \text{ multiplications}$$ $$\cdots$$
So I am not sure why the second algorithm does only $n$ multiplications.


The first algorithm is essentially $$\underbrace{a \cdot \ldots \cdot a}_{2^n \text{ times}},$$ i.e. it computes $a^m$ using $m$ multiplications, and here the exponent is $m = 2^n$, not $n$. Your count of $k$ multiplications is the cost of computing $a^k$, so for $a^{2^n}$ that count becomes $2^n$. The second algorithm does $$\overbrace{(\ldots((a^2)^2)\ldots)^2}^{n \text{ squarings}},$$ so you multiply the current result by itself $n$ times (not by $a$). In your count for Question 2 you expanded each square back into a product of copies of $a$, but the algorithm reuses the previously computed result: each squaring costs exactly one multiplication, no matter how large the intermediate value has grown. That's the difference.
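A quick way to see the difference is to instrument both algorithms with a multiplication counter. Here is a small Python sketch (the names `naive_power` and `square_power` are mine, not from the original listings, but each loop body performs exactly one multiplication, so the counters match the analysis above):

```python
def naive_power(a, n):
    """Compute a**(2**n) by repeatedly multiplying by a.

    Returns (result, number of multiplications performed).
    """
    result = 1
    count = 0
    for _ in range(2 ** n):
        result *= a  # one multiplication by a per iteration
        count += 1
    return result, count


def square_power(a, n):
    """Compute a**(2**n) by repeated squaring.

    Each squaring multiplies the current result by itself, so it
    costs one multiplication regardless of how large the
    intermediate value has grown.
    """
    result = a
    count = 0
    for _ in range(n):
        result = result * result  # one multiplication per squaring
        count += 1
    return result, count


# Both compute a**(2**n), but with 2**n versus n multiplications:
print(naive_power(3, 4))   # (3**16, 16)
print(square_power(3, 4))  # (3**16, 4)
```

Running this for $a = 3$, $n = 4$ gives the same value $3^{16}$ from both functions, with 16 multiplications for the first and only 4 for the second.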