I have pondered this question for a while while drinking some Chai spice tea (yes the double use of the word "while" makes sense):
Suppose $x = 5678$ and $y = 1234$. Now let us label each of the parts of the numbers as follows: $$x = \underbrace{56}_{a} \underbrace{78}_{b}$$ $$y = \underbrace{12}_{c} \underbrace{34}_{d}$$
Write $x= 10^{n/2}a+b$ and $y = 10^{n/2}c+d$. Then $$xy = 10^{n}ac+10^{n/2}(ad+bc)+bd$$
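The identity is easy to check numerically. A minimal sketch with the example values (here $n = 4$, so $10^{n/2} = 100$):

```python
# Check that xy = 10^n*ac + 10^(n/2)*(ad + bc) + bd
# for x = 5678, y = 1234, i.e. a = 56, b = 78, c = 12, d = 34.
a, b = 56, 78   # the two halves of x = 5678
c, d = 12, 34   # the two halves of y = 1234
n = 4

lhs = 5678 * 1234
rhs = 10**n * a * c + 10**(n // 2) * (a * d + b * c) + b * d
print(lhs, rhs, lhs == rhs)  # 7006652 7006652 True
```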
Where does recursion play a role in this? For recursion to work, we would need a base case. Also, is it possible to represent this in matrix form? For example:
$$ \begin{bmatrix}a & b \\ c & d \end{bmatrix} = \begin{bmatrix} 56 & 78 \\ 12 & 34 \end{bmatrix}$$
The $ad+bc$ term reminded me of the matrix determinant for a $2 \times 2$ matrix except the terms are added.
How do we compute the terms $ac$, $ad+bc$, and $bd$ using recursion?
Your algorithm computes the product of $x$ and $y$, both $n$-digit numbers. You divide each of $x$ and $y$ into two $n/2$-digit halves, giving the four numbers $a, b, c, d$, and find that you need the products $ac, ad, bc, bd$. Each of these is a product of $n/2$-digit numbers, so you can use your multiplication algorithm recursively to compute them.
In other words, start with two $n$-digit numbers (pad with zeros to make $n$ a power of two). Their product requires $4$ products of $n/2$-digit numbers, each of which requires $4$ products of $n/4$-digit numbers, and so on. You'll stop the recursion when you get down to numbers that can be multiplied directly; for instance, you could always stop when $n=1$. Counting only the multiplications (since the additions and shifts take less time than the multiplications), the running time satisfies the recurrence $$ T(n)=\begin{cases} 1 & \text{if $n=1$}\\ 4T(n/2) & \text{if $n>1$} \end{cases} $$ which, unfortunately, has the solution $T(n)=n^2$, no faster than if we'd just used naive multiplication.
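The four-product recursion above can be sketched as follows (a minimal sketch, assuming $n$ is a power of two; `mult4` is a hypothetical name):

```python
def mult4(x: int, y: int, n: int) -> int:
    """Multiply two n-digit numbers via four recursive half-size products."""
    if n == 1:                 # base case: single-digit multiply
        return x * y
    half = 10 ** (n // 2)
    a, b = divmod(x, half)     # x = 10^(n/2)*a + b
    c, d = divmod(y, half)     # y = 10^(n/2)*c + d
    ac = mult4(a, c, n // 2)
    ad = mult4(a, d, n // 2)
    bc = mult4(b, c, n // 2)
    bd = mult4(b, d, n // 2)
    return half * half * ac + half * (ad + bc) + bd

print(mult4(5678, 1234, 4))  # 7006652
```

Since the base case multiplies exactly, correctness follows directly from the identity $xy = 10^n ac + 10^{n/2}(ad+bc) + bd$; the recursion just spells out where the four half-size products come from.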
The real key to this multiplication algorithm is that you actually only need to perform three products. Since $$(a+b)(c+d) = ac+ad+bc+bd $$ we can compute $$P_1=ac,\quad P_2 = bd,\quad P_3 =(a+b)(c+d) $$ and then $$ ad+bc=P_3-P_1-P_2 $$ This means that if we do this recursively we'll only need $3$ multiplications of $n/2$-digit numbers, giving a running time of about $n^{\log_2 3}\approx n^{1.585}$, which is asymptotically much better than $n^2$.
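Here is the three-product version as a sketch, under the same power-of-two assumption on $n$ (`karatsuba` is a hypothetical name):

```python
def karatsuba(x: int, y: int, n: int) -> int:
    """Multiply two n-digit numbers using three recursive products."""
    if n == 1:                 # base case: multiply directly
        return x * y
    half = 10 ** (n // 2)
    a, b = divmod(x, half)     # x = 10^(n/2)*a + b
    c, d = divmod(y, half)     # y = 10^(n/2)*c + d
    p1 = karatsuba(a, c, n // 2)          # ac
    p2 = karatsuba(b, d, n // 2)          # bd
    p3 = karatsuba(a + b, c + d, n // 2)  # (a+b)(c+d)
    # ad + bc = p3 - p1 - p2, so the product needs only p1, p2, p3.
    return half * half * p1 + half * (p3 - p1 - p2) + p2

print(karatsuba(5678, 1234, 4))  # 7006652
```

Note that $a+b$ and $c+d$ can carry into an extra digit; this sketch stays correct anyway because the base case multiplies exactly, though a careful implementation would account for that extra digit when analyzing the running time.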
As a side note, you can do even better by splitting your numbers into three parts rather than two, using five products rather than the nine a naive three-way split would need, though now the computational overhead starts to eat into the running time, so this only saves time on really long numbers. There are, by the way, even faster ways to multiply.