I wanted to prove by induction that all Sylvester matrices, defined below, have orthogonal columns. This seems trivial to see, but I wanted a rigorous proof.
$H_0 = \begin{bmatrix} 1 \end{bmatrix}$, $H_{i+1} = \begin{bmatrix} H_i & H_i\\ H_i & -H_i \end{bmatrix}$
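For concreteness, the recursion can be sketched in a few lines of NumPy (the function name `sylvester` is just for illustration):

```python
import numpy as np

def sylvester(i):
    """Build H_i by the recursion H_0 = [1], H_{i+1} = [[H, H], [H, -H]]."""
    H = np.array([[1]])
    for _ in range(i):
        H = np.block([[H, H], [H, -H]])
    return H

print(sylvester(2))  # the 4x4 Sylvester matrix
```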
For the base step I took the dot product of $[1,1]^T$ and $[1,-1]^T$, which is obviously $0$, so the columns of $H_1$ are orthogonal and the base case holds.
For the induction hypothesis I assumed that the statement holds for $H_{i+1}$,
and for the induction step I expanded $H_{i+2}$ recursively, first in terms of $H_{i+1}$ and then in terms of $H_i$, giving: $\begin{bmatrix} H_i & H_i & H_i & H_i\\ H_i & -H_i & H_i & -H_i\\ H_i & H_i & -H_i & -H_i\\ H_i & -H_i & -H_i & H_i \end{bmatrix}$ I concluded that, because the dot product of any two distinct columns is zero, the statement holds for every $i$. But is this rigorous enough? I feel like I did not correctly use the IH, and would love your opinion on it. Thank you for reading, and have a great day.
Base case: $H_0$ has orthogonal columns.
Induction hypothesis: suppose $H_i$ has orthogonal columns, i.e. $H_i^TH_i=D_i$ for some diagonal matrix $D_i$.
Now verify that $H_{i+1}$ has orthogonal columns:
\begin{align} H_{i+1}^TH_{i+1}&= \begin{bmatrix} H_i & H_i \\ H_i & -H_i \end{bmatrix}^T\begin{bmatrix} H_i & H_i \\ H_i & -H_i \end{bmatrix}\\ &=\begin{bmatrix} H_i^T & H_i^T \\ H_i^T & -H_i^T \end{bmatrix}\begin{bmatrix} H_i & H_i \\ H_i & -H_i \end{bmatrix}\\ &= \begin{bmatrix} 2H_i^TH_i &0 \\ 0 & 2H_i^TH_i\end{bmatrix} \\ &= 2\begin{bmatrix} D_i & 0 \\ 0 & D_i\end{bmatrix} \end{align}
The result is diagonal, hence $H_{i+1}$ has orthogonal columns (with $D_{i+1} = 2\begin{bmatrix} D_i & 0 \\ 0 & D_i\end{bmatrix}$).
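As a numerical sanity check of this conclusion (a sketch, not part of the proof): since every column of $H_i$ has $2^i$ entries equal to $\pm 1$, the argument above gives $H_i^TH_i = 2^i I$, which we can verify directly for small $i$:

```python
import numpy as np

def sylvester(i):
    # H_0 = [1]; H_{i+1} = [[H_i, H_i], [H_i, -H_i]]
    H = np.array([[1]])
    for _ in range(i):
        H = np.block([[H, H], [H, -H]])
    return H

for i in range(6):
    H = sylvester(i)
    G = H.T @ H  # Gram matrix: diagonal iff the columns are orthogonal
    assert np.array_equal(G, (2 ** i) * np.eye(2 ** i, dtype=int))
print("orthogonality verified for i = 0..5")
```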
Remark about your attempt: it is unclear to me why you use a two-step recursion; a single step from $H_i$ to $H_{i+1}$, as above, is where the induction hypothesis actually gets used.