Algebraic Manipulation for Mathematical Induction


I'm working on a mathematical induction problem. The question is as follows:

$P = \begin{pmatrix} 1-A & A \\ B & 1-B \\ \end{pmatrix}$

for $A, B \in (0, 1)$. Show, by induction or otherwise, that $P^n = \frac{1}{A+B} \begin{pmatrix} B & A \\ B & A \\ \end{pmatrix} + \frac{(1-A-B)^n}{A+B} \begin{pmatrix} A & -A \\ -B & B \\ \end{pmatrix}$

for any $n \in \Bbb N$.

I understand how induction works, but I'm lost in the algebraic manipulation. So far I've verified the base case by substituting $n = 1$ into the formula. When it comes to the inductive step for $n = k + 1$, however, I get lost in the algebra. I know I have to show something like

$P^k = \frac{1}{A+B} \begin{pmatrix} B & A \\ B & A \\ \end{pmatrix} + \frac{(1-A-B)^k}{A+B} \begin{pmatrix} A & -A \\ -B & B \\ \end{pmatrix} = \begin{pmatrix} 1-A & A \\ B & 1-B \\ \end{pmatrix}^k$

$P^{k+1} = \frac{1}{A+B} \begin{pmatrix} B & A \\ B & A \\ \end{pmatrix} + \frac{(1-A-B)^{k+1}}{A+B} \begin{pmatrix} A & -A \\ -B & B \\ \end{pmatrix} = \begin{pmatrix} 1-A & A \\ B & 1-B \\ \end{pmatrix}^{k+1} = \begin{pmatrix} 1-A & A \\ B & 1-B \\ \end{pmatrix}P^k = \begin{pmatrix} 1-A & A \\ B & 1-B \\ \end{pmatrix}\left[\frac{1}{A+B}\begin{pmatrix} B & A \\ B & A \\ \end{pmatrix} + \frac{(1-A-B)^k}{A+B} \begin{pmatrix} A & -A \\ -B & B \\ \end{pmatrix}\right]$
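Before wrestling with the symbols, it can be reassuring to check the claimed closed form numerically. The sketch below (my own addition; the values $A = 0.3$, $B = 0.5$ are arbitrary samples in $(0,1)$) compares the formula against direct matrix powers:

```python
import numpy as np

A, B = 0.3, 0.5  # arbitrary sample values in (0, 1)
P = np.array([[1 - A, A],
              [B, 1 - B]])

def closed_form(n):
    """Claimed formula: P^n = (M1 + (1-A-B)^n * M2) / (A + B)."""
    M1 = np.array([[B, A], [B, A]])
    M2 = np.array([[A, -A], [-B, B]])
    return (M1 + (1 - A - B) ** n * M2) / (A + B)

# Compare against repeated matrix multiplication for several n.
for n in range(1, 8):
    assert np.allclose(np.linalg.matrix_power(P, n), closed_form(n))
print("closed form matches P^n for n = 1..7")
```

This doesn't prove anything, of course, but it confirms the target formula is worth chasing through the algebra.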

Any help would be greatly appreciated


Accepted answer

Show us what you get if you carry out the computation on the right-hand side of your last equation: distribute the matrix multiplication over the matrix addition, then perform each multiplication. You can pull the scalar factors outside while you do this and account for them afterwards.
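As a sketch of where that computation leads (my own working, not spelled out in the answer above), multiplying $P$ into each of the two matrices separately gives

$$P \begin{pmatrix} B & A \\ B & A \\ \end{pmatrix} = \begin{pmatrix} (1-A)B + AB & (1-A)A + A^2 \\ B^2 + (1-B)B & AB + (1-B)A \\ \end{pmatrix} = \begin{pmatrix} B & A \\ B & A \\ \end{pmatrix},$$

$$P \begin{pmatrix} A & -A \\ -B & B \\ \end{pmatrix} = \begin{pmatrix} A(1-A-B) & -A(1-A-B) \\ -B(1-A-B) & B(1-A-B) \\ \end{pmatrix} = (1-A-B)\begin{pmatrix} A & -A \\ -B & B \\ \end{pmatrix},$$

so the first matrix is left unchanged by $P$ while the second picks up a factor of $(1-A-B)$. Substituting these back in yields

$$P^{k+1} = \frac{1}{A+B} \begin{pmatrix} B & A \\ B & A \\ \end{pmatrix} + \frac{(1-A-B)^{k+1}}{A+B} \begin{pmatrix} A & -A \\ -B & B \\ \end{pmatrix},$$

which is exactly the claimed formula with $n = k+1$.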