Given the following matrix, find an approximation of the largest eigenvalue. $$ A = \begin{bmatrix} 3 & 2 \\ 7 & 5 \\ \end{bmatrix} $$
And I was also given $$\vec x= \begin{bmatrix} 1 \\ 0 \\ \end{bmatrix} $$
My professor solves this by computing the slopes of $A\vec x = \vec b_1$, $A^2 \vec x = \vec b_2$, $A^3 \vec x = \vec b_3$, and so on, until the slopes of the $\vec b_i$ converge to a common value. Once we have the approximate $\vec b$, he plugs it into $A \vec b = \lambda \vec b$, and the corresponding $\lambda$ is the largest eigenvalue.
Since the slope is $\frac yx$, this works fine for a $2 \times 2$ matrix. But how do I apply this method to a bigger matrix?
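For reference, here is a short sketch (my own code, not my professor's) of the slope-based iteration for the $2 \times 2$ case, using the $A$ and $\vec x$ above. The tolerance `1e-12` is an arbitrary choice for "the slopes have converged":

```python
import numpy as np

# The matrix and starting vector from the question.
A = np.array([[3.0, 2.0],
              [7.0, 5.0]])
b = np.array([1.0, 0.0])

# Repeatedly apply A and watch the slope y/x of the resulting vectors.
prev_slope = None
for _ in range(50):
    b = A @ b
    slope = b[1] / b[0]
    if prev_slope is not None and abs(slope - prev_slope) < 1e-12:
        break
    prev_slope = slope

# The converged direction is (1, slope); plug it into A b = lambda b
# and read lambda off the first component: lambda = 3 + 2 * slope.
lam = (A @ np.array([1.0, slope]))[0]
print(slope, lam)
```

For this matrix the slopes settle very quickly, and `lam` matches the exact largest eigenvalue $4 + \sqrt{15} \approx 7.873$.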
Here's a hint: you want to determine when $\mathbf{b}_n$ is nearly a scalar multiple of $\mathbf{b}_{n-1}$. In $\mathbb{R}^2$, (nonzero) vectors are scalar multiples of one another iff their slopes are equal. A possibly more useful characterization is that two nonzero vectors $\mathbf{v}$ and $\mathbf{w}$ are scalar multiples of one another if and only if
$$\hat{\mathbf{v}} = \pm\hat{\mathbf{w}},$$
where
$$\hat{\mathbf{v}} = \frac{\mathbf{v}}{|\mathbf{v}|},$$
which extends naturally to higher dimensions.
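In code, the normalized version of the iteration (a sketch of the standard power method, with a hypothetical tolerance of `1e-12`) might look like this; it works for a matrix of any size, since it compares unit vectors rather than slopes:

```python
import numpy as np

# Same A and x as in the question, but the loop below is dimension-agnostic.
A = np.array([[3.0, 2.0],
              [7.0, 5.0]])
x = np.array([1.0, 0.0])

b = x / np.linalg.norm(x)  # b-hat: the unit vector in the direction of x
for _ in range(100):
    Ab = A @ b
    b_next = Ab / np.linalg.norm(Ab)  # renormalize each iterate
    # Converged when b_next equals b up to sign, i.e. b-hat = +/- b_next-hat.
    if min(np.linalg.norm(b_next - b), np.linalg.norm(b_next + b)) < 1e-12:
        b = b_next
        break
    b = b_next

# With A b ~ lambda b and b a unit vector, the Rayleigh quotient
# b . (A b) recovers lambda.
lam = b @ (A @ b)
print(lam)
```

Renormalizing at every step also keeps the iterates from overflowing, which the raw $A^n \vec x$ eventually would for a dominant eigenvalue larger than $1$ in magnitude.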