Let $B = (v_1, v_2, \ldots, v_n)$ be a basis of a vector space $V$. Given an invertible $n \times n$ matrix $P$, prove that $BP$ is also a basis.
Here's my thought: we know that an invertible matrix is a product of elementary matrices. So right-multiplying $B$ by $P$ amounts to performing a sequence of elementary column operations on the tuple of basis vectors, and each vector of $BP$ is therefore a linear combination of the vectors of $B$. Since elementary operations are invertible, they preserve span and linear independence, so $BP$ is still a basis.
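As a small illustration of this idea (not a proof, and assuming for concreteness that $V = \mathbb{R}^3$ with $B$ identified with the matrix whose columns are the basis vectors), right-multiplying by an elementary matrix really does act as a column operation:

```python
import numpy as np

# Columns of B are basis vectors of R^3.
B = np.array([[1., 0., 2.],
              [0., 1., 3.],
              [0., 0., 1.]])

# Elementary matrix E: "add 5 times column 0 to column 2".
E = np.eye(3)
E[0, 2] = 5.0

BE = B @ E

# Right-multiplication by E performs that column operation on B ...
expected = B.copy()
expected[:, 2] += 5.0 * expected[:, 0]
assert np.allclose(BE, expected)

# ... and the result still has full rank, i.e. its columns are a basis.
assert np.linalg.matrix_rank(BE) == 3
```

Since every invertible $P$ factors into such elementary matrices, $BP$ is reached from $B$ by finitely many of these invertible column operations.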
Is this direction correct? I'll formalize it, of course, but I wanted opinions. Thanks!
HINT: Consider a linear combination of $BP$ that is zero:
$$\sum_{k=1}^n \lambda_kPv_k = 0.$$
Because $P$ is invertible, you may apply $P^{-1}$ to both sides. Do you think you can finish?
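A quick numeric sanity check of the hint (again not a proof, and again assuming $V = \mathbb{R}^n$ with $B$ a matrix of column vectors): with $P$ invertible the columns of $BP$ keep full rank, while a singular $P$ destroys independence.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Columns of B form a basis of R^n (we assert full rank to be safe).
B = rng.standard_normal((n, n))
assert np.linalg.matrix_rank(B) == n

# An invertible change-of-basis matrix P.
P = rng.standard_normal((n, n))
assert np.linalg.matrix_rank(P) == n

# Columns of BP are linear combinations of columns of B,
# and they still have full rank, i.e. they form a basis.
BP = B @ P
assert np.linalg.matrix_rank(BP) == n

# Conversely, a singular P destroys independence.
P_sing = P.copy()
P_sing[:, -1] = P_sing[:, 0]   # duplicate a column -> not invertible
assert np.linalg.matrix_rank(B @ P_sing) < n
```

The last assertion shows why invertibility is essential: without it, a vanishing combination with nonzero $\lambda_k$ exists and the argument breaks down.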