I have to solve problems in which I use matrix properties to prove that one or two $3\times3$ matrices are equal to some third matrix, but I have no idea how to start. How can I improve?
For examples, see this link: http://ncerthelp.blogspot.in/2013/04/1-to-4-using-property-of-determinants.html
I'm not sure what you mean by "one or two $3\times3$ matrices are equal to some third matrix," but I will try to give you some extra tools and the intuition behind the problems in your link.
All of the problems in your link ask you to show that a determinant is zero. The determinant is the product of the eigenvalues (counted with multiplicity), so a zero determinant means the matrix has zero as an eigenvalue, which is equivalent to the matrix having a nontrivial null space (i.e. one of dimension greater than $0$). We can relate the dimension of the null space to the dimension of the column space by the rank–nullity theorem:
$$\dim \mathrm{Null}(A) + \dim \mathrm{Col}(A) = \dim V$$
where $V$ is the vector space we are working over (so $\dim V = n$ for an $n\times n$ matrix).
So from the rank–nullity theorem, we can deduce that the determinant of $A$ is zero if and only if the null space of $A$ is nontrivial, which happens if and only if the dimension of the column space is less than the dimension of the vector space we are working over.
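Spelled out for a $3\times3$ matrix $A$ with eigenvalues $\lambda_1, \lambda_2, \lambda_3$ (over $\mathbb{C}$, counted with multiplicity), this chain of equivalences reads:

$$\det A = \lambda_1 \lambda_2 \lambda_3 = 0 \iff \dim \mathrm{Null}(A) \geq 1 \iff \dim \mathrm{Col}(A) \leq 2.$$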
The dimension of the column space (of a square matrix, since that is the setting in which the determinant is defined) is less than the dimension of the vector space if and only if the columns are linearly dependent, which means that we can write (at least) one of them as a linear combination of the others.
So from my perspective, a much more intuitive way to approach these problems is to consider the column space and try to write one column as a linear combination of the others. If you can do this, the determinant must be $0$. From this viewpoint the first problem is trivial: Column 3 is defined as a linear combination of Columns 1 and 2. The linear combination is fairly easy to spot in the second problem as well.
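To illustrate with a generic matrix (not one of the problems from the link): if the third column is the sum of the first two, say

$$A = \begin{pmatrix} a & b & a+b \\ c & d & c+d \\ e & f & e+f \end{pmatrix},$$

then $C_3 = C_1 + C_2$, so the columns are linearly dependent and $\det A = 0$, with no cofactor expansion needed.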
Sometimes the linear dependence is not as easy to see as in the first two problems. In that case you can use the fact that standard column operations preserve the dimension of the column space, which is how the solutions provided by the website work. By using column operations to produce a column of zeros (which can always be written as a linear combination of the other columns, by taking all coefficients to be zero), they show that the dimension of the column space is less than $\dim V$, which forces the determinant to be zero.
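As a generic illustration of this style of argument (again, not a problem taken from the link): if the third column happens to be the sum of the first two, the operation $C_3 \to C_3 - C_1 - C_2$, which does not change the determinant, produces the column of zeros directly:

$$\begin{vmatrix} a & b & a+b \\ c & d & c+d \\ e & f & e+f \end{vmatrix} = \begin{vmatrix} a & b & 0 \\ c & d & 0 \\ e & f & 0 \end{vmatrix} = 0.$$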