Let's say we're working within the framework of classical logic, and I want to prove a statement using only the laws of predicate logic (in which case we may define a step as an application of a single logical law). For example, to prove $(A\,\texttt{AND}\,B)\,\texttt{AND}\, C$ given $A$, $B$, and $C$, one needs only these steps:
A. [given]
B. [given]
C. [given]
From A, B: deduce A AND B. [CONJUNCTION INTRODUCTION]
From A AND B, C: deduce (A AND B) AND C. [CONJUNCTION INTRODUCTION]
QED!
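For tiny examples like this, one concrete (if brute-force) way to establish minimality is exhaustive search: treat each proof state as the set of formulas derived so far, and run a breadth-first search over all possible rule applications; the depth at which the goal first appears is the minimal number of steps. A minimal sketch in Python, assuming conjunction introduction is the only rule available (which suffices for this example):

```python
from collections import deque

def min_steps(givens, goal):
    """BFS over proof states. A state is a frozenset of derived
    formulas; one step applies conjunction introduction to any
    (ordered) pair of formulas already in the state. Returns the
    minimal number of steps needed to derive `goal`, or None."""
    start = frozenset(givens)
    if goal in start:
        return 0
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        state, depth = queue.popleft()
        for x in state:
            for y in state:
                new = ("AND", x, y)  # deduce x AND y
                if new == goal:
                    return depth + 1
                nxt = state | {new}
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, depth + 1))
    return None

# Goal: (A AND B) AND C, from givens A, B, C.
goal = ("AND", ("AND", "A", "B"), "C")
print(min_steps({"A", "B", "C"}, goal))  # → 2
```

Because BFS explores proofs in order of length, the first time it reaches the goal is a genuine proof of minimality, not just an upper bound. Of course, this only scales to very small rule sets and formulas; for richer logics the state space explodes, which is exactly why general lower-bound arguments are hard.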
But how can one prove, in an arbitrary situation, that $n$ is the minimal number of steps required to prove something? (I'm especially interested in a practical method for these kinds of elementary logic questions.)
Let me give an example outside the purview of classical logic. Say you want to prove that the row operation 'interchange two rows' is equivalent to a combination of 'multiply a row by a non-zero constant' and 'add a multiple of one row to another row'. You can do so in 4 steps, but how can I prove that 4 is the smallest number of steps for which such a proof is possible?
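For concreteness, here is one standard 4-step decomposition of a row swap (three row additions plus one scaling by $-1$), checked numerically in Python. This is a sketch of one particular sequence that works, not a claim about its uniqueness or minimality:

```python
def swap_via_elementary(M, i, j):
    """Swap rows i and j of matrix M (a list of lists) using only
    'add a multiple of a row to another row' (three times) and
    'multiply a row by a non-zero constant' (once)."""
    M = [row[:] for row in M]  # work on a copy
    M[i] = [a + b for a, b in zip(M[i], M[j])]  # step 1: R_i <- R_i + R_j
    M[j] = [b - a for a, b in zip(M[i], M[j])]  # step 2: R_j <- R_j - R_i  (R_j is now -old R_i)
    M[i] = [a + b for a, b in zip(M[i], M[j])]  # step 3: R_i <- R_i + R_j  (R_i is now old R_j)
    M[j] = [-x for x in M[j]]                   # step 4: R_j <- -1 * R_j   (R_j is now old R_i)
    return M

A = [[1, 2], [3, 4]]
print(swap_via_elementary(A, 0, 1))  # → [[3, 4], [1, 2]]
```

A minimality argument here would have to rule out every 3-step combination of the two allowed operations, e.g. by reasoning about the determinants or the structure of the corresponding elementary matrices.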