Descending chain condition on a finite-dimensional algebra


In a proof I'm reading, the author says "As $A$ is finite dimensional, a descending chain of left ideals must stabilize."

The context is that $A$ is a finite-dimensional simple $k$-algebra, i.e. it contains no non-trivial two-sided ideals.
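For reference, here is my own unpacking of the dimension count the quoted claim seems to rely on (this is not taken from the proof itself):

```latex
% Every left ideal of A is in particular a k-subspace of A,
% so a strictly descending chain of left ideals
%   I_1 \supsetneq I_2 \supsetneq I_3 \supsetneq \cdots
% gives a strictly decreasing sequence of dimensions:
\[
  \dim_k A \;\ge\; \dim_k I_1 \;>\; \dim_k I_2 \;>\; \cdots \;\ge\; 0.
\]
% A strictly decreasing sequence of non-negative integers is finite,
% so the chain must stabilize after at most \dim_k A steps.
```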

Could somebody explain why being finite-dimensional means that $A$ contains a minimal left ideal? I feel like I'm missing something obvious.

Thanks for any replies.