Let $A$ be a commutative ring, $M$ an $A$-module, and $N_\alpha\subset M$ a family of submodules. Consider the intersection $$\bigcap_\alpha N_\alpha.$$ We say that the intersection is irredundant if $\bigcap_{\beta\neq\alpha}N_\beta\not\subset N_\alpha$ for all $\alpha$.
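A small concrete example to fix ideas (my own illustration, not part of the question): take $A = M = \mathbb{Z}$ and the submodules $(2)$, $(3)$, $(6)$. Then $$(2)\cap(3)\cap(6) = (6) = (2)\cap(3),$$ so $(6)$ is redundant; discarding it leaves the irredundant intersection $(2)\cap(3)$, since $(2)\not\subset(3)$ and $(3)\not\subset(2)$.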
When the family is finite, one can always make the intersection irredundant by "throwing away" redundant submodules. But can one also do this for infinite intersections? It is conceivable that one could throw away infinitely many $N_\alpha$ and yet never reach an irredundant intersection. Is there a way to obtain an irredundant intersection using a Zorn-type argument? Or is there a reasonably innocuous assumption, such as a Noetherian-type condition, which makes it possible?
Consider an infinite, strictly decreasing chain of (nonzero) submodules. If you begin to ask "which ones are redundant?", you inevitably arrive at the answer "all of them!" Indeed, for a chain with no minimum, removing any single member does not change the intersection, so every member is redundant. For every chain to have a minimal element, the module must be at least Artinian.
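For a concrete instance of such a chain, take $A = M = \mathbb{Z}$ and $N_n = 2^n\mathbb{Z}$ for $n \ge 1$. Then $$\bigcap_{n\ge 1} N_n = 0,$$ and for each $n$ we have $\bigcap_{m\neq n} N_m = 0 \subset N_n$, so every $N_n$ is redundant. Yet every finite subfamily has intersection $2^k\mathbb{Z} \neq 0$ for some $k$, and every infinite subfamily is again such a chain, so no amount of discarding ever yields an irredundant intersection with the same value.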
On the other hand, if you assume the Artinian condition holds, you can conclude that any intersection of infinitely many submodules already equals the intersection of finitely many of them, and then weed out any remaining redundancy from that finite set by hand.
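Here is a sketch of that reduction to a finite intersection. Consider the set of all finite sub-intersections $$\mathcal{F} = \{\,N_{\alpha_1}\cap\cdots\cap N_{\alpha_k} : k \ge 1\,\}.$$ Since $M$ is Artinian, the nonempty set $\mathcal{F}$ has a minimal element $N_0 = N_{\alpha_1}\cap\cdots\cap N_{\alpha_k}$. For any index $\alpha$, we have $N_0 \cap N_\alpha \in \mathcal{F}$ and $N_0 \cap N_\alpha \subset N_0$, so minimality forces $N_0 \cap N_\alpha = N_0$, i.e. $N_0 \subset N_\alpha$. Hence $$N_0 = \bigcap_\alpha N_\alpha,$$ and the full intersection equals the finite one.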
So you see, the descending chain condition is what you seek.