I need an algorithm for finding only the linearly independent rows in a binary matrix, using the XOR operation.
Example 1:

The result:

Example 2:

The result:

R4 is not included because:

If you do row reduction on $A^T$ (the transpose of $A$), the columns of the reduced matrix that contain the leading $1$s of the rows identify a maximal set of linearly independent columns of $A^T$. The corresponding rows of $A$ form a maximal set of linearly independent rows of $A$. (The reason for transposing is that row reduction preserves column dependencies, but not row dependencies.)
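A sketch of this approach in Python (my own illustration, not from the answer above): row-reduce $A^T$ over GF(2), where adding one row to another is just elementwise XOR, and record which columns of $A^T$ receive a leading $1$. Those column indices are the indices of the independent rows of $A$.

```python
def independent_rows(A):
    """Return indices of a maximal linearly independent set of rows of A over GF(2).

    A is a list of equal-length lists of 0s and 1s.
    """
    if not A:
        return []
    n_rows, n_cols = len(A), len(A[0])
    # Build A^T: it has n_cols rows and n_rows columns.
    T = [[A[i][j] for i in range(n_rows)] for j in range(n_cols)]

    pivots = []  # columns of T with a leading 1 = independent rows of A
    r = 0        # next pivot row in T
    for c in range(n_rows):  # scan columns of T (i.e., rows of A)
        # Find a row at or below r with a 1 in column c.
        pr = next((i for i in range(r, n_cols) if T[i][c]), None)
        if pr is None:
            continue  # column c is dependent on earlier columns
        T[r], T[pr] = T[pr], T[r]
        # XOR the pivot row into every other row with a 1 in column c.
        for i in range(n_cols):
            if i != r and T[i][c]:
                T[i] = [a ^ b for a, b in zip(T[i], T[r])]
        pivots.append(c)
        r += 1
        if r == n_cols:
            break  # no more pivot rows available
    return pivots
```

For example, with rows $R_1=(1,0,1)$, $R_2=(0,1,1)$, $R_3=(1,1,0)$, we have $R_3 = R_1 \oplus R_2$, so `independent_rows` returns `[0, 1]`: only the first two rows are kept.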