Let's say we have a $2\times 3$, or more generally an $a\times b$, binary matrix. We want to count the number of such matrices that have no all-zero column.
I started by counting the total number of matrices: $2^{ab}$.
Then I tried counting the number of matrices that don't satisfy the condition.
The case in which the first column is all $0$ gives $2^{a(b-1)}$ matrices.
Then I multiplied by the number of columns, giving the formula $b \cdot 2^{a(b-1)}$.
It doesn't give me the correct answer, and I think it's a problem related to permutations. How do I approach this?
Each column in an $a\times b$ matrix has $a$ entries. There are $2^a$ distinct binary columns with $a$ entries, only one of which has all entries equal to $0$. It follows that there are $2^a-1$ valid choices for any given column.
Since the $b$ columns can be chosen independently, the number of binary matrices whose columns are all valid is
$$(2^a-1)^b.$$
This also shows why the complementary count in the question fails: a matrix whose first two columns are both all-zero is counted once for each of those columns, so multiplying $2^{a(b-1)}$ by $b$ double-counts it. Inclusion-exclusion repairs the overcount and, via the binomial theorem, $\sum_{k=0}^{b}(-1)^k\binom{b}{k}2^{a(b-k)} = (2^a-1)^b$ recovers the same answer.
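As a sanity check, a short brute-force enumeration for small $a$ and $b$ agrees with the formula. This is just an illustrative sketch; the function name and the flat-tuple encoding of a matrix are my own choices, not anything from the question.

```python
from itertools import product

def count_no_zero_column(a, b):
    """Count a x b binary matrices in which no column is all zeroes,
    by checking all 2^(a*b) matrices directly."""
    count = 0
    # Each matrix is a flat tuple; entry (i, j) lives at index i*b + j.
    for m in product((0, 1), repeat=a * b):
        # Valid if every column j contains at least one 1.
        if all(any(m[i * b + j] for i in range(a)) for j in range(b)):
            count += 1
    return count

# Compare the brute-force count with (2^a - 1)^b for small sizes.
for a in range(1, 4):
    for b in range(1, 4):
        assert count_no_zero_column(a, b) == (2**a - 1) ** b
print("formula verified for all a, b in 1..3")
```

For the $2\times 3$ case from the question this gives $(2^2-1)^3 = 27$ valid matrices out of $2^6 = 64$.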