Let $M$ be an $n\times n$ matrix of full rank. I am wondering whether there exists a permutation $\sigma$ such that the $\left(i,\sigma(i)\right)$-th minor of $M$ and the $\left(i,\sigma(i)\right)$-th entry of $M$ are both nonzero for each $i=1,2,\dots,n$.
My observations: For the first row, we can always find an index $j$ such that the $\left(1,j\right)$-th minor of $M$ and the $\left(1,j\right)$-th entry of $M$ are both nonzero; this follows from the Laplace expansion along the first row together with $\operatorname{det}(M)\neq0$, since a nonzero sum must contain a nonzero term. Moreover, the statement does hold for $n=2$. However, I have no idea how to prove or disprove the full statement. Any ideas would be much appreciated. Thank you!
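As a quick sanity check of the first-row observation (a sketch only, not part of the argument), one can sample random integer matrices, discard the singular ones, and verify that some column $j$ has both a nonzero $(1,j)$-th entry and a nonzero $(1,j)$-th minor. The helper names `det` and `minor_matrix` below are my own:

```python
import random

def minor_matrix(M, i, j):
    """Submatrix of M with row i and column j deleted."""
    return [[M[r][c] for c in range(len(M)) if c != j]
            for r in range(len(M)) if r != i]

def det(M):
    """Determinant via Laplace expansion along the first row (fine for small n)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det(minor_matrix(M, 0, j))
               for j in range(n))

# For each sampled invertible matrix, some column j must give a nonzero
# entry AND a nonzero minor in row 1, since det(M) = sum of their products.
random.seed(0)
for _ in range(100):
    M = [[random.randint(-2, 2) for _ in range(4)] for _ in range(4)]
    if det(M) == 0:
        continue  # skip singular samples
    assert any(M[0][j] != 0 and det(minor_matrix(M, 0, j)) != 0
               for j in range(4))
print("row-1 observation holds on all sampled invertible matrices")
```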
The claim is false, as the following counterexample shows:
$$ \begin{bmatrix} 1 & 0 & 1\\ 1 & 1 & 1\\ 0 & 1 & 1 \end{bmatrix}. $$
For the $1$st row, we must choose the $3$rd column, since the $(1,1)$-th minor and the $(1,2)$-th entry both vanish. However, for the $3$rd row, we must also choose the $3$rd column, since the $(3,1)$-th entry and the $(3,2)$-th minor both vanish.
Since $\sigma$ would have to send both $1$ and $3$ to $3$, contradicting injectivity, in this case there is no permutation $\sigma$ such that the $(i,\sigma(i))$-th minor and the $(i,\sigma(i))$-th entry are both nonzero for each $i=1,2,3$.
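One can also confirm this by brute force over all $3! = 6$ permutations (a sketch; the helper `minor` is my own name, and rows/columns are $0$-indexed):

```python
from itertools import permutations

def minor(M, i, j):
    """Determinant of the 2x2 submatrix of the 3x3 matrix M
    obtained by deleting row i and column j."""
    sub = [[M[r][c] for c in range(3) if c != j]
           for r in range(3) if r != i]
    return sub[0][0] * sub[1][1] - sub[0][1] * sub[1][0]

M = [[1, 0, 1],
     [1, 1, 1],
     [0, 1, 1]]

# Look for a permutation sigma with M[i][sigma(i)] != 0 and
# minor(i, sigma(i)) != 0 for every row i.
valid = [s for s in permutations(range(3))
         if all(M[i][s[i]] != 0 and minor(M, i, s[i]) != 0
                for i in range(3))]
print(valid)  # -> []: no such permutation exists
```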