Can the **Kronecker product** be expressed through ordinary matrix products?


I want to create the matrix $A_{n\times kn}$ for each $k\in \mathbb{N}$: $$A=\underbrace{[I_{n\times n},I_{n\times n},\dots,I_{n\times n}]}_{k\text{ times}}$$ A complete answer to my question was given in the comments: good MATLAB code by @Wauzl and the mathematical side by @A.P.
But a new question comes up here: is there a rigorous argument showing that the Kronecker product cannot be expressed in terms of the ordinary and the pointwise (entrywise) matrix products?

Best answer:

I believe that the best way to build that matrix in MATLAB is with the code snippet proposed by Wauzl:

```matlab
A = repmat(eye(n), 1, k)
```

You could also try the `kron` function:

```matlab
A = kron(ones(1,k), eye(n))
```

but I don't think this can be more efficient than `repmat`. I don't currently have access to MATLAB, though, so I can't test this.
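
In case someone wants to sanity-check that the two constructions coincide, here is a minimal sketch (assuming MATLAB or Octave; the concrete values `n = 3`, `k = 4` are just an illustration):

```matlab
% Minimal check: the repmat and kron constructions build the same n x (k*n) matrix.
n = 3; k = 4;
A1 = repmat(eye(n), 1, k);        % [I, I, ..., I], k copies side by side
A2 = kron(ones(1, k), eye(n));    % [1, ..., 1] Kronecker-multiplied with I
isequal(A1, A2)                    % logical 1 (true)
```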


As I mentioned in my comment above, mathematically you can easily define $$ A = \underbrace{[I_{n\times n},\dots,I_{n\times n}]}_{k \text{ times}} $$ as a Kronecker product $$ \underbrace{[1,\dotsc,1]}_{k \text{ times}} \otimes I_{n \times n}. $$
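
For a concrete instance, with $k = 2$ and $n = 2$ this reads $$ [1,1] \otimes I_{2 \times 2} = \begin{bmatrix} 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \end{bmatrix} = [I_{2 \times 2}, I_{2 \times 2}]. $$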


Note that there is no matrix that can be used to write $B \otimes \, -$ (for a fixed $h \times k$ matrix $B$) as a usual matrix product or, in other words, there is no matrix $C$ such that $$ B \otimes \, - = C \, - \quad \text{or} \quad B \otimes \, - = - \, C. $$ This is simply because $B \otimes \, -$ is defined for every possible matrix while, say, $C \, -$ is defined only for matrices with the same number of rows as the number of columns of $C$.
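
You can see this domain mismatch directly in code; here is a sketch (assuming MATLAB or Octave, with arbitrarily chosen sizes):

```matlab
% kron(B, -) accepts a matrix of any size, while multiplication by a
% fixed C constrains the size of its argument.
B  = [1 2; 3 4];       % fixed 2 x 2 matrix
M1 = rand(3, 5);
M2 = rand(7, 2);
size(kron(B, M1))      % 6 x 10  -- defined
size(kron(B, M2))      % 14 x 4  -- also defined
C  = rand(4, 3);       % any fixed matrix C
size(C * M1)           % fine: inner dimensions match (3 == 3)
% C * M2               % error: inner dimensions (3 vs 7) do not agree
```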

"But!", you may say, "What if I restrict $B \otimes \, -$ to matrices of a given size, say $n \times n$?"

Glad you asked! It still isn't possible unless $h = 1$ or $k = 1$. Indeed, suppose that, say, $B \otimes \, - = C \, -$ for some $x \times y$ matrix $C$. Now observe that if $M$ is an $n \times n$ matrix, then the size of $B \otimes M$ is $hn \times kn$. On the other hand $CM$ (which is defined only if $y = n$) has size $x \times n$, and $kn \neq n$ as soon as $k > 1$, so the two sides cannot even have the same shape. Similarly, $B \otimes \, - = - \, C$ is impossible as soon as $h > 1$.
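
To make the size count concrete: take $h = 2$, $k = 3$, $n = 2$. Then for every $2 \times 2$ matrix $M$, $$ B \otimes M \ \text{is}\ 4 \times 6, \qquad CM \ \text{is}\ x \times 2, \qquad MC \ \text{is}\ 2 \times y, $$ so neither $CM$ nor $MC$ can ever have the right shape.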

So... suppose that, say, $B = [b_1,\dotsc,b_k]$ is a $1 \times k$ vector. Then it is easy to see that $B \otimes \, - = - \, C$ with $$ C = \left[ b_1 I_{n \times n}, \dotsc, b_k I_{n \times n} \right]. $$ On the other hand, we can't really use this to write $A$ as a product of "more elementary" matrices, because for $B = [1,\dotsc,1]$ we have $C = A$.
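
As a quick numerical check of this identity, here is a sketch (assuming MATLAB or Octave, with arbitrary test data):

```matlab
% For a 1 x k row vector B, the map M |-> kron(B, M) agrees with
% right-multiplication by C = kron(B, eye(n)) = [b_1*I, ..., b_k*I].
n = 3; k = 4;
B = rand(1, k);
M = rand(n, n);
C = kron(B, eye(n));         % n x (k*n) block row of scaled identities
norm(kron(B, M) - M * C)     % 0 (the two sides coincide)
```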