Is there an alternative way to represent the $\operatorname{diag}$ function?


In optimization, it is common to see the so called $\operatorname{diag}$ function

Given a vector $x \in \mathbb{R}^n$, $\operatorname{diag}(x)$ is the $n \times n$ diagonal matrix with the components of $x$ on the diagonal.

For example:

Optimization that involves an inverse operation.

Reformulation of BQP to SDP

The reason for using $\operatorname{diag}$ is that it appears in several platforms such as MATLAB, and people generally understand what the function is supposed to do.

Is there a more linear-algebraic, step-by-step way of converting a vector $x \in \mathbb{R}^n$ into a diagonal matrix with its components on the diagonal, without having to define a function that directly performs the task?

i.e. given $x$, we find a series of functions/steps $f_2 \circ f_1 (x)$ which give us the same matrix as $\operatorname{diag}(x)$

There are 4 answers below.

BEST ANSWER

Using tensor or Kronecker product notation: if $e_i = \begin{pmatrix} 0 \\ \vdots \\ 1 \\ \vdots \\ 0 \end{pmatrix}$ is the $i$-th standard basis vector of $\mathbb{R}^{n \times 1}$ and $e^i = (0, \dots, 1, \dots, 0)$ is the $i$-th standard basis vector of $\mathbb{R}^{1 \times n}$, then we can represent $\operatorname{diag}(x_1, \dots, x_n)$ as

$$ \operatorname{diag}(x_1, \dots, x_n) = \sum_{i=1}^n x_i (e^i \otimes e_i). $$

This is of course the same as writing

$$ \operatorname{diag}(x_1, \dots, x_n) = \sum_{i=1}^n x_i e_{ii} $$

where $(e_{ij})_{i,j=1}^n$ is the basis of $M_n(\mathbb{R}) = \mathbb{R}^{n \times n}$ consisting of the matrices $e_{ij}$ which have a $1$ in the $i$-th row and $j$-th column and $0$ in all other places.
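A minimal NumPy sketch of this sum (the variable names and the choice of test vector are illustrative, not part of the answer): each term $x_i \, e_i e_i^T$ contributes the matrix unit $e_{ii}$ scaled by $x_i$, and the terms sum to $\operatorname{diag}(x)$.

```python
import numpy as np

# Build diag(x) as the sum over i of x_i * (e_i e_i^T),
# i.e. x_i times the matrix unit e_ii.
n = 4
x = np.array([3.0, -1.0, 0.5, 2.0])  # example vector (illustrative)

D = np.zeros((n, n))
for i in range(n):
    e_i = np.zeros(n)
    e_i[i] = 1.0
    D += x[i] * np.outer(e_i, e_i)  # outer product e_i e_i^T is e_ii

assert np.allclose(D, np.diag(x))
```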

ANSWER

The function $\operatorname{diag}$ is a linear operator, so you know there is some matrix that does what you're looking for.

Recall that to find the matrix of a linear operator $T$ with respect to the standard basis $e_1, e_2, \ldots, e_n$, we form the matrix:

$$\begin{bmatrix} T(e_1) & T(e_2) & \cdots & T(e_n) \end{bmatrix}$$

We can do the same thing here, though the "matrix" will have matrices as columns, since $T(e_i) = \operatorname{diag}(e_i)$ lives in $\mathbb{R}^{n \times n}$. Multiplying it by a vector on the right produces a single column, but each entry of that column is a matrix.

I could go through an example, but the matrix won't be particularly interesting.

Note that this is effectively the same answer as Alex_Wertheim's comment.
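One concrete way to realize this idea (a sketch, not the answerer's own construction) is to flatten the output: since $\operatorname{diag}$ is linear from $\mathbb{R}^n$ to $\mathbb{R}^{n \times n} \cong \mathbb{R}^{n^2}$, it has an ordinary $n^2 \times n$ matrix whose $i$-th column is $\operatorname{vec}(\operatorname{diag}(e_i))$.

```python
import numpy as np

# Matrix of the linear operator diag, after identifying R^{n x n} with R^{n^2}.
# Column i is the flattened (vectorized) matrix diag(e_i).
n = 3
M = np.zeros((n * n, n))
for i in range(n):
    e_i = np.zeros(n)
    e_i[i] = 1.0
    M[:, i] = np.diag(e_i).ravel()  # vec of T(e_i)

x = np.array([2.0, -1.0, 5.0])  # example vector (illustrative)
# Applying M and reshaping recovers diag(x).
assert np.allclose((M @ x).reshape(n, n), np.diag(x))
```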

ANSWER

Let $\delta_{ijk}$ be the tensor such that

$$\delta_{ijk}=\begin{cases}1&\text{if}\;i=j=k\\ 0&\text{otherwise}\\ \end{cases}$$

(in the particular basis you're working in).

Then

$$\mathrm{diag}(x)_{ij}=\sum_k\delta_{ijk}x_k$$

or in the summation convention just $\delta_{ijk}x_k$.
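A short NumPy sketch of this contraction (the tensor and vector here are illustrative): build the 3-index $\delta_{ijk}$ explicitly and contract over $k$ with `einsum`.

```python
import numpy as np

# delta[i, j, k] = 1 if i == j == k, else 0.
n = 4
delta = np.zeros((n, n, n))
for i in range(n):
    delta[i, i, i] = 1.0

x = np.arange(1.0, n + 1)             # example vector (illustrative)
D = np.einsum("ijk,k->ij", delta, x)  # diag(x)_{ij} = sum_k delta_{ijk} x_k
assert np.allclose(D, np.diag(x))
```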

ANSWER

There is a closed form.

$\operatorname{diag}(x)=I_n\circ (xu)$, where $\circ$ is the Hadamard (entrywise) product, $I_n$ is the identity matrix, and $u=[1,\cdots,1]$ is the all-ones row vector (with $x$ treated as a column vector, so $xu$ is $n \times n$).
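This closed form is easy to check numerically; a sketch (the test vector is illustrative):

```python
import numpy as np

# diag(x) = I_n ∘ (x u): the rank-one matrix x u has x in every column,
# and the Hadamard product with the identity keeps only the diagonal.
n = 4
x = np.array([3.0, 1.0, -2.0, 7.0])  # example vector (illustrative)
u = np.ones((1, n))                  # all-ones row vector
xu = x.reshape(n, 1) @ u             # outer product: x in every column
D = np.eye(n) * xu                   # Hadamard product with I_n
assert np.allclose(D, np.diag(x))
```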