I have a dataset with $m$ feature vectors, $x_i,\ i=1, 2, \dots, m$, each of which is assigned a partial label vector, $y_i\in\{0, 1\}^{l\times 1},\ i=1, 2, \dots, m$. If label $j$ is a candidate label for the sample $x_i$, then $y_{ij}=1$; otherwise $y_{ij}=0$. So a typical partial label vector contains both ones and zeros, like this one (for $l=4$): $y_i=[0, 1, 1, 0]^T$.
Suppose we define $R=[r_{ij}]\in\mathbb{R}^{m\times m}$, where:
$r_{ij}= \begin{cases} 1 & \text{if } y_i^T y_j = 0, \\ 0 & \text{otherwise,} \end{cases}$
and $L=\operatorname{diag}(R\mathbf{1}_m)-R$, where $\mathbf{1}_m$ is the $m$-dimensional all-ones vector.
Is $L$ a positive semidefinite (PSD) matrix?
As I couldn't prove it analytically, I generated many such matrices in MATLAB and none of them had negative eigenvalues, but I'm still not sure whether $L$ is always PSD.
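For reference, the numerical experiment described above can be reproduced in NumPy (my own sketch, not the original MATLAB code; the sizes $m$, $l$ and the random generation of candidate sets are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
m, l = 10, 4  # number of samples and number of labels (arbitrary)

# Each column of Y is a partial label vector y_i in {0,1}^l.
Y = (rng.random((l, m)) < 0.5).astype(int)

# r_ij = 1 iff the candidate sets of x_i and x_j are disjoint, i.e. y_i^T y_j = 0.
R = ((Y.T @ Y) == 0).astype(float)

# L = diag(R 1_m) - R
L = np.diag(R @ np.ones(m)) - R

# Smallest eigenvalue (eigvalsh, since L is symmetric);
# ~0 up to floating-point error if L is indeed PSD.
print(np.linalg.eigvalsh(L).min())
```

Note that $L\mathbf{1}_m=\mathbf{0}$ by construction, so $0$ is always an eigenvalue; the question is whether any eigenvalue can be strictly negative.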
Your $L$ is by definition a symmetric, (weakly) diagonally dominant matrix with a nonnegative diagonal: $L_{ii}=\sum_{j\neq i} r_{ij}=\sum_{j\neq i}|L_{ij}|\geq 0$. Since $L$ is symmetric, its eigenvalues are real, and by the Gershgorin disc theorem each eigenvalue lies in some interval $\left[L_{ii}-\sum_{j\neq i}|L_{ij}|,\ L_{ii}+\sum_{j\neq i}|L_{ij}|\right]=[0,\ 2L_{ii}]$. Hence all eigenvalues are nonnegative and $L$ is positive semidefinite. (Equivalently, $L$ is the graph Laplacian of the graph with adjacency matrix $R$, and graph Laplacians are always PSD, since $x^TLx=\tfrac12\sum_{i,j} r_{ij}(x_i-x_j)^2\geq 0$.)
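To make the diagonal dominance explicit, here is a quick NumPy check (my own sketch, using the same random construction as in the experiment from the question; sizes are arbitrary) confirming that each diagonal entry equals its off-diagonal absolute row sum, so every Gershgorin disc is $[0, 2L_{ii}]$:

```python
import numpy as np

rng = np.random.default_rng(1)
m, l = 12, 5  # arbitrary sizes
Y = (rng.random((l, m)) < 0.5).astype(int)   # columns are partial label vectors
R = ((Y.T @ Y) == 0).astype(float)           # r_ij = 1 iff y_i^T y_j = 0
L = np.diag(R.sum(axis=1)) - R               # L = diag(R 1_m) - R

diag = np.diag(L)
radii = np.abs(L).sum(axis=1) - np.abs(diag)  # Gershgorin radii

# Weak diagonal dominance: L_ii == sum_{j != i} |L_ij|, with L_ii >= 0,
# so each disc [L_ii - radius, L_ii + radius] equals [0, 2 L_ii].
assert np.allclose(diag, radii)
assert np.all(diag >= 0)
```

Since the assertions pass for any such $R$, every eigenvalue of the (real, symmetric) matrix $L$ lies in $[0, 2\max_i L_{ii}]$.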