Let $A \succeq B$ denote that the matrix $A-B$ is positive semidefinite, and here is the definition of redundant (all matrices are $N\times N$):
Given a set of matrices $\{B_i\}_{i=1}^{l}$, if there exist nonnegative constants $\{\alpha_i\}_{i=1}^{l}$ such that $$\sum_{i=1}^{l} \alpha_i = 1 \quad\text{and}\quad A \succeq \sum_{i=1}^{l}\alpha_i B_i,$$ then we say that $A$ is redundant.
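Before searching over $\alpha$, note that for a *fixed* candidate $\alpha$ the condition is easy to verify numerically: check that the smallest eigenvalue of $A - \sum_i \alpha_i B_i$ is (up to round-off) nonnegative. A sketch in NumPy with made-up matrices (the helper name `is_redundant_for` is mine, not part of the question):

```python
import numpy as np

def is_redundant_for(A, Bs, alphas, tol=1e-9):
    """Check A >= sum_i alphas[i] * Bs[i] in the PSD order, for a GIVEN alphas."""
    alphas = np.asarray(alphas, dtype=float)
    if np.any(alphas < -tol) or abs(alphas.sum() - 1.0) > tol:
        return False  # weights must be nonnegative and sum to 1
    M = A - sum(a * B for a, B in zip(alphas, Bs))
    M = (M + M.T) / 2  # symmetrize to guard against round-off
    # PSD iff the smallest eigenvalue is nonnegative (within tolerance).
    return bool(np.linalg.eigvalsh(M).min() >= -tol)

# Toy example: A = I dominates the average of B1 and B2.
A = np.eye(2)
B1 = np.diag([1.0, 0.0])
B2 = np.diag([0.0, 1.0])
print(is_redundant_for(A, [B1, B2], [0.5, 0.5]))  # True
```

This only certifies a single $\alpha$; the question below asks how to decide whether *any* valid $\alpha$ exists.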
Here is the question: given the set of matrices $\{B_i\}_{i=1}^{l}$, how can we determine whether an arbitrary $A$ is redundant?
Here is my incomplete solution:
Since the positive-semidefinite condition is hard to handle directly, I propose to solve the strict, positive definite version instead:
$$\sum_{i=1}^{l} \alpha_i = 1 \quad\text{and}\quad A \succ \sum_{i=1}^{l}\alpha_i B_i.$$
Then I use this characterization of positive definiteness (Sylvester's criterion): all leading principal minors are positive. Denoting the leading principal minors of $A$ by $\{a_j\}_{j=1}^{N}$ and those of $B_i$ by $\{b_{ij}\}_{j=1}^{N}$, I require
$$a_j > \sum_{i=1}^{l} \alpha_i b_{ij}, \quad \forall j.$$
(This step is only heuristic: Sylvester's criterion applies to the minors of $A - \sum_i \alpha_i B_i$, and determinants are not linear in $\alpha$, so those minors are not equal to $a_j - \sum_i \alpha_i b_{ij}$ in general.)
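For completeness, the leading principal minors $\{a_j\}$ and $\{b_{ij}\}$ are just determinants of the top-left $j \times j$ blocks; a small illustrative helper (the function name is mine):

```python
import numpy as np

def leading_principal_minors(M):
    """Return [det(M[:1,:1]), det(M[:2,:2]), ..., det(M)]."""
    return [np.linalg.det(M[:j, :j]) for j in range(1, M.shape[0] + 1)]

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(leading_principal_minors(A))  # approximately [2.0, 3.0]
```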
So the question can be recast as an optimization problem:
$$\text{minimize} \sum_{i=1}^{l}\alpha_i$$ $$\text{subject to } a_j > \sum_{i=1}^{l} \alpha_i b_{ij}, \ \forall j, \quad\text{and}\quad \alpha_i \geq 0, \ \forall i.$$
If the optimal value $\sum_{i=1}^{l}\alpha_i$ is less than 1, then $A$ is redundant.
Sketch of the idea: Without any relaxations, your problem can be conveniently rewritten as \begin{eqnarray} \underset{\alpha \in \mathbb{R}^l}{\text{maximize }}c^T\alpha \text{ subject to } F(\alpha) \succeq 0, \alpha \ge 0, \end{eqnarray} where $F(\alpha) := A - \sum_{1 \le i \le l}\alpha_iB_i \in \mathbb{R}^{N \times N}$, and $c \in \mathbb{R}^{l \times 1}$ is a column vector of $1$'s.
We immediately recognize the above problem as a semidefinite program (SDP): a linear objective subject to a linear matrix inequality and nonnegativity constraints. Such problems have been studied extensively in the literature, and standard SDP solvers handle them directly.
If one additionally imposes the linear cap $c^T\alpha \le 1$ (the problem remains an SDP), then $A$ is "redundant" exactly when the maximal value is $c^T\alpha^* = 1$. Without the cap, $c^T\alpha^* \ge 1$ alone is not sufficient: for $l=1$, $B_1 = -I$, $A = -2I$, the objective is unbounded above, yet $A \not\succeq B_1$.