Assume I have a set of matrix-vector equations of the following form:
$$\begin{bmatrix} x_{i,w} \\ y_{i,w} \\ z_{i,w} \end{bmatrix} = a \left(\mathbf{^{w}T_{c}}\right)^{-1}\left(\mathbf{K}\right)^{-1} \begin{bmatrix} u_{i} \\ v_{i} \\ 1 \end{bmatrix}$$
where $0 < i \leq n$, $a$ is a positive constant, $\mathbf{^{w}T_{c}}^{-1}$ is a $4 \times 4$ matrix, and $\mathbf{K}$ is a $4 \times 3$ matrix.
For each vector $\begin{bmatrix} x_{i,w} & y_{i,w} & z_{i,w} \end{bmatrix}^\top$, I have a scalar-valued function $f_i$:
$$ f_i\left(\begin{bmatrix} x_{i,w} \\ y_{i,w} \\ z_{i,w} \end{bmatrix}\right) = s_i$$
The sum of all the function values is a scalar:
$$ S = \sum_{i = 1}^{n} s_i $$
I would like to compute each of the scalar derivatives $\frac{\partial s_i}{\partial \mathbf{^{w}T_{c}}^{-1}}$ with respect to the matrix $\mathbf{^{w}T_{c}}^{-1}$, as well as the total derivative $\frac{\partial S}{\partial \mathbf{^{w}T_{c}}^{-1}}$.
I think this requires the chain rule, and I believe the derivative should have the same shape as $\mathbf{^{w} T_{c}}^{-1}$, but I am not sure.
$ \def\a{\alpha} \def\BR#1{\Big(#1\Big)} \def\LR#1{\left(#1\right)} \def\op#1{\operatorname{#1}} \def\trace#1{\op{Tr}\LR{#1}} \def\frob#1{\left\| #1 \right\|_F} \def\qiq{\quad\implies\quad} \def\p{\partial} \def\grad#1#2{\frac{\p #1}{\p #2}} \def\c#1{\color{red}{#1}} \def\CLR#1{\c{\LR{#1}}} \def\gradLR#1#2{\LR{\grad{#1}{#2}}} \def\m#1{\left[\begin{array}{c}#1\end{array}\right]} $Here is a partial answer.
For ease of typing, I'll drop your elaborate super/subscripts and define the variables $$\eqalign{ Y = {\bf{^{w}T_{c}}}^{-1} \qquad \a = a \qquad b = K^{-1}\m{u_i\\v_i\\{\tt1}} \qquad c = \m{x_{i,w}\\y_{i,w}\\z_{i,w}} = \a\,Yb \\ }$$ Concentrate on a single component function $\BR{\phi(c) = f_i(c)}.\,$ Calculate its differential, change the independent variable from $c\to Y,\;$ then recover the desired gradient $$\eqalign{ \p \phi &= \gradLR{\phi}{c}:\p c \\ &= \gradLR{\phi}{c}:\LR{\a\;\p Y\,b} \\ &= \a\gradLR{\phi}{c}b^T:\p Y \\ \c{\grad{\phi}{Y}} &\c{= \a\gradLR{\phi}{c}b^T} \\ }$$ where $(:)$ denotes the Frobenius product, which is a concise notation for the trace $$\eqalign{ A:B &= \sum_{i=1}^m\sum_{j=1}^n A_{ij}B_{ij} \;=\; \trace{A^TB} \\ A:A &= \frob{A}^2 \qquad \{ {\rm Frobenius\;norm} \}\\ }$$ Since $\gradLR{\phi}{c}$ is a column vector and $b^T$ is a row vector, the outer product $\grad{\phi}{Y}$ has the same shape as $Y$, confirming your guess. The gradient of $S$ is simply the sum of these per-$i$ gradients.
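The formula $\grad{\phi}{Y} = \a\gradLR{\phi}{c}b^T$ is easy to verify numerically. Below is a minimal sketch in NumPy: the dimensions ($3\times 3$ for $Y$) and the test function $\phi(c) = \|c\|^2$ are illustrative stand-ins, not taken from your setup, and the analytic gradient is compared against central finite differences over the entries of $Y$.

```python
import numpy as np

# Numerical check of d(phi)/dY = alpha * (d(phi)/dc) b^T.
# Y is 3x3 for illustration; phi(c) = ||c||^2 is a stand-in for f_i,
# so d(phi)/dc = 2c.
rng = np.random.default_rng(0)
alpha = 2.0
Y = rng.standard_normal((3, 3))
b = rng.standard_normal(3)

def phi_of_Y(Y):
    c = alpha * Y @ b        # c = alpha * Y b
    return float(c @ c)      # phi(c) = ||c||^2

# Analytic gradient: alpha * (d(phi)/dc) b^T -- an outer product,
# hence the same shape as Y.
c = alpha * Y @ b
grad_analytic = alpha * np.outer(2 * c, b)

# Central finite differences over each entry of Y.
eps = 1e-6
grad_fd = np.zeros_like(Y)
for i in range(Y.shape[0]):
    for j in range(Y.shape[1]):
        Yp, Ym = Y.copy(), Y.copy()
        Yp[i, j] += eps
        Ym[i, j] -= eps
        grad_fd[i, j] = (phi_of_Y(Yp) - phi_of_Y(Ym)) / (2 * eps)

assert np.allclose(grad_analytic, grad_fd, atol=1e-4)
```

Swapping in your actual $f_i$ (and its gradient with respect to $c$) and summing over $i$ gives the gradient of $S$.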