Background
Consider the linear regression model:
$$y = X\beta + e, \qquad E[e] = 0, \qquad E[ee^T] = V$$
It is well known that the minimum variance affine unbiased estimator (MVAUE) of $\beta$ exists if and only if $X$ has linearly independent columns. In this case, the MVAUE is unique and given by
$$\hat\beta = My = \arg \min_\beta \, (y - X\beta)^T V_0^+ (y - X\beta)$$
where
$$ \begin{align} M &:= (X^T V_0^+ X)^{-1} X^T V_0^+ \\ V_0 &:= V + XUX^T \end{align} $$
and $U$ is any positive semidefinite matrix such that $\mathrm{col}\, X \subseteq \mathrm{col}\, V_0$. The superscript "$+$" denotes the Moore-Penrose pseudoinverse.
Chapter 4 (in particular, section i) of C. R. Rao's "Linear Statistical Inference and its Applications" and Chapter 13 of Magnus and Neudecker's "Matrix Differential Calculus with Applications in Statistics and Econometrics" are two good references on the subject, for the curious.
My question
I want to demonstrate that the matrix
$$M = [X^T (V + XUX^T)^+ X]^{-1} X^T (V + XUX^T)^+$$
is independent of the particular choice of $U$, provided that $X$ has linearly independent columns and that $U$ satisfies the aforementioned conditions (though I would be happy with a solution that strengthened the assumption on $U$ to $U > 0$).
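For what it's worth, the claim is easy to probe numerically before proving it. The sketch below (NumPy; the helper name `M_of` is my own, not from the references) builds a singular $V \geq 0$ whose column space together with $\mathrm{col}\,X$ does not span all of $\mathbb{R}^n$, so that $V_0 = V + XUX^T$ is singular even for $U > 0$, and compares the resulting $M$ for two different choices of $U$:

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 5, 2
X = rng.standard_normal((n, p))          # linearly independent columns (a.s.)
A = rng.standard_normal((n, 2))
V = A @ A.T                              # PSD, rank 2, so V0 = V + X U X^T is singular

def M_of(U):
    """Compute M = [X^T V0^+ X]^{-1} X^T V0^+ with V0 = V + X U X^T."""
    V0p = np.linalg.pinv(V + X @ U @ X.T)
    return np.linalg.solve(X.T @ V0p @ X, X.T @ V0p)

B = rng.standard_normal((p, p))
U1 = np.eye(p)
U2 = B @ B.T + np.eye(p)                 # a second, different U > 0

M1, M2 = M_of(U1), M_of(U2)
print(np.allclose(M1, M2))
```

In experiments like this the two matrices agree to machine precision, which is at least consistent with the claim.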
I have solved the problem in two subcases, $V = 0$ and $V > 0$, and I give these solutions below. A proof for the case where $V \geq 0$ is singular but nonzero has eluded me.
Partial solution: Assume $V = 0$ and $U > 0$
We will show that $M = X^+$. First note that $MX = I$ directly from the definition of $M$; therefore $XMX = X$ and $MXM = M$. Also, since $X$ has linearly independent columns, $X^+X = I$, which is used below. Now observe
$$ \begin{align} X^T(XUX^T)^+X &= (U^{-1}X^+XU) X^T(XUX^T)^+X (U X^T X^{+T} U^{-1}) \\ &= (U^{-1}X^+) (XU X^T) (XUX^T)^+ (X U X^T) (X^{+T} U^{-1}) \\ &= (U^{-1}X^+) (XU X^T) (X^{+T} U^{-1}) \\ &= U^{-1} U U^{-1} \\ &= U^{-1} \end{align} $$
Therefore
$$ \begin{align} XM &= X [X^T (XUX^T)^+ X]^{-1} X^T (XUX^T)^+ \\ &= XUX^T (XUX^T)^+ \end{align} $$
is symmetric, since $PP^+$ is symmetric for any matrix $P$. We have demonstrated that $M$ satisfies all four Penrose conditions required to be the Moore-Penrose inverse of $X$.
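Both the intermediate identity $X^T(XUX^T)^+X = U^{-1}$ and the conclusion $M = X^+$ are easy to spot-check numerically; a minimal NumPy sketch on a random instance (my own setup, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(1)

n, p = 6, 3
X = rng.standard_normal((n, p))          # linearly independent columns (a.s.)
B = rng.standard_normal((p, p))
U = B @ B.T + np.eye(p)                  # arbitrary U > 0

V0p = np.linalg.pinv(X @ U @ X.T)        # V = 0, so V0 = X U X^T
M = np.linalg.solve(X.T @ V0p @ X, X.T @ V0p)

print(np.allclose(X.T @ V0p @ X, np.linalg.inv(U)))  # the identity derived above
print(np.allclose(M, np.linalg.pinv(X)))             # M = X^+
```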
Partial solution: Assume $V > 0$ and $U > 0$
Applying the Woodbury matrix identity gives
$$ \begin{align} (V + XUX^T)^{-1} &= V^{-1} - V^{-1}X(X^TV^{-1}X + U^{-1})^{-1}X^TV^{-1} \\ (V + XUX^T)^{-1}X &= V^{-1}X(X^TV^{-1}X + U^{-1})^{-1}U^{-1} \\ X^T(V + XUX^T)^{-1}X &= X^TV^{-1}X(X^TV^{-1}X + U^{-1})^{-1}U^{-1} \\ [X^T(V + XUX^T)^{-1}X]^{-1} &= U + (X^TV^{-1}X)^{-1} \end{align} $$
Combining these results (transpose the second identity to obtain $X^T(V + XUX^T)^{-1} = U^{-1}(X^TV^{-1}X + U^{-1})^{-1}X^TV^{-1}$, left-multiply it by the fourth, and note that with $S := X^TV^{-1}X$ the product $(U + S^{-1})\,U^{-1}(S + U^{-1})^{-1}$ collapses to $S^{-1}$), one can derive
$$M = (X^TV^{-1}X)^{-1}X^TV^{-1}$$
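A quick numerical check that $M$ indeed reduces to the usual GLS matrix when $V > 0$ (a NumPy sketch on a random instance):

```python
import numpy as np

rng = np.random.default_rng(2)

n, p = 6, 3
X = rng.standard_normal((n, p))
A = rng.standard_normal((n, n))
V = A @ A.T + np.eye(n)                  # V > 0
B = rng.standard_normal((p, p))
U = B @ B.T + np.eye(p)                  # U > 0

Wi = np.linalg.inv(V + X @ U @ X.T)
M = np.linalg.solve(X.T @ Wi @ X, X.T @ Wi)

Vi = np.linalg.inv(V)
M_gls = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi)
print(np.allclose(M, M_gls))             # M is the usual GLS matrix, independent of U
```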
Update
For $z \ge 0$, let $$f(z, U) = [X^T(V + zI + XUX^T)^{+}X]^{-1} X^T (V + zI + XUX^T)^{+}.$$
Clearly, for $U \ge 0$, $f(z, U)$ is continuous in $z$ on $(0, \infty)$.
If $U$ is not positive definite, $f(z, U)$ need not be right continuous at $z = 0$; counterexamples are easy to construct.
I conjecture that for $U > 0$, $f(z, U)$ is right continuous at $z = 0$; simulations support this. Interestingly, in those simulations $(V + zI + XUX^T)^{+}$ itself is never right continuous at $z = 0$ (when $V + XUX^T$ is chosen to be singular).
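A minimal version of such a simulation (my own NumPy sketch; the instance is chosen so that $V + XUX^T$ is singular despite $U > 0$):

```python
import numpy as np

rng = np.random.default_rng(3)

n, p = 5, 2
X = rng.standard_normal((n, p))
A = rng.standard_normal((n, 2))
V = A @ A.T                      # rank 2; col V + col X generically has rank 4 < 5,
U = np.eye(p)                    # so V0 = V + X U X^T is singular even though U > 0

def f(z):
    """f(z, U) as defined above, with the pseudoinverse."""
    Wz = np.linalg.pinv(V + z * np.eye(n) + X @ U @ X.T)
    return np.linalg.solve(X.T @ Wz @ X, X.T @ Wz)

f0 = f(0.0)
for z in [1e-2, 1e-4, 1e-6]:
    Wz_norm = np.linalg.norm(np.linalg.pinv(V + z * np.eye(n) + X @ U @ X.T))
    print(z, np.linalg.norm(f(z) - f0), Wz_norm)
# ||f(z) - f(0)|| shrinks toward 0 while ||(V0 + zI)^+|| blows up like 1/z
```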
Previously written
We can do a little bit more.
First, we give the following auxiliary result. The proof is given later.
Fact 1: Let $A > 0$ and $C\ge 0$. Assume that $B$ has linearly independent columns. Then we have $$[B^T(A + BCB^T)^{-1}B]^{-1} B^T (A + BCB^T)^{-1} = (B^T A^{-1} B)^{-1} B^T A^{-1}.$$
Now, let us prove our main result.
Lemma 1: For any $U$ such that $V + XUX^T > 0$, $M$ is independent of $U$.
Proof of Lemma 1: For $z \ge 0$, let $$f(z, U) = [X^T(V + zI + XUX^T)^{-1}X]^{-1} X^T (V + zI + XUX^T)^{-1}.$$
It suffices to prove that for any $U_1$ and $U_2$ such that $V + XU_1X^T > 0$ and $V + XU_2X^T > 0$, we have $f(0, U_1) = f(0, U_2)$.
First, from Fact 1, we have $f(z, U_1) = f(z, U_2)$ for any $z > 0$. Also, $f(z, U)$ is continuous for $z\ge 0$. Thus, we have $f(0, U_1) = f(0, U_2)$. This completes the proof of Lemma 1.
Proof of Fact 1:
Using the Woodbury matrix identity $(P + QR)^{-1} = P^{-1} - P^{-1}Q(I + RP^{-1}Q)^{-1}RP^{-1}$, we have $$(A + BCB^T)^{-1} = A^{-1} - A^{-1}BC(I + B^TA^{-1}BC)^{-1} B^T A^{-1}$$ and \begin{align} B^T (A + BCB^T)^{-1} &= B^TA^{-1} - B^TA^{-1}BC(I + B^TA^{-1}BC)^{-1} B^T A^{-1}\\ &= (I + B^TA^{-1}BC)^{-1} B^T A^{-1} \end{align} and $$B^T(A + BCB^T)^{-1}B = (I + B^TA^{-1}BC)^{-1}B^T A^{-1} B.$$ Thus, we have \begin{align} &[B^T(A + BCB^T)^{-1}B]^{-1} B^T (A + BCB^T)^{-1}\\ =\ & (B^T A^{-1} B)^{-1}(I + B^TA^{-1}BC)(I + B^TA^{-1}BC)^{-1} B^T A^{-1}\\ =\ & (B^T A^{-1} B)^{-1} B^T A^{-1} . \end{align} This completes the proof of Fact 1.
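Fact 1 is also easy to confirm numerically, including for a singular $C \geq 0$ (a NumPy sketch on a random instance):

```python
import numpy as np

rng = np.random.default_rng(4)

n, p = 6, 3
B = rng.standard_normal((n, p))          # linearly independent columns (a.s.)
A0 = rng.standard_normal((n, n))
A = A0 @ A0.T + np.eye(n)                # A > 0
c = rng.standard_normal((p, 1))
C = c @ c.T                              # C >= 0, rank 1 (singular)

Wi = np.linalg.inv(A + B @ C @ B.T)
lhs = np.linalg.solve(B.T @ Wi @ B, B.T @ Wi)

Ai = np.linalg.inv(A)
rhs = np.linalg.solve(B.T @ Ai @ B, B.T @ Ai)
print(np.allclose(lhs, rhs))             # the two sides of Fact 1 agree
```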