There is an example in a textbook (given without any explanation) on finding $\lambda$, and I am struggling to see how to explicitly find $\rho$ and $\lambda$ such that $\lambda' \beta = \rho'X\beta$, i.e., such that $\lambda'\beta$ is an estimable function.
In the textbook: "for an estimable function, $X\beta$ can be thought of as a vector of inner products between $\beta$ and the spanning set for $C(X')$. In particular, when $\lambda$ is a $p \times 1$ vector, $\lambda \in C(X')$, i.e., $\lambda = X'\rho$, or more generally, $$\Lambda' \beta = P' X \beta$$ for some matrix $P$. An important property of estimable functions is that $P$ need not be unique."
Example from the textbook (simple linear regression): $y_i = \beta_0 + \beta_1 x_i + e_i$, $i = 1, \ldots, 6$, where $e_i \sim N(0, \sigma^2)$. In matrix form: $$\begin{bmatrix} y_1 \\ \vdots \\y_6 \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \\ 1 & 5 \\ 1 & 6 \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix} + \begin{bmatrix} e_1 \\ \vdots \\e_6 \end{bmatrix}$$ so that $$\frac{1}{35}\begin{pmatrix} -5, & -3, & -1,& 1,& 3, & 5 \end{pmatrix} \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \\ 1 & 5 \\ 1 & 6 \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix} = \begin{pmatrix} 0,1 \end{pmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix}= \beta_1$$
and
$$\frac{1}{30}\begin{pmatrix} 20, & 14, & 8,& 2,& -4, & -10 \end{pmatrix} \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \\ 1 & 5 \\ 1 & 6 \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix} = \beta_0$$
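As a quick sanity check (my own verification, not from the textbook), the two products $\rho'X$ above can be confirmed with exact arithmetic:

```python
from fractions import Fraction as F

# Design matrix from the example: intercept column and x_i = 1, ..., 6.
X = [[1, x] for x in range(1, 7)]

# The two rho vectors given in the textbook, kept exact with Fraction.
rho1 = [F(c, 35) for c in (-5, -3, -1, 1, 3, 5)]    # should pick out beta_1
rho2 = [F(c, 30) for c in (20, 14, 8, 2, -4, -10)]  # should pick out beta_0

def rhoT_X(rho, X):
    """Compute the row vector rho' X."""
    return [sum(r * row[j] for r, row in zip(rho, X)) for j in range(2)]

print(rhoT_X(rho1, X))  # equals (0, 1), so rho1' X beta = beta_1
print(rhoT_X(rho2, X))  # equals (1, 0), so rho2' X beta = beta_0
```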
Apparently $\rho' = \frac{1}{35}\begin{pmatrix} -5, & -3, & -1,& 1,& 3, & 5 \end{pmatrix} $ and $\lambda' = \begin{pmatrix} 0, 1 \end{pmatrix}$ in the first case, while $\rho' = \frac{1}{30}\begin{pmatrix} 20, & 14, & 8,& 2,& -4, & -10 \end{pmatrix}$ and $\lambda' = \begin{pmatrix} 1, 0 \end{pmatrix}$ in the second case. How can one explicitly find $\rho$ (or $P$), given that $\rho$ (or $P$) is not unique?
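One approach I have tried (my own attempt, not necessarily what the textbook intends): since the condition is $X'\rho = \lambda$, and here $X$ has full column rank so $X'X$ is invertible, a particular solution is $\rho = X(X'X)^{-1}\lambda$, because then $\rho'X\beta = \lambda'(X'X)^{-1}X'X\beta = \lambda'\beta$. A sketch in exact arithmetic:

```python
from fractions import Fraction as F

# Design matrix from the example, with exact entries.
X = [[F(1), F(x)] for x in range(1, 7)]

# Entries of X'X (a 2x2 matrix) and its inverse via the closed-form formula.
a = sum(r[0] * r[0] for r in X)   # 6
b = sum(r[0] * r[1] for r in X)   # 21
d = sum(r[1] * r[1] for r in X)   # 91
det = a * d - b * b               # 105
inv = [[d / det, -b / det], [-b / det, a / det]]  # (X'X)^{-1}

def rho_for(lam):
    """One (non-unique) solution rho = X (X'X)^{-1} lambda of X' rho = lambda."""
    v = [inv[0][0] * lam[0] + inv[0][1] * lam[1],
         inv[1][0] * lam[0] + inv[1][1] * lam[1]]
    return [r[0] * v[0] + r[1] * v[1] for r in X]

print(rho_for([F(0), F(1)]))  # equals (-5, -3, -1, 1, 3, 5) / 35
print(rho_for([F(1), F(0)]))  # equals (20, 14, 8, 2, -4, -10) / 30
```

This reproduces exactly the two $\rho$ vectors in the textbook example, but I do not know whether this is the intended construction, or how the non-uniqueness is meant to be handled in general.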