The ordinary least squares (OLS) estimate for the model $X_{n \times m}\vec{\beta}=\vec{y}$,
$$\vec{\beta}_{OLS}=(X^TX)^{-1}X^T\vec{y}$$
is an unbiased estimator of $\vec{\beta}$, and by the Gauss–Markov theorem it has the smallest variance among all linear unbiased estimators. Meanwhile, principal component regression (PCR) gives a biased estimator
$$\vec{\beta}_{PCR}=\Gamma_k \Gamma^T_k \vec{\beta}_{OLS}$$
where $\Gamma_k$ is the matrix whose columns are the first $k$ eigenvectors of $X^TX$ (ordered by decreasing eigenvalue). So $\vec{\beta}_{PCR}$ is simply the orthogonal projection of $\vec{\beta}_{OLS}$ onto the subspace spanned by the first $k$ eigenvectors of $X^TX$.
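For concreteness, here is a minimal numerical sketch of the two estimators on simulated data (the data, dimensions $n$, $m$, $k$, and the true $\vec{\beta}$ are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated design: n = 200 observations, m = 3 predictors
n, m, k = 200, 3, 2
X = rng.normal(size=(n, m))
beta_true = np.array([1.0, 2.0, 3.0])
y = X @ beta_true + rng.normal(size=n)

# OLS estimate: (X^T X)^{-1} X^T y, computed via a linear solve
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Eigenvectors of X^T X, sorted by decreasing eigenvalue
eigvals, eigvecs = np.linalg.eigh(X.T @ X)
order = np.argsort(eigvals)[::-1]
Gamma_k = eigvecs[:, order[:k]]  # first k eigenvectors as columns

# PCR estimate: orthogonal projection of beta_ols onto span(Gamma_k)
beta_pcr = Gamma_k @ Gamma_k.T @ beta_ols

print("OLS:", beta_ols)
print("PCR:", beta_pcr)
```

Note that $\Gamma_k \Gamma_k^T$ is indeed an orthogonal projection matrix (symmetric and idempotent), so the PCR coefficients are exactly the OLS coefficients with the components along the discarded eigenvectors zeroed out.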
Can anyone explain how a projection leads to bias in the estimate of $\vec{\beta}$?