I am trying to assess the bias of a regression model when the true model, $y = X\beta + u$, is estimated with the true $X$ replaced by $\widetilde{X} = X + \epsilon$, where $\epsilon$ is a vector of measurement errors. I'm working under the assumptions that $\epsilon_1 = 0$, that $\epsilon_j$ is uncorrelated with $u$, and that $E[\epsilon \mid \widetilde{X}] = \mu$.
I've gotten as far as deriving the full expected value formula for $\widehat\beta$: $$ \begin{align*} E[\widehat\beta \mid X] &= E[(\widetilde{X}'\widetilde{X})^{-1}\widetilde{X}'y \mid X] \\ &= E[((X+\epsilon)'(X+\epsilon))^{-1} (X+\epsilon)'(X\beta+u) \mid X] \\ &= E[(X'X+X'\epsilon + \epsilon'X + \epsilon'\epsilon)^{-1}(X'X\beta + X'u + \epsilon'X\beta + \epsilon'u) \mid X] \end{align*} $$
But I'm not sure how to separate the $\epsilon$ out into its components $\epsilon_1$ through $\epsilon_j$ from here.
This is called "classical errors-in-variables," and proofs for the single-variable version appear on Wikipedia and here. I think you are making it harder than it has to be, and you're also missing some key points in the OLS proof.
As for the method, yours is solid as far as it goes, but then it peters out. A better way to continue: instead of substituting for $\widetilde X$ to get everything in terms of $X$, substitute for $X$ to get everything in terms of $\widetilde X$. So $y = X\beta + u = \widetilde X \beta - \epsilon \beta + u$.
Then $$ \begin{align*} E(\widehat \beta \mid \widetilde X) &= E\left[(\widetilde X' \widetilde X)^{-1} \widetilde X' y \, \middle| \, \widetilde X\right] \\ &= E\left[(\widetilde X' \widetilde X)^{-1} \widetilde X' (\widetilde X \beta - \epsilon \beta + u) \, \middle| \, \widetilde X\right] \\ &= E\left[\beta + \frac{\widetilde X' u}{\widetilde X' \widetilde X} - \frac{\widetilde X'\epsilon}{\widetilde X' \widetilde X} \beta \, \middle| \, \widetilde X\right] \\ &= \beta\left[1 - \frac{\widetilde X' \mu}{\widetilde X' \widetilde X}\right] \\ \implies E(\widehat\beta) &= \beta E\left[1 - \frac{\widetilde X' \mu}{\widetilde X' \widetilde X}\right] \\ \implies \text{Bias}(\widehat\beta) &= -\beta E\left[\frac{\widetilde X' \mu}{\widetilde X' \widetilde X}\right] \end{align*} $$ where the fourth line uses $E[u \mid \widetilde X] = 0$ and $E[\epsilon \mid \widetilde X] = \mu$.
The second-to-last line follows from the law of total expectation: $E(\widehat\beta) = E_{\widetilde X}\left[ E(\widehat \beta \mid \widetilde X)\right]$.
Notice this bias looks like an omitted variable bias. In fact, another way to frame the problem is that we're omitting $-\epsilon$ from the model specification, even though it appears in our observations.
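You can sanity-check the bias formula with a quick Monte Carlo sketch. Everything below is illustrative (the distributions, $\beta = 2$, $\mu = 0.5$, etc. are my choices, not from the problem): to make the assumption $E[\epsilon \mid \widetilde X] = \mu$ hold exactly, I draw $\widetilde X$ first, draw $\epsilon$ independently of $\widetilde X$ with mean $\mu$, and set $X = \widetilde X - \epsilon$. The empirical bias of OLS should then track $-\beta\, E\left[\widetilde X'\mu / \widetilde X'\widetilde X\right]$:

```python
import numpy as np

rng = np.random.default_rng(0)
beta, mu = 2.0, 0.5
n, reps = 200, 5000

bias_mc = np.empty(reps)   # empirical bias of OLS in each replication
pred = np.empty(reps)      # -beta * X~'mu / X~'X~ in each replication
for r in range(reps):
    xt = rng.normal(3.0, 1.0, n)       # observed regressor X~ (no intercept)
    eps = rng.normal(mu, 0.4, n)       # error independent of X~, mean mu
    x = xt - eps                       # true regressor X = X~ - eps
    u = rng.normal(0.0, 1.0, n)
    y = x * beta + u                   # true model
    bhat = (xt @ y) / (xt @ xt)        # OLS of y on the mismeasured X~
    bias_mc[r] = bhat - beta
    pred[r] = -beta * (mu * xt.sum()) / (xt @ xt)

print(bias_mc.mean(), pred.mean())     # the two should agree closely
```

The two averages match up to Monte Carlo noise, which is the formula $\text{Bias}(\widehat\beta) = -\beta E\left[\widetilde X'\mu / \widetilde X'\widetilde X\right]$ in action for this scalar setup.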
A good way to practice: what if the error were on the observed dependent variable instead (so $y = y^* + \epsilon$, where $y^*$ is the true value of $y$)?
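If you want to check your answer to that exercise numerically, here is a minimal sketch for the mean-zero case (again, all numbers are illustrative assumptions, and I'm using a mean-zero error on $y$ rather than the general $\mu$):

```python
import numpy as np

rng = np.random.default_rng(1)
beta, n, reps = 2.0, 200, 5000

bhats = np.empty(reps)
for r in range(reps):
    x = rng.normal(3.0, 1.0, n)            # regressor measured correctly
    u = rng.normal(0.0, 1.0, n)
    y_star = x * beta + u                  # true dependent variable
    y = y_star + rng.normal(0.0, 0.8, n)   # mean-zero error on y only
    bhats[r] = (x @ y) / (x @ x)           # OLS of observed y on x

print(bhats.mean())                        # close to beta
```

With mean-zero error in $y$, the error just gets absorbed into the regression disturbance, so the slope estimate stays centered on $\beta$; only its variance inflates.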