Let $\varepsilon \in \left\{-1,1\right\}^n$ and $\mu \in \mathbb{R}^n$. For any $\eta \in \left(0,+\infty\right)^n$, let $$x^*(\eta)=\arg \min_{x \in \left(\mathbb{R}_+\right)^n} \sum_{k=1}^{n} \eta_k\left(\mu_k-\sum_{i \leq k}x_i\varepsilon_i\right)^2.$$ (I have already shown that the minimum exists and is unique, since the objective is a strictly convex quadratic in $x$.)
I'd like to prove that the function $x^*$ is continuous on $\left(0,+\infty\right)^n$.
Let me give some context, as it might be helpful to answer my question. For every $1 \leq k \leq n$, let $\lambda_k(\eta)=\sum_{i \leq k} x^*_i(\eta)\varepsilon_i.$ Then $\lambda(\eta)$ is the closest approximation of $\mu$ (in the weighted least-squares sense defined above) under the constraint that $\lambda_k-\lambda_{k-1}$ has sign $\varepsilon_k$ (or vanishes) for every $k$, with $\lambda_0=0$. Essentially, this constraint means that the variations of $\lambda$ are fixed.
Let me give an example. Let $\mu=\left(1, 2, 4, 2, 3\right)$, $\eta= \left(0.5, 0.25, 1, 0.25, 1\right)$, and $\varepsilon=\left(1, -1, 1, -1, -1\right).$
Here is a plot of $\mu$ and the best approximation $\lambda$ defined above:
$\varepsilon$ encodes the variations of $\lambda$: for example, $\varepsilon_2=-1$ forces $\lambda_2 \leq \lambda_1$.
Since the variations of $\lambda$ are fixed, it seems to me that slightly changing the weights $\eta$ should not change $\lambda\left(\eta\right)$ very much, which would mean that $\lambda$, and therefore $x^*$, is continuous. But I'm unable to prove it.
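This is of course not a proof, but the intuition can at least be checked numerically. Below is a minimal sketch (my own ad-hoc scheme, not taken from anywhere above) that minimizes the objective by projected gradient descent over $x \ge 0$, recovers $\lambda$ for the example above, and then perturbs $\eta$ slightly to see how much $\lambda$ moves; the step size and iteration count are arbitrary choices.

```python
# Projected gradient descent on f(x) = sum_k eta_k (mu_k - sum_{i<=k} x_i eps_i)^2
# over the constraint set x >= 0 (ad-hoc numerical sketch, not a proof).

def x_star(mu, eta, eps, iters=20000):
    n = len(mu)
    # The gradient is Lipschitz with constant at most trace(Hessian)
    # = 2 * sum_j s_j, where s_j = sum_{k >= j} eta_k; use step = 1/L.
    s = [sum(eta[j:]) for j in range(n)]
    step = 1.0 / (2.0 * sum(s))
    x = [0.0] * n
    for _ in range(iters):
        lam, acc = [], 0.0
        for xi, ei in zip(x, eps):       # lambda_k = sum_{i<=k} x_i eps_i
            acc += xi * ei
            lam.append(acc)
        r = [ek * (mk - lk) for ek, mk, lk in zip(eta, mu, lam)]
        grad, tail = [0.0] * n, 0.0
        # df/dx_j = -2 eps_j * sum_{k>=j} eta_k (mu_k - lambda_k)
        for j in range(n - 1, -1, -1):
            tail += r[j]
            grad[j] = -2.0 * eps[j] * tail
        # gradient step followed by projection onto the nonnegative orthant
        x = [max(0.0, xj - step * gj) for xj, gj in zip(x, grad)]
    return x

def lam_of(x, eps):
    out, acc = [], 0.0
    for xi, ei in zip(x, eps):
        acc += xi * ei
        out.append(acc)
    return out

mu  = [1, 2, 4, 2, 3]
eta = [0.5, 0.25, 1.0, 0.25, 1.0]
eps = [1, -1, 1, -1, -1]

lam0 = lam_of(x_star(mu, eta, eps), eps)
# perturb the weights slightly and compare
eta2 = [w + 1e-4 for w in eta]
lam1 = lam_of(x_star(mu, eta2, eps), eps)
drift = max(abs(a - b) for a, b in zip(lam0, lam1))
print(lam0)   # approx [1.3333, 1.3333, 4.0, 2.8, 2.8]
print(drift)  # small, consistent with continuity
```

For this example one can verify the KKT conditions by hand: the blocks $\{\mu_1,\mu_2\}$ and $\{\mu_4,\mu_5\}$ get pooled to their weighted means $4/3$ and $2.8$, so $\lambda = (4/3, 4/3, 4, 2.8, 2.8)$ with $x_2 = x_5 = 0$, and the tiny perturbation of $\eta$ moves $\lambda$ only on the order of the perturbation itself.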