$\mathbf{\text {Show}}$ $\mathbf{E[\hat{\beta _1}]=\beta _1}$
I have already been able to prove
$ \hat{\beta_1}= \frac{\sum_{i=1}^n (x_i-\bar{x})\,y_i}{\sum_{i=1}^n (x_i- \bar{x})^2}$
I rewrite it as
$ \hat{\beta_1}= \sum^n_{i=1} c_i y_i$
Taking expectation on both sides:
$ E[\hat{\beta_1}]= E[\sum^n_{i=1} c_i y_i]$
$ E[\hat{\beta_1}]= \sum^n_{i=1} c_i E[y_i]$
$ E[\hat{\beta_1}]= \sum^n_{i=1} c_i [\beta_0 + \beta _1 x_i]$
using $E[y_i] = \beta_0 + \beta_1 x_i$, which follows from the model $y_i = \beta_0 + \beta_1 x_i + u_i$ with $E[u_i] = 0$.
$ E[\hat{\beta_1}]= \beta_0\sum^n_{i=1} c_i + \beta _1 \sum^n_{i=1} c_ix_i$
In order to proceed, I need to show
$\mathbf{\sum^n_{i=1} c_i =0}$ $\mathbf{\text{and}}$ $\mathbf{\sum^n_{i=1} c_ix_i=1}$.
This is where I am getting stuck. (This is not a duplicate of other questions, since it does not involve $E[x_i \mid u_i]$.)
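As a numerical sanity check (not a proof), a quick Monte Carlo simulation with made-up parameters and $x$ values suggests the estimator is indeed unbiased:

```python
import random

random.seed(0)
beta0, beta1 = 2.0, 3.0                  # true parameters (chosen arbitrarily)
xs = [1.0, 2.0, 4.0, 7.0]                # arbitrary fixed regressor values
xbar = sum(xs) / len(xs)
A = sum((x - xbar) ** 2 for x in xs)

# Average beta1_hat over many simulated samples y_i = beta0 + beta1*x_i + u_i,
# where u_i ~ N(0, 1) so that E[u_i] = 0.
total = 0.0
n_reps = 20000
for _ in range(n_reps):
    ys = [beta0 + beta1 * x + random.gauss(0, 1) for x in xs]
    total += sum((x - xbar) * y for x, y in zip(xs, ys)) / A

print(total / n_reps)  # close to 3.0, the true beta1
```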
To show that $\sum c_i=0$ and $\sum c_ix_i=1$, it's important to understand that the expression $$ c_i := \frac{x_i-\bar x}{\sum_{i=1}^n (x_i-\bar x)^2}$$ has a numerator that depends on $i$, but the denominator does not. In the denominator the $i$ is a bound variable (it's an index of summation) so the denominator could just as well be written $\sum_{k=1}^n (x_k-\bar x)^2$, or even viewed as a constant, call it $A$, that doesn't depend on $i$. So for laziness we can write $$c_i := \frac{x_i-\bar x}A. $$ Now calculate, using rules of summation:
$$\sum c_i = \sum \left(\frac{x_i-\bar x}A\right)=\frac1A\left(\sum x_i-\sum\bar x\right),\tag1 $$ where all summations are understood to run from $i=1$ to $n$. To see why (1) equals zero, notice that $\sum\bar x$ is summing $n$ copies of the constant $\bar x$, so it equals $n\bar x$. Finally use the definition $\bar x:=\frac1n\sum x_i$ to conclude $\sum c_i=0$.
Here is a tricky way to show $\sum c_ix_i=1$. Calculate $\sum c_i(x_i-\bar x)$ in two ways: $$\sum c_i(x_i-\bar x)=\sum c_ix_i - \bar x\sum c_i=\sum c_ix_i,\tag2$$ where in the last equality we use $\sum c_i=0$. But also: $$\sum c_i(x_i-\bar x)=\sum \left(\frac{x_i-\bar x}{A} \right)(x_i-\bar x)=\frac{\sum(x_i-\bar x)^2}A=1.\tag3$$ Since (2) and (3) are equal, conclude that $\sum c_ix_i=1$.
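The same sketch checks $\sum c_i x_i = 1$ on the made-up sample:

```python
xs = [1.0, 2.0, 4.0, 7.0]             # same hypothetical data as above
xbar = sum(xs) / len(xs)
A = sum((x - xbar) ** 2 for x in xs)
c = [(x - xbar) / A for x in xs]

# sum c_i*x_i = sum c_i*(x_i - xbar)   (since sum c_i = 0)
#             = sum (x_i - xbar)^2 / A = A/A = 1
s = sum(ci * x for ci, x in zip(c, xs))
print(abs(s - 1.0) < 1e-12)  # True
```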