Why is $Var(\hat{\beta_0}+\hat{\beta_1}x^*)=\frac{1}{n}+(\frac{(x-x^*)^2}{S_{xx}})\sigma^{2}$ ?
I tried to separate it into $Var(\hat{\beta_0})+Var(\hat{\beta_1}x^*)+2Cov(\hat{\beta_0},\hat{\beta_1}x^*)$, but I am not sure how to continue with $Var(\hat{\beta_1}x^*)$.
$\newcommand{\E}{\mathrm{E}}$ $\newcommand{\Var}{\mathrm{Var}}$ $\newcommand{\Cov}{\mathrm{Cov}}$ There might be other ways as well, but here is an outline of how I did it in my Linear Models class (similar to the approach in Applied Linear Statistical Models by Kutner).
I have assumed the model $$Y_i=\beta_0+\beta_1\cdot X_i+\epsilon_i$$ where $\beta_0^*$ and $\beta_1^*$ denote the least-squares estimators, so that $\beta_1=\E\left[\beta_1^*\right]$ and $\beta_0=\E\left[\beta_0^*\right]$.
Thus, my estimated value of $Y$ at $x^*$ is $$\hat{Y}^*=\beta_0^*+\beta_1^*\cdot x^*$$
Note that with some manipulations, you can write $$\beta_1^*=\beta_1+\sum_{i=1}^n{\frac{X_i-\bar{X}}{\sum_{i=1}^n\left(X_i-\bar{X}\right)^2}\epsilon_i}$$
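A sketch of that manipulation: start from the least-squares slope and substitute the model, $$\beta_1^*=\frac{\sum_{i=1}^n\left(X_i-\bar{X}\right)\left(Y_i-\bar{Y}\right)}{\sum_{i=1}^n\left(X_i-\bar{X}\right)^2}=\frac{\sum_{i=1}^n\left(X_i-\bar{X}\right)Y_i}{S_{xx}},$$ since $\sum_{i=1}^n\left(X_i-\bar{X}\right)\bar{Y}=0$. Plugging in $Y_i=\beta_0+\beta_1 X_i+\epsilon_i$ and using $\sum_{i=1}^n\left(X_i-\bar{X}\right)=0$ and $\sum_{i=1}^n\left(X_i-\bar{X}\right)X_i=S_{xx}$ leaves exactly the displayed expression.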
Find $\Var\left(\beta_1^*\right)$ from the above expression. Now you can use the alternate form of the regression which is $$\frac{\hat{Y}^*-\bar{Y}}{x^*-\bar{x}}=\beta_1^*\\\implies\hat{Y}^*=\bar{Y}+\left(x^*-\bar{x}\right)\beta_1^*$$
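That alternate form holds because the fitted line always passes through $\left(\bar{x},\bar{Y}\right)$: the intercept estimator is $\beta_0^*=\bar{Y}-\beta_1^*\bar{x}$, so $$\hat{Y}^*=\beta_0^*+\beta_1^*x^*=\bar{Y}+\beta_1^*\left(x^*-\bar{x}\right).$$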
Thus, we have, $$\Var\left(\hat{Y}^*\right)=\Var\left(\bar{Y}\right)+\left(x^*-\bar{x}\right)^2\cdot\Var\left(\beta_1^*\right)+2\left(x^*-\bar{x}\right)\Cov\left(\bar{Y},\beta_1^*\right)$$
What can be the value of $\Cov\left(\bar{Y},\beta_1^*\right)$?
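As a hint: write $\bar{Y}=\beta_0+\beta_1\bar{X}+\bar{\epsilon}$, so only the noise terms matter, and $$\Cov\left(\bar{Y},\beta_1^*\right)=\Cov\left(\bar{\epsilon},\sum_{i=1}^n\frac{X_i-\bar{X}}{S_{xx}}\epsilon_i\right)=\frac{\sigma^2}{n}\sum_{i=1}^n\frac{X_i-\bar{X}}{S_{xx}}=0,$$ because the deviations $X_i-\bar{X}$ sum to zero.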
Finally, from the above equation, we get $$\Var\left(\hat{Y}^*\right)=\frac{\sigma^2}{n}+\left(x^*-\bar{x}\right)^2\cdot\frac{\sigma^2}{\sum_{i=1}^n\left(X_i-\bar{X}\right)^2}\\ =\sigma^2\left\{\frac{1}{n}+\frac{\left(x^*-\bar{x}\right)^2}{S_{xx}}\right\}$$
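If you want a numerical sanity check, here is a quick Monte Carlo sketch (the parameter values, design points, and $x^*$ below are arbitrary choices of mine, not from the question) comparing the empirical variance of $\hat{Y}^*$ against $\sigma^2\left\{1/n+(x^*-\bar{x})^2/S_{xx}\right\}$:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical true parameters and a fixed design
beta0, beta1, sigma = 2.0, 0.5, 1.0
X = np.linspace(0.0, 10.0, 20)
x_star = 7.5
n = X.size
Sxx = np.sum((X - X.mean()) ** 2)

# Theoretical variance of the fitted value at x_star
var_theory = sigma**2 * (1.0 / n + (x_star - X.mean()) ** 2 / Sxx)

# Monte Carlo: refit the least-squares line under fresh noise many times
M = 100_000
Y = beta0 + beta1 * X + rng.normal(0.0, sigma, size=(M, n))
b1 = (Y - Y.mean(axis=1, keepdims=True)) @ (X - X.mean()) / Sxx  # slopes
b0 = Y.mean(axis=1) - b1 * X.mean()                              # intercepts
var_mc = np.var(b0 + b1 * x_star)

print(f"theory: {var_theory:.5f}, simulation: {var_mc:.5f}")
```

The two numbers should agree to within Monte Carlo error.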
P.S.: There seems to be a typo in the formula you provided: the whole bracket should be multiplied by $\sigma^2$, and the squared term should be $\left(x^*-\bar{x}\right)^2$ rather than $\left(x-x^*\right)^2$.