Let $B_t$, $t\geq0$, be a standard Brownian motion, and define the stopping time $$\tau = \inf\{t\ge 0: B_t^2\geq t+1\}.$$
Is the expected value $E(\tau)$ finite?
Actually, my original problem is the following: $$\gamma = \inf\{t\ge 0:\ B_t = \sqrt{t+1}\quad \text{or}\quad B_t=-1\}.$$ I want to prove $E(\gamma) < \infty$ via $E(\tau) < \infty$.
As commented by @zhoraster, since $B_t^2-t$ is a martingale, the optional stopping theorem would give $$E(B_\tau^2-\tau) = E(B_0^2) = 0.$$ However, $B_\tau^2 = \tau+1$, so $E(B_\tau^2-\tau) = 1$. Does this imply $E(\tau) = \infty$?
So I can't prove $E(\gamma) < \infty$ via $\tau$. A computer simulation suggests that $E(\gamma) < 10$.
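For reference, here is a minimal sketch of such a simulation (an Euler discretization of Brownian motion; the step size, path count, seed, and time cap are illustrative choices, not from any particular program):

```python
import numpy as np

# Monte Carlo estimate of E[gamma] for
#   gamma = inf{t >= 0 : B_t = sqrt(t+1) or B_t = -1},
# via a simple Euler scheme. Parameters are illustrative assumptions.
rng = np.random.default_rng(0)
dt, n_paths, n_steps = 0.01, 5000, 5000   # time cap n_steps*dt = 50

b = np.zeros(n_paths)
gamma = np.full(n_paths, n_steps * dt)    # capped value if never stopped
alive = np.ones(n_paths, dtype=bool)

for k in range(1, n_steps + 1):
    # advance only the paths that have not stopped yet
    b[alive] += np.sqrt(dt) * rng.standard_normal(alive.sum())
    t = k * dt
    stopped = alive & ((b >= np.sqrt(t + 1.0)) | (b <= -1.0))
    gamma[stopped] = t
    alive &= ~stopped
    if not alive.any():
        break

print(gamma.mean())   # estimate of E[gamma]; consistent with E[gamma] < 10
```

Note that capping at a finite horizon only underestimates $E[\gamma]$ (the capped mean is $E[\gamma\wedge 50]$), so the estimate is a lower bound up to discretization error.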
Your guess is correct, but the argument is missing some important details: in order to use the optional sampling theorem, you need uniform integrability, or boundedness of $\tau$.
But it is possible to complete this argument. Assume that $E[\tau]<\infty$ (in particular, $\tau<\infty$ a.s.). Then, by the optional sampling theorem, for each $n\ge 1$, $$ E[B^2_{\tau\wedge n}] = E[\tau\wedge n]. $$ Letting $n\to\infty$, Fatou's lemma (on the left-hand side) and monotone convergence (on the right-hand side) give $$ E[B^2_\tau]\le E[\tau]. $$ But $B^2_\tau = \tau+1$ by path continuity, so, as you noted, this forces $E[\tau]+1\le E[\tau]$, i.e. $1\le 0$, which is absurd.
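The identity $E[B^2_{\tau\wedge n}] = E[\tau\wedge n]$ can be illustrated numerically; in fact it holds exactly for the discretized random walk as well, since its increments also make $S_k^2 - k\,\Delta t$ a martingale. A sketch (step size, path count, and checkpoints are my illustrative choices):

```python
import numpy as np

# Numerical illustration of E[B^2_{tau ∧ n}] = E[tau ∧ n] for
# tau = inf{t : B_t^2 >= t + 1}, with the path frozen at tau.
rng = np.random.default_rng(1)
dt, n_paths = 0.01, 20000
checkpoints = {500: 5.0, 1000: 10.0, 2000: 20.0}   # step index -> time n

b = np.zeros(n_paths)
tau = np.full(n_paths, np.inf)
results = {}

for k in range(1, max(checkpoints) + 1):
    alive = np.isinf(tau)
    b[alive] += np.sqrt(dt) * rng.standard_normal(alive.sum())
    t = k * dt
    hit = alive & (b**2 >= t + 1.0)
    tau[hit] = t                      # b stays frozen at tau afterwards
    if k in checkpoints:
        n = checkpoints[k]
        # b holds B_{tau ∧ n}: stopped value if tau <= n, else current value
        results[n] = (np.mean(b**2), np.mean(np.minimum(tau, n)))

for n, (lhs, rhs) in sorted(results.items()):
    print(f"n={n}: E[B^2] ~ {lhs:.3f}, E[tau ∧ n] ~ {rhs:.3f}")
```

The two columns agree up to Monte Carlo error, while $E[\tau\wedge n]$ keeps growing with $n$, consistent with $E[\tau]=\infty$.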
This answers the question as originally posted.
For a modified question (with lower bound being $-1$), the expectation of the stopping time is finite.
Idea: first, it can be shown that the stopping time $$ \sigma = \inf\left\{s\ge 0: |B_s| = \frac12(1+\sqrt{1+s})\right\} $$ is integrable.
Indeed, for any $t\ge 0$, $$ 0=E\left[B^2_{\sigma\wedge t}-\sigma\wedge t\right]\le E\left[\frac14\left(1+\sqrt{1+\sigma\wedge t}\right)^2-\sigma\wedge t\right]\\= \frac12 + E\left[\frac12\sqrt{1+\sigma\wedge t}-\frac34(\sigma\wedge t)\right] \le 1 - \frac12 E[\sigma\wedge t], $$ where the last inequality uses $\sqrt{1+s}\le 1+\frac s2$. Hence $E[\sigma\wedge t]\le 2$, and letting $t\to+\infty$, we get $E[\sigma]\le 2$ by monotone convergence.
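The bound $E[\sigma]\le 2$ can be checked by simulation. A hedged sketch, using the same Euler scheme as above (parameters are illustrative choices):

```python
import numpy as np

# Monte Carlo check of the bound E[sigma] <= 2 for
#   sigma = inf{s : |B_s| = (1 + sqrt(1+s))/2}.
rng = np.random.default_rng(2)
dt, n_paths, n_steps = 0.01, 20000, 5000   # time cap 50

b = np.zeros(n_paths)
sigma = np.full(n_paths, n_steps * dt)     # capped value if never stopped
alive = np.ones(n_paths, dtype=bool)

for k in range(1, n_steps + 1):
    b[alive] += np.sqrt(dt) * rng.standard_normal(alive.sum())
    s = k * dt
    stopped = alive & (np.abs(b) >= 0.5 * (1.0 + np.sqrt(1.0 + s)))
    sigma[stopped] = s
    alive &= ~stopped
    if not alive.any():
        break

print(sigma.mean())   # estimate of E[sigma]; stays below the proved bound of 2
```

In fact, applying optional sampling with equality at $\sigma$ and Jensen's inequality to $E[\sqrt{1+\sigma}]$ gives the sharper bound $E[\sigma]\le 16/9$, so the estimate should land comfortably below $2$.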
Now, for each $t>0$, $$ P(\gamma>t) = P(W \in G_t) $$ with $$ G_t = \big\{f: \forall s\in[0,t]\ f(s)\in (-1,\sqrt{1+s})\big\}. $$ Similarly, for each $t>0$, $$ P(\sigma>t) = P(W \in S_t). $$ It is easy to see that $S_t = \frac12(G_t - G_t)$ (a Minkowski sum of sets). Then, by the log-concavity of the distribution of $W$, $$ P(\sigma>t) = P(W \in S_t) \ge \left(P(W \in G_t)\,P(W \in -G_t)\right)^{1/2} = P(W\in G_t) = P(\gamma>t), $$ where we have used the symmetry of $W$. Therefore, $\gamma$ also has a finite expectation; moreover, $E[\gamma]\le E[\sigma]\le 2$.
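The survival-function comparison $P(\sigma>t)\ge P(\gamma>t)$ can also be checked empirically. A sketch simulating both stopping times with the same Euler scheme (the helper `survival` and all parameters are my illustrative choices):

```python
import numpy as np

# Empirical check of P(sigma > t) >= P(gamma > t) at a few times t.
rng = np.random.default_rng(3)
dt, n_paths, n_steps = 0.01, 20000, 1000   # simulate up to time 10

def survival(lower, upper, checkpoints):
    """Fraction of paths still inside (lower(s), upper(s)) at each checkpoint."""
    b = np.zeros(n_paths)
    alive = np.ones(n_paths, dtype=bool)
    surv = {}
    for k in range(1, n_steps + 1):
        b[alive] += np.sqrt(dt) * rng.standard_normal(alive.sum())
        s = k * dt
        alive &= (b > lower(s)) & (b < upper(s))
        if k in checkpoints:
            surv[round(s, 6)] = alive.mean()
    return surv

cps = {100, 400, 1000}   # times 1, 4, 10
surv_gamma = survival(lambda s: -1.0, lambda s: np.sqrt(1.0 + s), cps)
surv_sigma = survival(lambda s: -0.5 * (1.0 + np.sqrt(1.0 + s)),
                      lambda s: 0.5 * (1.0 + np.sqrt(1.0 + s)), cps)

for t in sorted(surv_gamma):
    print(t, surv_sigma[t], surv_gamma[t])
```

Up to Monte Carlo noise, the symmetric barrier's survival probability dominates at every checkpoint, as the log-concavity argument predicts.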