Let $x_{n} = 0$ if $n < 100$ and $x_{n} = 1$ if $n \geq 100$. Prove that $x_n$ converges and find its limit.
I started by letting $\epsilon > 0$, as usual, and choosing $N = 100$ and $L = 1$.
Then for every $n \geq N$ we have $x_n = 1$ by how $x_n$ was defined, so $|x_n - L| = |1 - 1| = 0 < \epsilon$. Therefore $x_n$ converges and the limit equals one.
I have also been asked: let $x_n = \frac{n-1}{n}$; prove $x_n$ converges and find the limit.
I worked this one out in a similar way, but I feel like my method is wrong. For instance, I know from past calculus classes that the limit is 1, but that doesn't mean I can prove it. In class, the problems we solved were of the form "$x_n = \frac{1}{n}$: prove $x_n \to 0$," where we were given the value of $L$ up front. I'm not comfortable pulling an $L$ (seemingly) out of nowhere and using it in the proof, like I did above with $L = 1$. It seems too easy.
There is nothing wrong here.
By definition, $L$ is just the value you claim the sequence converges to. If, for every "challenge" $\epsilon > 0$ (the absolute error between $x_n$ and $L$ must be less than $\epsilon$), you can always find an $N \in \mathbb{N}$ such that $|x_n - L| < \epsilon$ for all $n \geq N$, then the error between the terms of the sequence and $L$ eventually becomes smaller than any prescribed bound, and we say $L$ is the limit of the sequence.
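To see this in action on your second sequence, here is a sketch of the standard $\epsilon$–$N$ argument. The choice $L = 1$ is not pulled out of nowhere: it is a guess read off from the terms, which the proof then verifies.

Let $\epsilon > 0$ and guess $L = 1$. Then
$$|x_n - 1| = \left|\frac{n-1}{n} - 1\right| = \frac{1}{n}.$$
Choose $N \in \mathbb{N}$ with $N > \frac{1}{\epsilon}$ (possible by the Archimedean property). Then for all $n \geq N$,
$$|x_n - 1| = \frac{1}{n} \leq \frac{1}{N} < \epsilon,$$
so $x_n \to 1$.

The pattern is always the same: compute $|x_n - L|$, simplify it, and solve the resulting inequality for $n$ to find an $N$ that works.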