From Wikipedia and my textbook(*), the definition of a.s. convergence is as follows:
$X_n$ almost surely converges to $X$ if \begin{equation} P\left(\left\{\omega \in \Omega:\lim_{n\to \infty} X_n(\omega) = X(\omega)\right\}\right) = 1. \end{equation}
In this definition, $\omega$ appears explicitly. (There are also definitions of a.s. convergence that do not mention $\omega$ at all; this is another thing I do not understand.)
However, some problems in my homework confused me a lot. Neither problem mentions the sample space $\Omega$. The second problem does not even say that $X_n$ almost surely converges to some $X$; it simply asks the reader to judge whether $X_n$ converges almost surely. This is what I cannot understand.
Another related question also confused me. Consider a sequence of random variables $X_n$ defined by \begin{equation} X_n = \begin{cases} 0, &0<\omega\le \frac{1}{2} \\ 1, &\frac{1}{2}<\omega \le 1 \end{cases} \quad \text{for even } n, \qquad X_n = \begin{cases} 1, &0<\omega\le \frac{1}{2} \\ 0, &\frac{1}{2}<\omega \le 1 \end{cases} \quad \text{for odd } n. \end{equation} Of course $X_n$ converges neither a.s. nor in probability. However, if the sample space $\Omega$ is omitted, the same sequence is described as \begin{equation} X_n = \begin{cases} 0, &\text{with probability } \frac{1}{2} \\ 1, &\text{with probability } \frac{1}{2} \end{cases} \end{equation}
Intuitively, this $X_n$ "must converge", since its distribution does not depend on $n$ at all, which contradicts the previous conclusion.
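To see the oscillation concretely, here is a small simulation sketch (my own illustration, not part of the original problem): it fixes one sample point $\omega$ and evaluates the $\omega$-dependent alternating sequence along it.

```python
import random

def X(n, omega):
    """The omega-dependent definition above: values alternate with the parity of n."""
    if n % 2 == 0:                      # n even
        return 0 if omega <= 0.5 else 1
    return 1 if omega <= 0.5 else 0    # n odd

omega = random.uniform(0, 1)            # fix ONE sample point omega
path = [X(n, omega) for n in range(1, 11)]
print(path)  # strictly alternating 0/1 values: no pointwise limit
```

Every $X_n$ here is Bernoulli$(1/2)$, yet no sample path converges, which is exactly the information that the "with probability $\frac12$" description throws away.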
Those are my questions, which can be summarized by the title. Could anyone help me? Thanks in advance.
* hajek.ece.illinois.edu/ECE534Notes.html
A random variable $X$ with values in $\mathbb{R}$ is always to be understood as a measurable map from a probability space $(\Omega,\mathcal{F},P)$ into $\mathbb{R}$. Then by definition $X_n \to X$ ($P$-)almost surely if $$ P(\{\omega\in \Omega: \lim_{n\to\infty}X_n(\omega)=X(\omega)\})=1. $$ That is, the sequence of mappings $(X_n)$ converges pointwise to the map $X$ for all $\omega$ in a set of $P$-measure one. However, one usually omits mentioning the background probability space, and also the $\omega$-dependence of the random variables. Whenever the background probability space is omitted, one formally (between the lines) considers some probability space and mappings satisfying the prescribed properties of the given random variables.
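As a concrete instance of this definition (my own illustrative example, not from the notes): on $((0,1],\mathcal{B}((0,1]),\lambda)$ the maps $X_n(\omega)=\omega^n$ converge to $0$ for every $\omega<1$ but not at $\omega=1$; since $\{1\}$ has Lebesgue measure zero, $X_n\to 0$ almost surely even though the convergence is not pointwise everywhere. A numerical sketch:

```python
def X(n, omega):
    """X_n(omega) = omega**n, viewed as a map on the sample space (0, 1]."""
    return omega ** n

# For omega < 1 the sequence dies out; on the measure-zero set {1} it does not.
print(X(1000, 0.9))   # essentially 0
print(X(1000, 1.0))   # stays at 1.0 for every n
```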
Regarding the example where you omit the background space: the description is flawed. If we let the background space be $((0,1],\mathcal{B}((0,1]),\lambda_{(0,1]})$ and $(X_n)$ be a sequence of mappings from this probability space to $\mathbb{R}$ defined by \begin{equation} X_n = \begin{cases} 0, &0<\omega\le \frac{1}{2} \\ 1, &\frac{1}{2}<\omega \le 1 \end{cases} \end{equation} when $n$ is even and \begin{equation} X_n = \begin{cases} 1, &0<\omega\le \frac{1}{2} \\ 0, &\frac{1}{2}<\omega \le 1 \end{cases} \end{equation} when $n$ is odd, then this sequence does not converge. But if we instead define \begin{equation} X_n = \begin{cases} 0, &0<\omega\le \frac{1}{2} \\ 1, &\frac{1}{2}<\omega \le 1 \end{cases} \end{equation} for all $n\in\mathbb{N}$, it does converge. Both sequences satisfy the distribution-only description, yet as you said the first does not converge while the latter does. Hence in this case additional information needs to be given (or assumed) before one can determine convergence or non-convergence.
Regarding your homework problems: there the omission of the background space does not create multiple possible solutions. What saves the day is that one value of the random variables becomes more and more probable as $n$ tends to infinity. In the case of the second exercise, let $X=0$ be the constant mapping and note that, e.g. by Lemma 1.2.12, $$ P(|X_n-X|>\epsilon)= P(|X_n-0|> \epsilon)=P(X_n=1) = e^{-n} $$ for any $\epsilon\in(0,1)$ (and the probability is $0$, hence even smaller, for $\epsilon\ge 1$). Hence $$ \sum_{n=1}^\infty P(|X_n-X|>\epsilon)=\sum_{n=1}^\infty e^{-n}=\frac{e^{-1}}{1-e^{-1}}<\infty \quad \forall \epsilon >0, $$ which by the first Borel–Cantelli lemma implies $X_n\stackrel{a.s.}{\longrightarrow}X$.
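As a sanity check on the geometric sum (a small numerical sketch, not part of the argument itself), the partial sums of $\sum_{n\ge1} e^{-n}$ indeed approach the closed form $e^{-1}/(1-e^{-1})$:

```python
import math

# Partial sum of the geometric series sum_{n>=1} e^{-n};
# the tail beyond n = 200 is negligible in double precision.
partial = sum(math.exp(-n) for n in range(1, 200))
closed = math.exp(-1) / (1 - math.exp(-1))
print(partial, closed)   # both approximately 0.5820
```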