I am getting stuck on some silly manipulation of expected values in this problem.
Let $X(\omega)=\sum_{i=1}^{n} \chi_{i}(\omega)$ where each $\chi_{i}(\omega)$ has the uniform distribution on the set $\{1,...,n\}$. Find the variance of $X(\omega)$.
The first part of the problem was to find the expected value of $X(\omega)$, which I figured out fine. I am making some mistake in computing the variance. So far I have:
$\newcommand{\Var}{\operatorname{\mathsf{Var}}} \newcommand{\E}{\operatorname{\mathsf{E}}} \begin{align}\Var(X(\omega)) & =\E\Bigl(\bigl(X(\omega)-\E\bigl(X(\omega)\bigr)\bigr)^2\Bigr) \\ & =\E\Bigl(\bigl(\sum_{i=1}^{n} \chi_{i}(\omega)-\tfrac{1}{n}\bigr)^2\Bigr)\\ & =\sum_{i=1}^{n}\E\Bigl(\bigl(\chi_{i}(\omega)-\tfrac{1}{n}\bigr)^2\Bigr)+2\sum_{i<j}\E\Bigl(\bigl(\chi_{i}(\omega)-\tfrac{1}{n}\bigr)\bigl(\chi_{j}(\omega)-\tfrac{1}{n}\bigr)\Bigr) \\ & = 1-\tfrac{1}{n}+2\sum_{i<j}\E\Bigl(\chi_{i}(\omega)\chi_{j}(\omega)-\tfrac{1}{n}\chi_{j}(\omega)-\tfrac{1}{n}\chi_{i}(\omega)+\tfrac{1}{n^2}\Bigr)\end{align}$
I know the sum of these mixed terms should be $\frac{1}{n}$, but I can't figure out how to show it. Any help would be appreciated.
Take the random variables to be implicit functions of the outcomes, $\omega$, to avoid bloated notation.
$\newcommand{\Chi}{\raise{0.5ex}{\chi}}\newcommand{\E}{\operatorname{\mathsf E}}\newcommand{\Var}{\operatorname{\mathsf {Var}}}\newcommand{\Cov}{\operatorname{\mathsf {Cov}}}$Use the fact that the $(\Chi_i)_{i\in\{1..n\}}$ are independent and identically distributed.
$\E[{\sum}_i \Chi_i^2]=n\E[\Chi_1^2]$ and $\E[{\sum}_{i\neq j}\Chi_i\Chi_j]=n(n-1)\E[\Chi_1]^2$
Thus:
$\begin{align} \Var [X] &= \E[(X-\E[X])^2] \\[1ex] & = \E[X^2-2X\E[X]+\E[X]^2] \\[1ex] & = \E[X^2]-2\E[X]^2+\E[X]^2\\[1ex] &= \E[X^2]-\E[X]^2 \\[1ex] & = \E[{\sum}_i {\sum}_j \Chi_i\Chi_j] - (\E[{\sum}_k \Chi_k])^2 \\[1ex] & = n\E[\Chi_1^2]+n(n-1)\E[\Chi_1]^2 -(n\E[\Chi_1])^2 \\[1ex] & = n\Var[\Chi_1]+n\E[\Chi_1]^2+n(n-1)\E[\Chi_1]^2 - n^2\E[\Chi_1]^2 \\[2ex]\therefore\quad \Var[X] & = n\Var[\Chi_1] \end{align}$
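For the particular distribution in the question, the standard moments of the discrete uniform, $\E[\Chi_1]=\frac{n+1}2$ and $\E[\Chi_1^2]=\frac{(n+1)(2n+1)}6$, give:

$\begin{align}\Var[\Chi_1] &= \frac{(n+1)(2n+1)}{6}-\Bigl(\frac{n+1}{2}\Bigr)^2=\frac{n^2-1}{12}, & \text{so}\quad \Var[X] &= \frac{n(n^2-1)}{12}.\end{align}$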
PS: If the chi variables are not independent for some reason, then the extra terms do not cancel and a covariance term naturally appears, since instead $\E[{\sum}_{i\neq j}\Chi_i\Chi_j]=n(n-1)\E[\Chi_1\Chi_2]$:
$\begin{align} \Var [X] &= \E[{\sum}_i {\sum}_j \Chi_i\Chi_j] - (\E[{\sum}_k \Chi_k])^2 \\[1ex] & = n\E[\Chi_1^2]+n(n-1)\E[\Chi_1\Chi_2] -(n\E[\Chi_1])^2 \\[1ex] & = n\Var[\Chi_1]+n\E[\Chi_1]\E[\Chi_2]+n(n-1)\E[\Chi_1\Chi_2] - n^2\E[\Chi_1]\E[\Chi_2] &~\because~& \E[\Chi_1]^2=\E[\Chi_1]\E[\Chi_2] \\[2ex]\therefore\quad \Var[X] & = n\Var[\Chi_1]+n(n-1)\Cov[\Chi_1,\Chi_2] \end{align}$
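The i.i.d. result $\Var[X]=n\Var[\Chi_1]$ can be checked by brute force for a small $n$, enumerating all $n^n$ equally likely outcome tuples (a sketch in Python; the function name is mine, not from the problem):

```python
from fractions import Fraction
from itertools import product

def exact_var_of_sum(n):
    # X = chi_1 + ... + chi_n with each chi_i uniform on {1, ..., n};
    # enumerate all n^n equally likely outcome tuples and compute Var[X] exactly.
    outcomes = list(product(range(1, n + 1), repeat=n))
    p = Fraction(1, len(outcomes))
    mean = sum(Fraction(sum(t)) * p for t in outcomes)
    return sum((sum(t) - mean) ** 2 * p for t in outcomes)

n = 4
var_chi1 = Fraction(n * n - 1, 12)          # Var of one uniform{1..n} draw
assert exact_var_of_sum(n) == n * var_chi1  # agrees with n * Var[chi_1]
```

Exact rational arithmetic avoids any floating-point tolerance in the comparison.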
PPS: If $\Chi_1\sim\mathcal U\{1..n\}$ then $\E[\Chi_1] =\frac{n+1}2$, not $\frac 1n$. You should anticipate that the average cannot be less than the smallest value that can be realised.
You appear to be thinking of $\mathsf P(\Chi_1=k)= \frac 1n$ for all $k\in\{1..n\}$; that is the probability mass function of the discrete uniform distribution, not its mean.
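The two quantities are easy to compare directly; a minimal sketch (the variable names are mine), taking $n=6$:

```python
from fractions import Fraction

n = 6
support = range(1, n + 1)
pmf_value = Fraction(1, n)                  # P(chi_1 = k) = 1/n for each k
mean = sum(k * pmf_value for k in support)  # E[chi_1] = (n + 1) / 2

assert mean == Fraction(n + 1, 2)           # 7/2, not 1/6
```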