I was recently working on the following Tripos problem and ran into several roadblocks:
The random variables $S_1,...,S_n$ take on values in $\{\pm1\}$, and follow the probability distribution $$\mathbb{P}(S_1=s_1,...S_n=s_n)=\frac1{Z_{n,\beta}}\exp\left(\frac{\beta}{2n}\sum_{i=1}^n\sum_{j=1}^ns_is_j\right),$$ where $\beta$ is a positive constant and $Z_{n,\beta}$ is the normalisation constant $$Z_{n,\beta}=\sum_{s_1\in\{\pm1\}}...\sum_{s_n\in\{\pm1\}}\exp\left(\frac{\beta}{2n}\sum_{i=1}^n\sum_{j=1}^ns_is_j\right).$$
i) Show that $\mathbb{E}(S_i)=0$ $\forall$ $i$; ii) Show that $\mathbb{P}(S_2=+1|S_1=+1)\geq\mathbb{P}(S_2=+1)$ (you may quote without proof the result $\mathbb{E}(S_iS_j)\geq0$ $\forall$ $i$, $j$); iii) We define the random variable $M$ as $$M=\frac1{n}\sum_{i=1}^nS_i.$$ Show that $M$ takes values in the set $$E_n=\left\{\frac{2k}n-1:k\in\mathbb{Z},\ 0\leq k\leq n\right\},$$ and that for each $m\in E_n$, the number of possible values of $(S_1,...,S_n)$ such that $M=m$ is $$\frac{n!}{\displaystyle\left[\frac{(1+m)n}2\right]!\left[\frac{(1-m)n}2\right]!}.$$ Find $\mathbb{P}(M=m)$ for any $m\in E_n$.
For part i), I tried to use that for a particular $k$, $$\mathbb{E}(S_k)=\frac{\displaystyle\sum_{s_1\in\{\pm1\}}...\sum_{s_n\in\{\pm1\}}s_k\exp\left(\frac{\beta}{2n}\sum_{i=1}^n\sum_{j=1}^ns_is_j\right)}{\displaystyle\sum_{s_1\in\{\pm1\}}...\sum_{s_n\in\{\pm1\}}\exp\left(\frac{\beta}{2n}\sum_{i=1}^n\sum_{j=1}^ns_is_j\right)}.$$ I have tried expanding it out, working through small values of $n$, and pulling the sum in the exponent out by converting it into a product, but all of these led me nowhere.
I have no idea how to approach part ii). The hint $\mathbb{E}(S_iS_j)\geq0$ made me think of some variance-based inequality like Chebyshev's, but this was not successful, and neither was using Bayes' theorem (although my hunch is that it has to be used somehow). Additionally, out of curiosity, how does one prove $\mathbb{E}(S_iS_j)\geq0$?
For part iii), my thought process is: suppose there are $k$ spins equal to $+1$ and $n-k$ equal to $-1$. Then, the particular value of $M$ is $$m=\frac1n[k-(n-k)]=\frac{2k}{n}-1.$$ When $M=m$, $k=n(1+m)/2$, $n-k=n[2-(1+m)]/2=n(1-m)/2$, and we can interpret the quantity requested as the number of arrangements of $n(1+m)/2$ $+1$'s and $n(1-m)/2$ $-1$'s, which is simply $$\binom{n}{k}=\frac{n!}{k!(n-k)!}=\boxed{\frac{n!}{\displaystyle\left[\frac{(1+m)n}2\right]!\left[\frac{(1-m)n}2\right]!}}.$$ Following from that, I am unsure if this next idea is correct. Symmetry under exchange of $i$ and $j$ suggests to me that each configuration is equally likely. This would be the binomial distribution, so $$\mathbb{P}(M=m)=\binom{n}{k}\left(\frac12\right)^{n}=\frac{n!}{2^n\displaystyle\left[\frac{(1+m)n}2\right]!\left[\frac{(1-m)n}2\right]!}.$$ This feels like I am brushing many details under the rug, could someone confirm this for me?
Let us denote $H(s)=\frac{1}{2n}\sum_{i=1}^{n}\sum_{j=1}^{n}s_is_j$, for convenience.
For part (i), note that as you've observed: $$\mathbb{E}S_k=\frac{1}{Z}\sum_{s\in\{\pm1\}^n}s_ke^{\beta H(s)}.$$ On the other hand, since $H(-s)=H(s)$, the substitution $(s_1,\dots, s_n)\mapsto (-s_1,\dots, -s_n)$ permutes the terms of the sum while sending $s_ke^{\beta H(s)}\mapsto -s_ke^{\beta H(s)}$. The terms therefore cancel in pairs and we are left with $\mathbb{E} S_k=0$.
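As a sanity check of this cancellation argument, one can enumerate all $2^n$ configurations for a small $n$ and compute $\mathbb{E}S_k$ directly from the definition. A minimal Python sketch (the values of $n$ and $\beta$ and the function name are my own arbitrary choices, not part of the problem):

```python
# Brute-force check of part (i): enumerate all 2^n configurations
# and compute E[S_k] directly from the definition.
from itertools import product
from math import exp

def expectation_S_k(n, beta, k):
    num = Z = 0.0
    for s in product([1, -1], repeat=n):
        w = exp(beta / (2 * n) * sum(s) ** 2)  # sum_i sum_j s_i s_j = (sum_i s_i)^2
        Z += w
        num += s[k] * w
    return num / Z

print(expectation_S_k(4, 1.3, 0))  # ~0, up to floating-point rounding
```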
For part (ii), note that: $$\mathbb{E}S_1S_2=\mathbb{P}(S_1=1,S_2=1)+\mathbb{P}(S_1=-1,S_2=-1)-\mathbb{P}(S_1=-1,S_2=1)-\mathbb{P}(S_1=1,S_2=-1).$$ By the same flip symmetry $s\mapsto -s$ as in part (i), the first two probabilities are equal, as are the last two, so this reduces to: $$2\big(\mathbb{P}(S_1=1,S_2=1)-\mathbb{P}(S_1=-1,S_2=1)\big).$$ By the law of total probability and $\mathbb{P}(S_2=1)=\frac12$ (which follows from part (i)), we have $\mathbb{P}(S_1=-1,S_2=1)=\mathbb{P}(S_2=1)-\mathbb{P}(S_1=1,S_2=1)=\frac12-\mathbb{P}(S_1=1,S_2=1)$, and hence $$\mathbb{E}S_1S_2=2\left(2\,\mathbb{P}(S_1=1,S_2=1)-\frac{1}{2}\right).$$ As $\mathbb{E}S_1 S_2\ge 0$, we see that $\mathbb{P}(S_1=1,S_2=1)\ge \frac{1}{4}$. From part (i) we also have $\mathbb{P}(S_1=1)=\frac{1}{2}$, so by the definition of conditional probability $$\mathbb{P}(S_2=1\mid S_1=1)=\frac{\mathbb{P}(S_1=1,S_2=1)}{\mathbb{P}(S_1=1)}=2\,\mathbb{P}(S_1=1,S_2=1)\ge\frac12=\mathbb{P}(S_2=1),$$ which is the desired result.
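The two inequalities in this argument can also be checked numerically by enumeration for a small $n$. A brute-force sketch (again, $n$ and $\beta$ are arbitrary test values of my own choosing):

```python
# Brute-force check of part (ii): compute P(S1=1, S2=1) and P(S2=1)
# by enumerating all 2^n configurations.
from itertools import product
from math import exp

def joint_and_marginal(n, beta):
    Z = p11 = p2 = 0.0
    for s in product([1, -1], repeat=n):
        w = exp(beta / (2 * n) * sum(s) ** 2)  # sum_i sum_j s_i s_j = (sum_i s_i)^2
        Z += w
        if s[0] == 1 and s[1] == 1:
            p11 += w
        if s[1] == 1:
            p2 += w
    return p11 / Z, p2 / Z

p11, p2 = joint_and_marginal(5, 0.8)
print(p11 >= 0.25)      # True: equivalent to E[S1 S2] >= 0
print(p11 / 0.5 >= p2)  # True: P(S2=1 | S1=1) >= P(S2=1), using P(S1=1) = 1/2
```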
For part (iii), your counting argument is correct, but the configurations are not all equally likely when $\beta>0$; the symmetry only makes configurations with the same value of $M$ equally likely. Since $$\sum_{i=1}^n\sum_{j=1}^ns_is_j=\left(\sum_{i=1}^ns_i\right)^2=n^2m^2,$$ each configuration with $M=m$ has probability $e^{\beta nm^2/2}/Z_{n,\beta}$, and therefore $$\mathbb{P}(M=m)=\frac{n!}{\displaystyle\left[\frac{(1+m)n}2\right]!\left[\frac{(1-m)n}2\right]!}\cdot\frac{e^{\beta nm^2/2}}{Z_{n,\beta}}.$$ Your binomial formula is recovered only in the case $\beta=0$.
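One can sanity-check the distribution of $M$ by enumeration for a small $n$, comparing the exact probabilities against the closed form $\binom{n}{k}e^{\beta nm^2/2}/Z_{n,\beta}$ with $k=(1+m)n/2$, which uses $\sum_i\sum_j s_is_j=(nm)^2$. A sketch with arbitrary test values of $n$ and $\beta$:

```python
# Brute-force check of part (iii): compare the exact distribution of M,
# obtained by enumerating all 2^n configurations, against the closed form
#   P(M = m) = C(n, k) * exp(beta*n*m^2 / 2) / Z,   k = (1+m)n/2.
from itertools import product
from math import exp, comb

n, beta = 6, 1.1

Z = 0.0
weight = {}  # total weight of configurations with a given value of sum_i s_i
for s in product([1, -1], repeat=n):
    w = exp(beta / (2 * n) * sum(s) ** 2)
    Z += w
    weight[sum(s)] = weight.get(sum(s), 0.0) + w

checks = []
for k in range(n + 1):
    m = (2 * k - n) / n                # m = 2k/n - 1
    exact = weight[2 * k - n] / Z      # P(M = m) by enumeration
    closed = comb(n, k) * exp(beta * n * m * m / 2) / Z
    checks.append(abs(exact - closed) < 1e-12)

print(all(checks))  # True
```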