Let $S$ be the sample standard deviation, based on a random sample of size $n$ from a distribution with pdf $f(x;\mu,\sigma^2)$ with mean $\mu$ and variance $\sigma^2$.
a) Show that $E(S) \le \sigma$, where equality holds iff $f(x;\mu,\sigma^2)$ is degenerate at $\mu$, i.e. $P[X = \mu] = 1$. Hint: Consider $Var(S)$.
b) If $X_{i} \sim N(\mu,\sigma^2)$ find a constant $c$ such that $cS$ is an unbiased estimator of $\sigma$. Hint: Use the fact that $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2(n-1)$ and $S = (S^2)^{\frac{1}{2}}$
For part a) here is what I did
$$Var(S) = E(S^2) - [E(S)]^2 \ge 0$$ and since $S^2$ is unbiased for the variance, $E(S^2) = \sigma^2$,
so $$\sigma^2 = E(S^2) \ge [E(S)]^2 \implies E(S) \le \sigma$$
They are equal when $Var(S) = 0$, i.e. $E[(S - E(S))^2] = 0$, so $S$ is constant almost surely, which is the case when $X$ is degenerate (and then $S = \sigma = 0$).
Can anyone check my answer for part a) and help me with part b)?
(a)
You have that $S = \sqrt{\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2}$.
Taking the expectation of this we get $E[S] = E\left[\sqrt{\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2}\right]$. The square root is a concave function, so you can use Jensen's inequality to upper bound the expectation as:
\begin{equation} \begin{split} E[S] & = E\left[\sqrt{\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2}\right]\\ & \leq\sqrt{E\left[\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2\right]}\\ & = \sqrt{\sigma^2}\\ & = \sigma \end{split} \end{equation}
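This bound can also be checked numerically. Here is a minimal Monte Carlo sketch (the helper name `sample_sd` and the choices $n=5$, $\sigma=2$ are mine, not from the problem) estimating $E[S]$ for normal samples:

```python
import math
import random

# Monte Carlo sketch (not part of the proof): estimate E[S] for normal
# samples and check that it falls below sigma, as Jensen's inequality predicts.
# The sample size n and the replication count are arbitrary choices.

def sample_sd(xs):
    """Sample standard deviation with the (n-1) divisor."""
    n = len(xs)
    xbar = sum(xs) / n
    return math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))

random.seseed = None  # (no-op placeholder removed below)
random.seed(0)
mu, sigma, n, reps = 0.0, 2.0, 5, 200_000
mean_S = sum(sample_sd([random.gauss(mu, sigma) for _ in range(n)])
             for _ in range(reps)) / reps
print(mean_S, sigma)  # mean of S sits noticeably below sigma for small n
```

For small $n$ the downward bias is substantial; it shrinks as $n$ grows, since $S$ is consistent for $\sigma$.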
We also have that $Var[S] = E[S^2] - E[S]^2$. Note that $S^2$ is unbiased for the variance, thus $E[S^2] = \sigma^2$. Now suppose equality holds, $E[S] = \sigma$; then $E[S]^2 = \sigma^2$, which implies $Var[S] = 0$.
Since $Var[S] = 0$, $S$ equals its mean almost surely: $P[S = \sigma] = 1$.
To show the distribution must be degenerate, suppose it were not. Then there are two disjoint intervals $A$ and $B$, separated by a positive distance, each with positive probability under $f$. With positive probability all $n$ observations fall in $A$, and (shrinking $A$ if necessary) this forces $S$ close to $0$; with positive probability some observations fall in $A$ and some in $B$, which forces $S$ bounded away from $0$. So $S$ would take values in two disjoint ranges, each with positive probability, contradicting $P[S = \sigma] = 1$.
Hence $X$ is degenerate at some point $c$; since $E[X] = \mu$, we get $c = \mu$, i.e. $P[X = \mu] = 1$. In that case $S \overset{a.s.}{=} 0$ and $\sigma = 0$, so $E[S] = \sigma$ indeed holds with equality.
(b) We have that $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$. Note that the expectation of a $\chi^2_{n-1}$ variable is $(n-1)$. Dividing both sides by $\frac{(n-1)}{\sigma^2}$ gives: \begin{equation} S^2 \sim \frac{\sigma^2}{(n-1)}\chi^2_{n-1} \Rightarrow S \sim \frac{\sigma}{\sqrt{n-1}}\chi_{n-1} \end{equation}
So it remains to compute the mean of a $\chi_{n-1}$ (chi, not chi-squared) variable. If $W \sim \chi^2_{n-1}$, integrating against the chi-squared density gives \begin{equation} E[\sqrt{W}] = \int_0^\infty w^{1/2}\,\frac{w^{\frac{n-1}{2}-1}e^{-w/2}}{2^{\frac{n-1}{2}}\Gamma\!\left(\frac{n-1}{2}\right)}\,dw = \sqrt{2}\,\frac{\Gamma\!\left(\frac{n}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)} \end{equation}
Therefore, by linearity of expectation, \begin{equation} E[S] = \frac{\sigma}{\sqrt{n-1}}E[\sqrt{W}] = \sigma\sqrt{\frac{2}{n-1}}\,\frac{\Gamma\!\left(\frac{n}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)} \end{equation}
and the unbiasing constant is \begin{equation} c = \sqrt{\frac{n-1}{2}}\,\frac{\Gamma\!\left(\frac{n-1}{2}\right)}{\Gamma\!\left(\frac{n}{2}\right)}, \qquad E[cS] = \sigma. \end{equation}
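As a quick sanity check on the constant (not part of the derivation), it can be evaluated with `math.gamma`; the function name `unbias_c` is made up for this illustration:

```python
import math

# Sketch checking part (b): with (n-1)S^2/sigma^2 ~ chi-squared(n-1),
# E[S] = sigma * sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2),
# so the unbiasing constant c is the reciprocal of that ratio.

def unbias_c(n):
    """c such that E[c * S] = sigma for a normal sample of size n."""
    return math.sqrt((n - 1) / 2) * math.gamma((n - 1) / 2) / math.gamma(n / 2)

# n = 2: Gamma(1/2) = sqrt(pi) and Gamma(1) = 1, so c = sqrt(pi/2)
print(unbias_c(2), math.sqrt(math.pi / 2))
# c shrinks toward 1 as n grows, since S is asymptotically unbiased
print(unbias_c(10), unbias_c(100))
```

The $n = 2$ case gives $c = \sqrt{\pi/2}$ exactly, and $c \to 1$ as $n \to \infty$, consistent with $S$ being asymptotically unbiased for $\sigma$.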