The question goes as follows:
A shoe factory produces brown shoes and black shoes. They look the same and differ only in their weight distributions. Brown shoes have their weight distributed as Normal$(\mu = 7 \space\text{lbs}, \sigma^2)$ and black shoes as Normal$(\mu = 8 \space\text{lbs}, \sigma^2)$. They are produced in equal proportions, and the black shoes are, on average, 1 lb heavier than the brown shoes.
a) Write an equation for the prior predictive distribution of the weight of a randomly selected pair of shoes.
b) You are provided a shoe, and find that it weighs 7 lbs. Find the conditional probability that it's a brown shoe, when $\sigma = 1$.
c) Explain how the probability that a 7 lbs shoe is brown will vary as $\sigma$ increases and decreases.
I am mostly interested in parts (a) and (b); for (c), I think the answer is that as $\sigma$ decreases, the probability that the shoe is brown increases, and vice versa.
For (a), my initial thought is that it would be $\text{Normal}(\mu = 7.5, 2\sigma^2)$, as I think the variances will add. However, I'm not quite sure whether this is the correct mean.
For part (b), I'm really confused and would appreciate any detailed help. Many thanks!
a) Your approach is unfortunately incorrect: the mixture of two Gaussians is not a Gaussian. This is in stark contrast to the fact that the sum of two (jointly) Gaussian random variables is a Gaussian.
Here's the idea: each pair of shoes is either black or brown; it can't be fractionally black and brown. If it could be, the weight would be a sum of two Gaussian random variables in some proportion, and the approach you're using would apply. Here, though, the idea is to just use the law of total probability: letting $W$ be the weight, $$ f_W(w) = P(\textrm{Brown}) \, f_{W|\textrm{Brown}}(w) + P(\textrm{Black}) \, f_{W|\textrm{Black}}(w), $$
which translates to $$W \sim \frac{1}{2}\mathcal{N}(7, \sigma^2) + \frac{1}{2}\mathcal{N}(8,\sigma^2),$$
by which I mean that the pdf $f_W(w) = 0.5 \phi_{7,\sigma^2}(w) + 0.5 \phi_{8,\sigma^2}(w)$ where $\phi_{\mu, \sigma^2}$ is the Gaussian pdf. Note that in general this cannot be simplified further.
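As a quick numerical sketch of this mixture density (the function names here are illustrative, not from the problem), you can check that the equal-weight mixture is a genuine pdf and is symmetric about the midpoint of the two means:

```python
import math

def normal_pdf(w, mu, var):
    """Gaussian density phi_{mu, var}(w)."""
    return math.exp(-(w - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def mixture_pdf(w, var=1.0):
    """Prior predictive density: equal-weight mixture of N(7, var) and N(8, var)."""
    return 0.5 * normal_pdf(w, 7, var) + 0.5 * normal_pdf(w, 8, var)
```

Note that `mixture_pdf` is bimodal for small `var` and unimodal for large `var`, which is one concrete way to see it is not itself a Gaussian.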
b) Conditional probabilities given an observation should always make you think of Bayes' theorem. Here we have to mix densities (for the continuous weight) with discrete probabilities (for the colour).
I'll represent Black and Brown using $c$ for colour. We know $f_{W|\textrm{colour}}(w|c),$ and we're being asked to construct $f_{\textrm{colour}|W}(c |w)$ (which is just a probability, since colour is a discrete variable). Bayes' rule says that $$ P_{\textrm{colour}|W}(\textrm{Brown}|w) = \frac{f_{W|\textrm{colour}} (w | \textrm{Brown}) \times P(\textrm{Brown})}{f_W(w)}.$$ I'll leave the computation itself to you.
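If you want a sanity check after doing the computation by hand, Bayes' rule above can be evaluated directly; this is a minimal sketch with illustrative names, and I deliberately don't print the answer:

```python
import math

def normal_pdf(w, mu, var):
    return math.exp(-(w - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def posterior_brown(w, var=1.0, p_brown=0.5):
    """P(Brown | W = w): Bayes' rule with Gaussian likelihoods and discrete prior."""
    num = normal_pdf(w, 7, var) * p_brown
    den = num + normal_pdf(w, 8, var) * (1 - p_brown)  # this is f_W(w)
    return num / den
```

A useful self-check: by symmetry, `posterior_brown(7.5)` must come out to exactly one half.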
c) Your intuition is correct. It might be useful as an exercise to prove it mathematically, though. In case you want to do so, it may be interesting to study $ \frac{\mathrm{d}\phantom{\sigma^2}}{\mathrm{d} \sigma^2} P_{\textrm{colour}|W}(\textrm{Brown}|w) .$
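You can also just see the trend numerically before attempting the derivative. A small sketch (helper name is mine): with equal priors, the normalising constants cancel, so only the exponents matter, and sweeping the variance shows the posterior sliding monotonically toward $1/2$:

```python
import math

def p_brown_given_7(var):
    """P(Brown | W = 7) as a function of the common variance, equal priors."""
    phi = lambda mu: math.exp(-(7.0 - mu) ** 2 / (2 * var))  # unnormalised likelihood
    return phi(7) / (phi(7) + phi(8))

for var in [0.25, 1.0, 4.0, 100.0]:
    print(f"sigma^2 = {var:6.2f}  ->  P(Brown | W=7) = {p_brown_given_7(var):.4f}")
```

Small $\sigma$ makes the two components nearly disjoint, so a 7 lb shoe is almost certainly brown; large $\sigma$ makes the two likelihoods nearly equal, so the posterior falls back to the prior of $1/2$.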
For practice's sake, you might want to re-compute the mean and variance of $W$. In particular, the variance is not $2\sigma^2$. (In fact, it isn't $2\sigma^2$ even for the sum-of-Gaussians approach you were taking, so try that case again too.)
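If you'd like to check your re-computation without being given the closed form, a Monte Carlo simulation of the mixture (an illustrative sketch, with $\sigma = 1$) will show that the empirical variance is noticeably different from $2\sigma^2 = 2$:

```python
import random
import statistics

random.seed(0)
sigma = 1.0

# Simulate the mixture: pick a colour fairly, then draw the weight from that colour's Gaussian.
samples = [random.gauss(7 if random.random() < 0.5 else 8, sigma) for _ in range(200_000)]

print("empirical mean:    ", statistics.fmean(samples))
print("empirical variance:", statistics.variance(samples))  # compare against the 2*sigma**2 = 2 you proposed
```

Your value of $7.5$ for the mean should survive this check; the variance will not match $2\sigma^2$.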