Suppose we are given two characteristic functions $\phi_1$ and $\phi_2$, and we take a weighted average of them:
$\alpha\phi_1+(1-\alpha)\phi_2$ for any $\alpha\in [0,1]$
Can it be proven that the result is also a characteristic function? If so, I am guessing this result extends to convex combinations of any number of characteristic functions, with weights $\alpha_i\geq 0$ such that $\sum_i\alpha_i=1$.
Secondly, if $\phi$ is a characteristic function, then $\mathfrak{Re}\,\phi(t)=\frac12(\phi(t)+\phi(-t))$ is also a characteristic function. I don't even know how to begin attempting this proof, as I am not sure what the $\mathfrak{Re}$ represents.
Lastly, regarding the symmetry of characteristic functions,
$\phi$ is symmetric about zero iff it is real-valued iff the corresponding distribution is symmetric about zero.
Once again, my lack of familiarity with the complex plane leaves me in the dark here. Why can a complex-valued function not be symmetric about zero?
To prove that these are characteristic functions, it is simplest and most intuitive to use random variables.
In the first case, assume that $\phi_1(t)=\mathrm E(\mathrm e^{itX_1})$ and $\phi_2(t)=\mathrm E(\mathrm e^{itX_2})$ for some random variables $X_1$ and $X_2$ defined on the same probability space and introduce a Bernoulli random variable $A$ such that $\mathrm P(A=1)=\alpha$ and $\mathrm P(A=0)=1-\alpha$, independent of $X_1$ and $X_2$. Consider $Y=X_1\mathbf 1_{A=1}+X_2\mathbf 1_{A=0}$. Then, by independence,
$$\mathrm E(\mathrm e^{itY})=\mathrm P(A=1)\,\mathrm E(\mathrm e^{itX_1})+\mathrm P(A=0)\,\mathrm E(\mathrm e^{itX_2})=\alpha\phi_1(t)+(1-\alpha)\phi_2(t),$$
hence $\alpha\phi_1+(1-\alpha)\phi_2$ is the characteristic function of $Y$.
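Independently of the algebra, this mixture identity can be checked numerically. A minimal Monte Carlo sketch, under the hypothetical choices $X_1\sim N(0,1)$ and $X_2\sim N(2,1)$ (whose characteristic functions are known in closed form), with an illustrative weight $\alpha=0.3$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
alpha = 0.3

# Hypothetical choice of distributions: X1 ~ N(0,1), X2 ~ N(2,1).
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(2.0, 1.0, n)
a = rng.random(n) < alpha          # Bernoulli(alpha), independent of X1, X2
y = np.where(a, x1, x2)            # Y = X1 if A = 1, else X2

t = 0.7
phi1 = np.exp(-t**2 / 2)                      # cf of N(0,1)
phi2 = np.exp(2j * t - t**2 / 2)              # cf of N(2,1)
mixture_cf = alpha * phi1 + (1 - alpha) * phi2

empirical_cf = np.mean(np.exp(1j * t * y))    # Monte Carlo estimate of E[e^{itY}]
err = abs(empirical_cf - mixture_cf)
print(err)
```

With $n=2\times10^5$ samples the Monte Carlo error is of order $1/\sqrt n$, so the empirical characteristic function of $Y$ matches $\alpha\phi_1(t)+(1-\alpha)\phi_2(t)$ to a few decimal places.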
The extension to more than two random variables is direct. Assume that $\phi_k(t)=\mathrm E(\mathrm e^{itX_k})$ for every $k$, for some random variables $X_k$ defined on the same probability space, and introduce an integer-valued random variable $A$ such that $\mathrm P(A=k)=\alpha_k$ for every $k$, independent of $(X_k)_k$. Then $Y=\sum_kX_k\mathbf 1_{A=k}$ satisfies
$$\mathrm E(\mathrm e^{itY})=\sum_k\mathrm P(A=k)\,\mathrm E(\mathrm e^{itX_k})=\sum_k\alpha_k\phi_k(t),$$
so $\sum_k\alpha_k\phi_k$ is the characteristic function of $Y$.
In the second case, assume that $\phi(t)=\mathrm E(\mathrm e^{itX})$ for some random variable $X$ and introduce a random sign $A$ such that $\mathrm P(A=1)=\mathrm P(A=-1)=\frac12$, independent of $X$. Then
$$\mathrm E(\mathrm e^{itAX})=\tfrac12\mathrm E(\mathrm e^{itX})+\tfrac12\mathrm E(\mathrm e^{-itX})=\tfrac12\bigl(\phi(t)+\phi(-t)\bigr)=\mathfrak{Re}\,\phi(t),$$
where the last equality holds because $\phi(-t)=\overline{\phi(t)}$. Thus $\mathfrak{Re}\,\phi$, the real part of $\phi$, is the characteristic function of $AX$.
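This symmetrization can also be checked numerically. A minimal sketch, under the hypothetical choice $X\sim\mathrm{Exp}(1)$ with $\phi(t)=1/(1-it)$: then $AX$ is Laplace-distributed and its characteristic function should equal $\mathfrak{Re}\,\phi(t)=1/(1+t^2)$, which is real.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical choice: X ~ Exp(1), with cf phi(t) = 1/(1 - it).
x = rng.exponential(1.0, n)
a = rng.choice([-1, 1], n)        # random sign, independent of X
y = a * x                         # Y = AX is Laplace(0, 1)

t = 1.3
re_phi = (1.0 / (1 - 1j * t)).real            # Re phi(t) = 1/(1 + t^2)
empirical = np.mean(np.exp(1j * t * y))       # Monte Carlo estimate of E[e^{itAX}]

err_real = abs(empirical.real - re_phi)
err_imag = abs(empirical.imag)                # imaginary part should vanish
print(err_real, err_imag)
```

The vanishing imaginary part illustrates the third question as well: the symmetrized variable $AX$ has a distribution symmetric about zero, and its characteristic function is real-valued.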