The exponential distribution family is defined by a PDF of the form: $$ f_X(x;\theta) = c(\theta) g(x) \exp \left[\sum_{j=1}^l G_j(\theta) T_j(x)\right],$$ where $\theta \in \Theta$, $c(\theta)>0$ and the $G_j(\theta)$ are arbitrary functions of $\theta$, and $g(x)>0$ and the $T_j(x)$ are arbitrary functions of $x$.
This seems very complicated to an untrained eye and honestly, I don't think I understand it.
I researched the mighty internet and found a simplified form: $$ f(x) = \exp\left[\frac{x\theta-b(\theta)}{a(\Phi)}+c(x,\Phi)\right],$$ which seems much more beginner-friendly.
However, there is an extension to the first exponential family PDF definition: applying the factorization theorem to the joint PDF $f_{\mathbf{X}}(\mathbf{x};\theta)$ yields $$ \mathbf{T}(\mathbf{X}) = \left(\sum_{i=1}^n T_1(X_i),\;\dots\;,\sum_{i=1}^n T_l(X_i)\right),$$ which is a sufficient statistic for $G_1(\theta),\dots,G_l(\theta)$.
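To convince myself of this, I wrote a small numerical check (my own sketch, not from any textbook): for the Poisson case the sufficient statistic is $\sum_i X_i$, so two samples with the same sum should have likelihoods that differ only by a factor free of $\lambda$.

```python
import math

def poisson_loglik(data, lam):
    """Log-likelihood of an i.i.d. Poisson(lam) sample."""
    return sum(x * math.log(lam) - lam - math.log(math.factorial(x)) for x in data)

# Two samples with the same value of the sufficient statistic sum(x_i) = 10
a = [1, 2, 3, 4]
b = [0, 0, 5, 5]

# The log-likelihood difference should be the same for every lambda,
# i.e. the likelihood ratio does not depend on the parameter.
diffs = [poisson_loglik(a, lam) - poisson_loglik(b, lam) for lam in (0.5, 1.0, 2.0, 7.3)]
print(diffs)
```

The differences are all equal, so the data enter the likelihood only through $\sum_i x_i$, as the factorization theorem says.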
How can the sufficient statistic be obtained from the simplified version of the exponential family form?
How can variance and mean be calculated from the first definition of the exponential family form?
EXAMPLE:
Prove that Poisson distribution belongs to the exponential family.
Question: How can $\mathbb{E}[X]$ and $\operatorname{Var}[X]$ be calculated here?
I was able to do this with the simplified form that I found online, as follows:
$$ f(x)= \frac{\lambda^xe^{-\lambda}}{x!}$$ and, writing $f(x)=\exp[\log f(x)]$, one gets: $$ f(x)= \exp\left[x \log(\lambda) - \lambda - \log(x!)\right]$$ Matching this expression to the simplified form of the exponential family, we get: $\theta = \log(\lambda)$
$\lambda = \exp(\theta)$
$b(\theta)=e^{\theta}$
$a(\Phi) = 1$ (the dispersion is always 1 for one-parameter distributions)
$c(x, \Phi)= -\log(x!)$
Then one can easily get:
$$\begin{align}\mathbb{E}[X]&= b'(\theta) = e^{\theta} = \lambda\\ \operatorname{Var}[X] &= a(\Phi)b''(\theta)= e^{\theta} = \lambda\end{align}$$
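As a numerical sanity check (my own sketch), the relations $\mathbb{E}[X]=b'(\theta)$ and $\operatorname{Var}[X]=a(\Phi)\,b''(\theta)$ can be confirmed by finite differences on $b(\theta)=e^{\theta}$, which should both come out to $\lambda$:

```python
import math

lam = 3.0
theta = math.log(lam)   # canonical parameter theta = log(lambda)
b = math.exp            # b(theta) = e^theta for the Poisson

h = 1e-4
# central finite differences for b'(theta) and b''(theta)
b1 = (b(theta + h) - b(theta - h)) / (2 * h)
b2 = (b(theta + h) - 2 * b(theta) + b(theta - h)) / h**2

print(b1, b2)  # both should be close to lam
```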
How can the sufficient statistics be determined from this simplified form?

First question
Inspecting the definition of the exponential family $$ f_X(x;\theta) = c(\theta) g(x) e^{ \sum_{j=1}^l G_j(\theta) T_j(x) }, $$ one can say the following:
$T$ is a sufficient statistic. Conditional on $T$, the distribution of the data is $g(x)$ (up to a normalization constant), which is independent of the parameter $\theta$; this is the definition of sufficiency. In fact, for the exponential family it is independent of $T$ as well.
The term $e^{ \sum_{j=1}^l G_j(\theta) T_j(x) }$ determines the marginal distribution of $T$, via the choice of $G_j$'s.
$c(\theta)$ is a normalization constant so the density integrates to $1$.
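To make the last point concrete (a sketch for the Poisson case, with my own choice of $\lambda$): here $g(x)=1/x!$, $G(\theta)=\log\lambda$, $T(x)=x$, and $c(\theta)$ must be exactly the factor that makes the mass sum to one, namely $e^{-\lambda}$.

```python
import math

lam = 1.7

def g(x):
    return 1 / math.factorial(x)  # base measure g(x)

G = math.log(lam)                 # natural parameter G(theta)

# Sum g(x) * exp(G(theta) * T(x)) over the support (truncated; terms decay fast)
total_without_c = sum(g(x) * math.exp(G * x) for x in range(100))

c = 1 / total_without_c
print(c, math.exp(-lam))  # the recovered normalizer should match e^{-lambda}
```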
Second question
As the $G_j$'s are arbitrary, subject to measurability requirements etc., there is no general formula for computing moments. For the Poisson distribution, the first moment is simply $$ e^{-\lambda} \sum_{k = 0}^{\infty} k \frac{\lambda^k}{k!} = \lambda \left( e^{-\lambda} \sum_{k = 1}^{\infty} \frac{\lambda^{k-1} }{(k-1)!} \right) = \lambda e^{-\lambda} e^{\lambda} = \lambda. $$
The second moment is similar: writing $k^2 = k(k-1) + k$ gives $\mathbb{E}[X^2] = \lambda^2 + \lambda$, hence $\operatorname{Var}[X] = \lambda$.