Given a sequence of i.i.d. random variables $X_1, X_2, \dots$ with mean $\mu$ and finite variance $\sigma^2$, the CLT gives $\sqrt{n}(\bar X_n - \mu) \rightarrow^d N(0, \sigma^2)$ as $n \rightarrow \infty$.
Let $g$ be a function that is differentiable at $\mu$ with $g'(\mu) \neq 0$. By the delta method, one can assert that $\sqrt{n}\,(g(\bar X_n) - g(\mu)) \rightarrow^d N\left(0, \; g'(\mu)^2 \sigma^2\right)$.
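For intuition, here is a minimal simulation sketch of the delta method. The Exponential(1) data and the choice $g(x) = x^2$ are illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (illustrative): X_i ~ Exponential(1), so mu = 1
# and sigma^2 = 1, with g(x) = x^2 and hence g'(mu) = 2.
n, reps = 2_000, 5_000
mu, sigma2 = 1.0, 1.0

# Sample means over n observations, replicated reps times.
xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
stat = np.sqrt(n) * (xbar**2 - mu**2)

# Delta method prediction: stat ~ N(0, g'(mu)^2 * sigma^2) = N(0, 4).
print(stat.mean())  # should be near 0
print(stat.var())   # should be near 4
```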
However, consider the linear transformation $a \sqrt{n}(\bar X_n - \mu) + b$, where $a, b > 0$ are constants.
How does this affect the mean of the limiting normal distribution? That is, what do the mean and variance become in $N(?, \; ?)$ for the limit of $a \sqrt{n}(\bar X_n - \mu) + b$?
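One way to probe this empirically is to simulate the transformed statistic and inspect its moments. The constants $a = 2$, $b = 3$ and the standard-normal data below are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical constants (illustrative only): a = 2, b = 3,
# with X_i ~ N(mu, sigma^2), mu = 0, sigma = 1.
a, b = 2.0, 3.0
mu, sigma = 0.0, 1.0
n, reps = 2_000, 5_000

xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
stat = a * np.sqrt(n) * (xbar - mu) + b

# If Z ~ N(0, sigma^2) denotes the limit of sqrt(n)*(xbar - mu), then
# a*Z + b is N(b, a^2 * sigma^2) by the usual linear-transformation rule;
# the empirical moments can be compared against that.
print(stat.mean())  # compare against b == 3
print(stat.var())   # compare against a**2 * sigma**2 == 4
```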