Say that, given an estimator $\hat{T}$ of a statistic $T$, we have $\sqrt{n}(\hat{T}-T) \overset{\text{D}}{\rightarrow} \mathcal{N}(0,V)$, where $n$ is the sample size and $V$ is the asymptotic variance. Now consider $-\hat{T}$: does the analogous asymptotic result still hold? If so, under which conditions?
EDIT: more information to make the problem clear. Say that we are interested in the following statistic:
\begin{equation} S = \begin{cases} 1-T & \text{if } x \geq 1/2, \\ T-1 & \text{if } x < 1/2. \end{cases} \end{equation}
We also know that, when $x \neq 1/2$, $\sqrt{n}(\hat{T}-T) \overset{\text{D}}{\rightarrow} \mathcal{N}(0,V)$. Can we say the same for $\hat{S}$ and $S$? The hat denotes the estimator; all the statistics considered depend only on $x$.
If $X_n \overset{D}{\to} X$, then $-X_n \overset{D}{\to} -X$. You can prove this directly from one of the many equivalent definitions of convergence in distribution, or note that it is immediate from the continuous mapping theorem (the map $t \mapsto -t$ is continuous); Slutsky's theorem also works as a heavier hammer.
Thus $\sqrt{n}\bigl((-\hat{T}) - (-T)\bigr) = -\sqrt{n}(\hat{T}-T) \overset{D}{\to} \mathcal{N}(0, V)$, since $-Z$ has the same distribution as $Z$ when $Z \sim \mathcal{N}(0,V)$. For any fixed $x \neq 1/2$, the same argument covers $S$: on each branch, $\hat{S} - S = \pm(\hat{T} - T)$, so $\sqrt{n}(\hat{S}-S) \overset{D}{\to} \mathcal{N}(0, V)$ as well.
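As a quick sanity check, here is a small Monte Carlo sketch (my own illustration, not from the question): take $T$ to be the mean of an Exponential(1) population, so $T = 1$ and $V = 1$, and fix $x \geq 1/2$ so that $S = 1 - T$ and $\hat{S} = 1 - \hat{T}$. The centered, scaled estimator of $S$ should then look approximately $\mathcal{N}(0, 1)$.

```python
# Monte Carlo sketch: check that sqrt(n) * (S_hat - S) is approximately N(0, V).
# Here X ~ Exponential(1), T = E[X] = 1, V = Var(X) = 1, and we take the
# x >= 1/2 branch, so S = 1 - T and S_hat = 1 - T_hat (plug-in estimator).
import random
from statistics import mean, pvariance

random.seed(0)
n, reps = 400, 5000
T = 1.0            # true population mean
S = 1.0 - T        # S on the x >= 1/2 branch

z = []
for _ in range(reps):
    sample = [random.expovariate(1.0) for _ in range(n)]
    T_hat = mean(sample)       # sample mean estimates T
    S_hat = 1.0 - T_hat        # plug-in estimator of S
    z.append(n ** 0.5 * (S_hat - S))

# Both should be close to the asymptotic values 0 and 1.
print(round(mean(z), 2), round(pvariance(z), 2))
```

The sign flip in $\hat{S} - S = -(\hat{T} - T)$ is invisible in the limit because the centered normal is symmetric, which is exactly the point of the answer above.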