I would like to know whether there are definitions of consistency and efficiency for vector estimators.
Probably the definition of consistency can be generalized: it would mean that $\forall \varepsilon>0: \lim\limits_{n\to\infty}P\left(\{\omega \in \Omega: ||T_n-\theta||<\varepsilon\}\right)=1$, where $||\cdot||$ is the Euclidean norm, $\theta$ is a $k$-dimensional vector of parameters and $T_n$ is a vector estimator of $\theta$.
But there is a theorem for a one-dimensional parameter $\theta$ which says: Suppose that $E(T^2_n)<\infty$ for each $n\in\mathbb{N}$. If moreover $\lim\limits_{n\to\infty}E(T_n)=\theta$ and $\lim\limits_{n\to\infty}D(T_n)=0$, then the one-dimensional estimator $T_n$ is a consistent estimator of the one-dimensional parameter $\theta$.
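To illustrate what a vector analogue of this theorem would look like, here is a small Monte Carlo sketch (my own illustration, not part of the question): the sample mean of a 2-dimensional normal sample is an unbiased vector estimator whose covariance shrinks like $1/n$, so $P(||T_n-\theta||\geq\varepsilon)$ should go to $0$ as $n$ grows.

```python
import numpy as np

# Sketch: T_n = sample mean as a vector estimator of theta in R^2.
# It is unbiased and its covariance is (1/n) * I, so under the
# conjectured vector version of the theorem it should be consistent:
# P(||T_n - theta|| >= eps) -> 0 as n -> infinity.
rng = np.random.default_rng(0)
theta = np.array([1.0, -2.0])
eps = 0.1
reps = 2000  # Monte Carlo replications per sample size

def miss_prob(n):
    # Monte Carlo estimate of P(||T_n - theta|| >= eps)
    samples = rng.normal(loc=theta, scale=1.0, size=(reps, n, 2))
    T_n = samples.mean(axis=1)                  # vector estimator, shape (reps, 2)
    dist = np.linalg.norm(T_n - theta, axis=1)  # Euclidean norm ||T_n - theta||
    return np.mean(dist >= eps)

probs = [miss_prob(n) for n in (10, 100, 1000)]
print(probs)  # estimated probabilities shrink toward 0 as n grows
```

The decreasing sequence of estimated probabilities is consistent with the conjectured generalization, though of course a simulation is not a proof.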
Is there something similar for vector estimator $T_n$ of vector parameter $\theta$?
Moreover, can something like efficiency be defined for a vector estimator? I know that the Cramér-Rao bound also exists for vector estimators, so perhaps for an unbiased vector estimator the efficiency could be defined as $I(\theta)^{-1}\operatorname{var}(T_n)^{-1}$, but this expression is a matrix. How does it "measure efficiency"?
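As a concrete check of this matrix idea (again my own sketch, with a specific model chosen for illustration): for $X_i \sim N(\theta,\Sigma)$ with $\Sigma$ known, the Fisher information of one observation is $I(\theta)=\Sigma^{-1}$, so the Cramér-Rao bound for $n$ observations is $(nI(\theta))^{-1}=\Sigma/n$. The sample mean attains this bound, so the "efficiency matrix" should be (approximately) the identity.

```python
import numpy as np

# Sketch: X_i ~ N(theta, Sigma), Sigma known.  Fisher information of one
# observation is I(theta) = Sigma^{-1}; the Cramer-Rao bound for n
# observations is (n I(theta))^{-1} = Sigma / n.  The sample mean T_n has
# exactly this covariance, so the empirical "efficiency matrix"
# (n I(theta))^{-1} var(T_n)^{-1} should be close to the 2x2 identity.
rng = np.random.default_rng(1)
theta = np.zeros(2)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
n, reps = 50, 20000

samples = rng.multivariate_normal(theta, Sigma, size=(reps, n))
T_n = samples.mean(axis=1)             # (reps, 2) sample means
var_Tn = np.cov(T_n.T)                 # empirical covariance of T_n
bound = Sigma / n                      # Cramer-Rao bound (n I(theta))^{-1}
eff = bound @ np.linalg.inv(var_Tn)    # "efficiency matrix"
print(np.round(eff, 2))                # approximately the identity matrix
```

So for an efficient unbiased estimator this matrix is the identity; how one orders non-identity matrices (e.g. by the Loewner order, trace, or determinant) is exactly the kind of question the literature would need to answer.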
If something like this exists, can you recommend some literature? I cannot find any relevant source on the internet.
Any help will be appreciated.