Let $E(f_{i}^{n})$ and $E(s_{i}^{n})$ denote the expected first and second order statistics for $n$ draws from the distribution $V_i$, i.e., set $X_{i}^{n}=\{x^1,\dots,x^n \mid x^j \sim V_i \}$, and let $f_{i}^{n}$ be the minimum value of $X_{i}^{n}$ and $s_{i}^{n}$ the second-smallest value.
I would like your help understanding whether there is an $O(1)$ way to compute the above expectations of the first and second order statistics for an arbitrary distribution: should I treat this computation as constant time, or is it $O(n)$ for some distributions?
In general, if the underlying distribution has CDF $F$ and PDF $f$, then the distribution of the $k$th order statistic is
$$P(X_{k,n}\leq x) =n {n-1 \choose k-1} \int_{0}^{F(x)}t^{k-1}(1-t)^{n-k}\, dt,$$
with density
$$f_{k,n}(x) =n {n-1 \choose k-1} F(x)^{k-1}[1-F(x)]^{n-k}f(x).$$
If the underlying distribution is uniform on $[0,1]$, then
$$E(X_{k,n})=\frac{k}{n+1}.$$
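As a sanity check, the density $f_{k,n}$ above can be integrated numerically for any distribution with a known CDF and PDF. A minimal sketch (the helper name `order_stat_mean` is mine, not from the question), verified against the uniform result $k/(n+1)$:

```python
import math
from scipy.integrate import quad

def order_stat_mean(F, f, k, n, lo, hi):
    """E[X_{k,n}] by numerically integrating x * f_{k,n}(x) over [lo, hi]."""
    c = n * math.comb(n - 1, k - 1)
    density = lambda x: c * F(x)**(k - 1) * (1 - F(x))**(n - k) * f(x)
    val, _ = quad(lambda x: x * density(x), lo, hi)
    return val

# Uniform(0,1): F(x) = x, f(x) = 1; the formula predicts k/(n+1).
k, n = 2, 10
est = order_stat_mean(lambda x: x, lambda x: 1.0, k, n, 0.0, 1.0)
# est ≈ k/(n+1) = 2/11
```

Note that once $F$ and $f$ are fixed, this is a single one-dimensional quadrature whose cost does not depend on generating $n$ samples, though the integrand itself involves powers that depend on $n$ and $k$.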
As a less trivial example, suppose the underlying distribution is exponential: $f(x) = e^{-x}.$ Then
$$E(X_{k,n})=n {n-1 \choose k-1}\int_{0}^{\infty}xe^{-x}(1-e^{-x})^{k-1}(e^{-x})^{n-k} \, dx\\=-n {n-1 \choose k-1}\int_{0}^{1}(\log u)(1-u)^{k-1}u^{n-k} \, du,$$
(substituting $u=e^{-x}$), which can be integrated in closed form. This may not be possible for other common distributions, e.g. the normal distribution. Nevertheless, the problem is just evaluating an integral, analytically or numerically, for a known distribution.
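For the exponential case the closed form is a partial harmonic sum, $E(X_{k,n})=\sum_{i=0}^{k-1}\frac{1}{n-i}$ (a standard consequence of the independent-spacings representation of exponential order statistics; this formula is my addition, not stated in the question). A quick numerical check against the integral above:

```python
import math
from scipy.integrate import quad

def exp_order_stat_mean(k, n):
    """E[X_{k,n}] for Exponential(1), by numerical integration."""
    c = n * math.comb(n - 1, k - 1)
    integrand = lambda x: c * x * math.exp(-x) \
        * (1 - math.exp(-x))**(k - 1) * math.exp(-(n - k) * x)
    val, _ = quad(integrand, 0, math.inf)
    return val

k, n = 3, 10
numeric = exp_order_stat_mean(k, n)
closed = sum(1.0 / (n - i) for i in range(k))  # 1/10 + 1/9 + 1/8
# numeric ≈ closed ≈ 0.3361
```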
Otherwise, Monte Carlo simulation is necessary: the estimator for the first order statistic is the smallest observation in each simulated sample. The error can only be bounded probabilistically, using a tolerance interval constructed from estimates of higher order statistics. Low order statistics require a very large number of samples to estimate with a reasonable level of confidence.
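A minimal Monte Carlo sketch of the estimators just described, using Exponential(1) so the estimates can be compared with the known values $E(f^n)=1/n$ and $E(s^n)=1/n+1/(n-1)$ (the function name `mc_first_second` and the choice of distribution are mine, for illustration):

```python
import random

def mc_first_second(n, reps, rng=random.Random(0)):
    """Estimate E[min] and E[2nd smallest] of n Exponential(1) draws
    by averaging over `reps` simulated samples."""
    tot1 = tot2 = 0.0
    for _ in range(reps):
        draws = sorted(rng.expovariate(1.0) for _ in range(n))
        tot1 += draws[0]   # first order statistic
        tot2 += draws[1]   # second order statistic
    return tot1 / reps, tot2 / reps

e1, e2 = mc_first_second(n=10, reps=100_000)
# e1 ≈ 1/10, e2 ≈ 1/10 + 1/9
```

Note that each replication costs at least $O(n)$ (here $O(n\log n)$ from the sort, though a partial selection would suffice), which is the sense in which the sampling approach is not constant time in $n$.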