Limiting distribution of $Y_n=X_n1(|X_n|\le 1-\frac{1}{n})+n1(|X_n|>1-\frac{1}{n})$ where $X_n\sim \mathrm{Unif}(-1,1)$ are i.i.d.
Looking at the expression, as $n$ goes to infinity $Y_n$ should behave like $X_n$, so it should have the same limiting distribution as $X_n$.
$\displaystyle\psi_{Y_n}(t)=E[e^{itY_n}]=E[e^{itX_n1(|X_n|\le 1-\frac{1}{n})}]E[e^{itn1(|X_n|>1-\frac{1}{n})}]=\left(\int_{-1+1/n}^{1-1/n}e^{itx}\frac{1}{2}dx\right)\left(\int_{-1}^{-1+1/n}e^{itn}\frac{1}{2}dx+\int_{1-1/n}^1\frac{1}{2}e^{itn}dx\right)=\frac{\exp\left(it(1-\frac{1}{n}+n)\right)-\exp\left(-it(1-\frac{1}{n}-n)\right)}{2itn}$
So I get $Y_n\sim Unif(1-\frac{1}{n}-n, 1-\frac{1}{n}+n)$
My problem is that as $n\rightarrow\infty$ this characteristic function doesn't converge; instead, it diverges to infinity.
Assuming that you mean $$Y_n=X_n1(|X_n|\le 1-\tfrac{1}{n})+n1(|X_n|>1-\tfrac{1}{n}),$$ one certainly does not get that $Y_n\sim\mathrm{Unif}(1-\frac{1}{n}-n, 1-\frac{1}{n}+n)$. Rather, your own computations yield $$ \psi_{Y_n}(t)=\frac1t\sin(a_nt)+\frac1n\mathrm e^{\mathrm itn},\qquad a_n=1-\frac1n. $$ When $n\to\infty$, $a_n\to1$ and $\frac1n\to0$, hence $$ \lim_{n\to\infty}\psi_{Y_n}(t)=\frac{\sin(t)}{t}=\psi_Y(t), $$ where $Y\sim\mathrm{Unif}(-1,1)$ and the conclusion follows.
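As a sanity check, one can compare the closed form $\psi_{Y_n}(t)=\frac{\sin(a_nt)}{t}+\frac{1}{n}e^{itn}$ with a Monte Carlo estimate of $E[e^{itY_n}]$. A minimal sketch, assuming NumPy (the names `psi_formula` and `psi_mc` are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def psi_formula(t, n):
    # Closed form derived above: sin(a_n t)/t + e^{itn}/n with a_n = 1 - 1/n
    a = 1.0 - 1.0 / n
    return np.sin(a * t) / t + np.exp(1j * t * n) / n

def psi_mc(t, n, samples=1_000_000):
    # Monte Carlo estimate of E[exp(it Y_n)] from a Unif(-1,1) sample
    x = rng.uniform(-1.0, 1.0, size=samples)
    a = 1.0 - 1.0 / n
    y = np.where(np.abs(x) <= a, x, float(n))  # Y_n = X on {|X|<=a_n}, else n
    return np.mean(np.exp(1j * t * y))

for n in (5, 50):
    for t in (0.5, 2.0):
        print(n, t, psi_formula(t, n), psi_mc(t, n))
```

The two values agree up to Monte Carlo error, and for large $n$ both approach $\frac{\sin t}{t}$, the characteristic function of $\mathrm{Unif}(-1,1)$.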
As was to be expected, the independence property is irrelevant.
Edit: More direct arguments exist (but the one above follows yours as closely as possible). For example, consider a random variable $X\sim\mathrm{Unif}(-1,1)$; then $Y_n$ coincides in distribution with $$ Z_n=X1(|X|\le 1-\tfrac{1}{n})+n1(|X|>1-\tfrac{1}{n}). $$ For every $\omega$ with $|X(\omega)|<1$, one has $Z_n(\omega)=X(\omega)$ as soon as $n\ge\frac{1}{1-|X(\omega)|}$, hence $Z_n\to X$ almost surely, hence $Z_n\to X$ in distribution, hence $Y_n\to X$ in distribution.
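The almost-sure argument is also easy to see numerically: $Z_n$ differs from $X$ exactly on the event $\{|X|>1-\frac1n\}$, whose probability $\frac1n$ vanishes. A small simulation sketch, assuming NumPy (the helper `z` and the sample size are my own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=200_000)  # one fixed sample of X

def z(n):
    # Z_n = X on {|X| <= 1 - 1/n}, and n on the complementary event
    return np.where(np.abs(x) <= 1.0 - 1.0 / n, x, float(n))

# Fraction of sample points where Z_n != X; should be close to P(|X| > 1 - 1/n) = 1/n
for n in (10, 100, 1000):
    print(n, np.mean(z(n) != x))
```

Each fixed sample point eventually satisfies $|X|\le 1-\frac1n$, so the mismatch fraction decays like $\frac1n$, mirroring the pointwise convergence $Z_n\to X$.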