Given a binary vector $\mathbf{v} \in \{0,1\}^N$ whose binary-to-decimal conversion is equal to $j$, is there a way to linearly map $\mathbf{v}$ to a one-hot binary vector $\mathbf{e}_j \in \{0,1\}^{2^N}$ whose $(j+1)$-th element is $1$, i.e., the element at index $j$ when counting from zero?
For example, if $\mathbf{v} = [1 \quad 0]$ (which represents $2$ in decimal), how can $\mathbf{v}$ be linearly mapped to $\mathbf{e}_2$, that is, $\mathbf{e}_2^\mathrm{T} = [0\quad 0 \quad 1 \quad 0]$?
The use case is in the objective function of an ILP.
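For concreteness, here is a small Python sketch of the (nonlinear) mapping being asked about, treating the first component of $\mathbf{v}$ as the most significant bit, as in the example above:

```python
def to_one_hot(v):
    """Map a binary vector v (MSB first) to the one-hot vector e of length 2**len(v)."""
    j = int("".join(str(bit) for bit in v), 2)  # binary-to-decimal conversion
    e = [0] * (2 ** len(v))
    e[j] = 1  # the element at index j (counting from zero) is set to 1
    return e

print(to_one_hot([1, 0]))  # [0, 0, 1, 0]
```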
If by "linearly map" you mean a real matrix $H$ such that $\mathbf{e} = H \mathbf{v}$ yields what you want, the answer is no, because a linear map must be additive but the one-hot encoding is not. If $$\left(\begin{array}{c} 0\\ 0\\ 1\\ 0 \end{array}\right)=H \left( \begin{array}{c} 1\\ 0 \end{array}\right)$$ (encoding of 2) and $$\left(\begin{array}{c} 0\\ 1\\ 0\\ 0 \end{array}\right)=H \left( \begin{array}{c} 0\\ 1 \end{array}\right)$$ (encoding of 1), then what you get for the encoding of 3 is$$\left(\begin{array}{c} 0\\ 1\\ 1\\ 0 \end{array}\right)=H \left( \begin{array}{c} 1\\ 1 \end{array}\right) = H \left( \begin{array}{c} 1\\ 0 \end{array}\right) + H \left( \begin{array}{c} 0\\ 1 \end{array}\right),$$whereas you want$$\left( \begin{array}{c} 0\\ 0\\ 0\\ 1 \end{array}\right).$$
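The argument above can be checked numerically. The two columns of any candidate $H$ are the images of the standard basis vectors, which must be the one-hot encodings of $1$ and $2$ (in some order, depending on the bit convention); linearity then forces the image of $(1,1)^\mathrm{T}$:

```python
# Columns of H: images of the two standard basis vectors, which must be
# the one-hot encodings of 2 and 1 (order depends on the bit convention).
col_a = [0, 0, 1, 0]  # one-hot encoding of 2
col_b = [0, 1, 0, 0]  # one-hot encoding of 1

# By linearity, H applied to (1, 1) -- the input encoding 3 -- is the column sum.
h_of_3 = [a + b for a, b in zip(col_a, col_b)]
print(h_of_3)  # [0, 1, 1, 0], not the desired one-hot [0, 0, 0, 1]
```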
That said, the conversion from $\mathbf{v}$ to $\mathbf{e}$ can be handled by linear constraints. Treating the components of both vectors as binary variables, you just need$$\sum_{i=1}^{2^N}e_i = 1$$and $$\sum_{i=1}^{2^N} (i-1) \cdot e_i = \sum_{j=1}^N 2^{N-j} \cdot v_j.$$The first constraint forces exactly one component of $\mathbf{e}$ to be $1$ (an inequality $\le 1$ would not work: when $\mathbf{v}$ encodes $0$, the all-zero $\mathbf{e}$ would also be feasible), and the second forces that component to sit at index $\sum_{j} 2^{N-j} v_j$, counting from zero.
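A brute-force check of this formulation (a sketch using $\sum_i e_i = 1$ with zero-based weights $i-1$, so that the value $0$ also receives a one-hot encoding), enumerating all binary assignments for $N = 2$:

```python
from itertools import product

N = 2

def value(v):
    """Decimal value of v, first component taken as the most significant bit."""
    return sum(2 ** (N - j) * v[j - 1] for j in range(1, N + 1))

def feasible(v, e):
    """Check the two linear constraints linking v and e."""
    one_hot = sum(e) == 1                                               # exactly one e_i is 1
    position = sum((i - 1) * e[i - 1] for i in range(1, 2 ** N + 1)) == value(v)
    return one_hot and position

# For every v there is exactly one feasible e, namely the desired one-hot vector.
for v in product([0, 1], repeat=N):
    sols = [e for e in product([0, 1], repeat=2 ** N) if feasible(v, e)]
    assert len(sols) == 1 and sols[0][value(v)] == 1
print("all encodings verified")
```

In an actual ILP the same two constraints would simply be added to the model, with the components of $\mathbf{e}$ declared as binary variables that the objective function can then use.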