Since every vector space has a basis, what coordinate system or basis is used when describing the basis vectors themselves?
For example: are the basis vectors in $B=\{ (1,2),(3,5)\}$ defined in terms of $\hat i$ and $\hat j$? As in:
$b_1=1 \cdot\hat i+2\cdot \hat j$
$b_2=3 \cdot\hat i+5\cdot \hat j$
Let $V$ be a finite-dimensional vector space over a field $F$, and let $B = \{\hat v_1,\dots,\hat v_n\}$ be a basis for that vector space.
What you are saying is true: if $k \in V$, then we can write $k$ as
$$k = (k_1,...,k_n) :=\sum_{i=1}^n k_i \cdot \hat v_i$$
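As a quick numerical sketch of this expansion (using NumPy, with the basis $B=\{(1,2),(3,5)\}$ from the question and an arbitrary vector $k=(7,12)$ chosen purely for illustration):

```python
import numpy as np

# Columns are the basis vectors from the question, written in the
# standard (i-hat, j-hat) coordinates: b1 = (1, 2), b2 = (3, 5).
B = np.array([[1.0, 3.0],
              [2.0, 5.0]])

# An arbitrary vector k, also written in standard coordinates.
k = np.array([7.0, 12.0])

# Its coordinates (k1, k2) with respect to B solve B @ coords = k.
coords = np.linalg.solve(B, k)   # here coords == (1, 2)

# Reconstructing k as k1*b1 + k2*b2 recovers the original vector.
reconstructed = coords[0] * B[:, 0] + coords[1] * B[:, 1]
```

The same vector $k$ thus has different coordinate tuples depending on which basis we read it in: $(7,12)$ in the standard basis, $(1,2)$ in $B$.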
But for any $j \in \{1,\dots,n\}$ we also have $\hat v_j \in V$, so we can write it as:
$$\hat v_j = 0 \cdot \hat v_1\ +\ ...\ +\ 1 \cdot \hat v_j \ +\ ... \ +\ 0 \cdot \hat v_n$$
Or, if you are familiar with the Kronecker delta:
$$\hat v_j = \sum_{i=1}^n \delta_{ij}\cdot \hat v_i$$
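We can verify this Kronecker-delta identity numerically: asking for the coordinates of a basis vector *in its own basis* always returns a standard unit vector (again using the basis from the question as an example):

```python
import numpy as np

# Columns are the basis vectors b1 = (1, 2) and b2 = (3, 5).
B = np.array([[1.0, 3.0],
              [2.0, 5.0]])

# Coordinates of b2 in the basis B itself: solve B @ x = b2.
x = np.linalg.solve(B, B[:, 1])
# x == (0, 1), i.e. the Kronecker-delta coordinates delta_{i2}.
```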
So this means that the basis vectors are defined in terms... of themselves!? This seems like a paradox: it is the equivalent of explaining to someone what a computer is by saying "a computer is a computer", except that here we are saying, in mathematical language, "$\hat v_j = 1 \cdot \hat v_j$".

This gives us an idea of how fundamental the concept of a basis is: there is no basis for the basis of a vector space; the basis vectors are their own basis. Bases are so fundamental that we simply define them to be the way they are, much as we do with constants or axioms. There is nothing more "simple" or "fundamental" that we can use to express the basis vectors.

Of course, if we perform a change of basis we can express the new basis in terms of the old one, but in the vector space where the new basis "lives", that basis will always have coordinates $(1,0,0)$ or $(0,0,0,1,0,0)$ or whatever.
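The change-of-basis remark can be sketched the same way. Below, $B$ is the basis from the question and $C$ is a second basis I made up for illustration: each new basis vector has nontrivial coordinates in the old basis, yet trivial (unit-vector) coordinates in its own basis.

```python
import numpy as np

# "Old" basis B and "new" basis C, both written in standard coordinates.
B = np.array([[1.0, 3.0],
              [2.0, 5.0]])
C = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# The new basis vectors expressed in terms of the old basis:
# solve B @ X = C, column by column.
C_in_B = np.linalg.solve(B, C)

# But in the coordinate system that C itself defines, each c_j is
# just a standard unit vector: solve C @ x = c1 gives (1, 0).
c1_in_C = np.linalg.solve(C, C[:, 0])
```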