Let $V$ be an $n$-dimensional vector space over a field $F$ and $\{x_1,\cdots,x_n\}$ a basis. Let $\beta:V\times V\rightarrow F$ be a non-degenerate symmetric bilinear form. Non-degeneracy implies that there exists a dual basis $\{y_1,\cdots, y_n\}$ of $V$ w.r.t. $\beta$, i.e. $$\beta(x_i,y_j)=\delta_{ij}.$$ Can we write an expression for the basis elements $y_i$ in terms of the $x_i$ and the matrix of $\beta$ w.r.t. the basis $\{x_1,\cdots, x_n\}$?
This is an obvious computational question and may be very trivial, but I couldn't find any book that mentions the computation of a dual basis. As a concrete example, let $V$ be the space of column vectors of length $n$ over the field $\mathbb{R}$, with the dot product as the bilinear form. Given any invertible $n\times n$ matrix $A$ over $\mathbb{R}$, its $n$ columns form a basis of $V$; what is the dual basis w.r.t. the dot product? Perhaps it is the columns of $(A^{-1})^t$; am I right? Then comes the more general question, which I put above.
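The guess can be checked numerically. Indeed, if $B=(A^{-1})^t$, then $A^t B = A^t (A^{-1})^t = (A^{-1}A)^t = I$, and the $(i,j)$ entry of $A^t B$ is exactly the dot product of the $i$-th column of $A$ with the $j$-th column of $B$. A minimal sketch (a random matrix standing in for an arbitrary invertible $A$):

```python
import numpy as np

# Columns of a (generically invertible) random matrix A serve as the basis x_1,...,x_n.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))

# Candidate dual basis w.r.t. the dot product: columns of (A^{-1})^t.
B = np.linalg.inv(A).T

# Matrix of pairings beta(x_i, y_j) = x_i . y_j is A^t B.
gram = A.T @ B

# Duality condition beta(x_i, y_j) = delta_ij holds.
assert np.allclose(gram, np.eye(n))
```

So for the dot product the dual basis is indeed given by the columns of $(A^{-1})^t$.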
Since $\beta : V \times V \to F$ is given, the value $\beta(v,w)$ is known for every input $(v,w) \in V \times V$. So in principle the matrix $[\beta(x_i,x_j)]$ is determined.
Write $y_i = \sum_{k} A_{ik}x_k$; then $$ \delta_{ij} = \beta(x_i,y_j) = \beta \Big(x_i,\sum_{k}A_{jk} x_k\Big) =\sum_{k} A_{jk} \beta(x_i,x_k) = \sum_{k}A_{jk}\beta_{ik} =\sum_k A_{jk}\beta_{ki}, $$ where the last equality uses the symmetry of $\beta$. In matrix notation this reads $$[A_{ik}][\beta_{kj}] = I \implies [A_{ij}]=[\beta_{ij}]^{-1} . $$
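The general recipe, sketched numerically: fix a symmetric non-degenerate form via an invertible symmetric matrix $G$ in the standard basis (here a Minkowski-like diagonal form, chosen only for illustration), take an arbitrary basis as the columns of $X$, form the Gram matrix $\beta_{ij} = \beta(x_i,x_j)$, and build the dual basis with $A = [\beta_{ij}]^{-1}$:

```python
import numpy as np

# A symmetric, non-degenerate (but indefinite) form: beta(v, w) = v^t G w.
n = 4
G = np.diag([1.0, 1.0, 1.0, -1.0])

# An arbitrary basis x_1,...,x_n as the columns of X (generically invertible).
rng = np.random.default_rng(1)
X = rng.standard_normal((n, n))

# Gram matrix of the form in the basis {x_i}: beta_ij = beta(x_i, x_j).
gram = X.T @ G @ X

# Dual basis: y_i = sum_k A_ik x_k with A = gram^{-1}, so column j of Y is y_j.
A = np.linalg.inv(gram)
Y = X @ A.T

# Check the duality condition beta(x_i, y_j) = delta_ij.
assert np.allclose(X.T @ G @ Y, np.eye(n))
```

Note that since $G$ is symmetric, the Gram matrix and hence $A$ are symmetric as well, so $X\,A^t = X\,A$; the transpose is kept only to match the index convention $y_j = \sum_k A_{jk}x_k$ above.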