I have code that evaluates the quadratic form $$ b^TAb=c $$ where $b$ is a vector, producing a scalar $c$.
However, the construction of the square matrix $A$ is very expensive, so I wonder whether it is possible to compute the matrix $A$ given both $b$ and the constant $c$, to reduce the computation time. In my case I can approximate the value of $c$ beforehand.
One sufficient condition is $$ b^TA=\frac{c}{|b|^2}b^T, $$ since then $b^TAb=\frac{c}{|b|^2}b^Tb=c$. One very trivial solution is therefore the diagonal matrix $A$ with $\dfrac{c}{|b|^2}$ as its diagonal elements. I do not know how to obtain the general solution. What techniques should be used in solving for the matrix $A$, and what does the complete solution set look like?
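As a quick sanity check, the trivial scaled-identity solution can be verified numerically (NumPy sketch; the vector `b` and scalar `c` below are made-up example values, not from the actual problem):

```python
import numpy as np

# Made-up example values
b = np.array([1.0, 2.0, 3.0])
c = 5.0

# Trivial solution: A = (c / |b|^2) * I
A = (c / np.dot(b, b)) * np.eye(len(b))

result = b @ A @ b  # should recover c
print(result)
```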
If you have a single value of $c$ associated with some vector $b$, you do not have enough information to construct $A$.
The quadratic form $b^\intercal A b$ is a sum of products in which the elements of $A$ act as the coefficients. The only thing you can take advantage of is that $A$ can usually be taken to be symmetric, which means you only need to determine $n(n+1)/2$ terms to fully define it.
In the simplest case, with $A$ a 2×2 matrix, you expand out the calculation to get
$$ c = \begin{pmatrix} b_1 \\ b_2 \end{pmatrix}^\top \begin{pmatrix} A_{11} & A_{12} \\ A_{12} & A_{22} \end{pmatrix} \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} $$
$$ c= A_{11} b_1^2 + 2 A_{12} b_1 b_2 + A_{22} b_2^2 $$
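This expansion can be checked numerically (NumPy sketch; the matrix entries and vector components are arbitrary example values):

```python
import numpy as np

# Arbitrary example entries for a symmetric 2x2 matrix and a vector
A11, A12, A22 = 1.5, -0.5, 2.0
b1, b2 = 3.0, 4.0

A = np.array([[A11, A12],
              [A12, A22]])
b = np.array([b1, b2])

lhs = b @ A @ b                                       # quadratic form b^T A b
rhs = A11 * b1**2 + 2 * A12 * b1 * b2 + A22 * b2**2   # expanded formula
print(lhs, rhs)  # the two agree
```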
So you are asking if you can determine $A_{11}$, $A_{12}$ and $A_{22}$ from the above given $c$, $b_1$ and $b_2$?
The answer is no, since you have one linear equation (in the entries of $A$) and three unknowns.
You would need at least three independent conditions (call them A, B, C) to form the following set of equations
$$ \begin{aligned} c^A & = A_{11} (b_1^A)^2 + 2 A_{12} b_1^A b_2^A + A_{22} (b_2^A)^2 \\ c^B & = A_{11} (b_1^B)^2 + 2 A_{12} b_1^B b_2^B + A_{22} (b_2^B)^2 \\ c^C & = A_{11} (b_1^C)^2 + 2 A_{12} b_1^C b_2^C + A_{22} (b_2^C)^2 \\ \end{aligned} $$
to be solved for $A_{11}$, $A_{12}$ and $A_{22}$ as a 3×3 linear system.
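The 3×3 system above can be sketched numerically (NumPy; `A_true` and the three vectors below are made-up values, chosen so the coefficient matrix is invertible):

```python
import numpy as np

# Hypothetical "true" symmetric matrix, used only to generate the three conditions
A_true = np.array([[1.5, -0.5],
                   [-0.5, 2.0]])

# Three vectors b^A, b^B, b^C (example choices giving an invertible system)
bs = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
cs = [b @ A_true @ b for b in bs]

# Each row holds the coefficients of (A11, A12, A22) in
# c = A11 b1^2 + 2 A12 b1 b2 + A22 b2^2
M = np.array([[b[0]**2, 2 * b[0] * b[1], b[1]**2] for b in bs])
A11, A12, A22 = np.linalg.solve(M, cs)
print(A11, A12, A22)  # recovers the entries of A_true
```

If the three vectors are not chosen independently (e.g. all parallel), the coefficient matrix `M` becomes singular and `np.linalg.solve` raises an error, reflecting that the conditions do not determine $A$.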