Let $V={\mathbb R}^d$ and
$$ A=\big\lbrace (a,b) \in V \times V \ \big\vert \ a \ \text{and} \ b \ \text{are linearly independent} \big\rbrace $$
Consider the maps $f:A \to {\mathbb R}$ satisfying the following three properties:
$$ f(a,b_1+b_2)=f(a,b_1)+f(a,b_2) \ \text{whenever} \ (a,b_1),(a,b_2),(a,b_1+b_2)\in A \tag{1} $$
$$ f(a_1+a_2,b)=f(a_1,b)+f(a_2,b) \ \text{whenever} \ (a_1,b),(a_2,b),(a_1+a_2,b)\in A \tag{2} $$
$$ f(\lambda a,\mu b)=\lambda \mu f(a,b) \ \text{whenever} \ \lambda\mu \neq 0, \ (a,b)\in A \tag{3} $$
Can $f$ always be extended to a bilinear map $V \times V \to {\mathbb R}$ ?
The answer is YES (we assume $d\geq 2$; otherwise $A$ is empty and there is nothing to prove). Indeed, let ${\cal U}=(u_1,u_2, \ldots ,u_d)$ be a basis of $V$, and let ${\cal V}=(v_1,v_2, \ldots ,v_d)$ be another basis of $V$ such that $(u_i,v_j) \in A$ for all $i,j$. For example, when $d\geq 3$ we could take $v_j=S-u_j$ where $$S=\sum_{k=1}^d u_k,$$ since then $v_j=\sum_{k\neq j}u_k$ has at least two nonzero coordinates in $\cal U$; for $d=2$ we can take, say, $v_1=u_1+2u_2$ and $v_2=2u_1+u_2$.
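As a quick sanity check of this construction (an illustration only; the helper names `collinear` and `det` are ours), the following verifies for $d=4$ that the vectors $v_j=S-u_j$ form a basis and that every pair $(u_i,v_j)$ lies in $A$. Exact integer arithmetic makes the collinearity test reliable: two vectors are collinear iff all their $2\times 2$ minors vanish.

```python
def collinear(a, b):
    # Two vectors are collinear iff every 2x2 minor vanishes.
    n = len(a)
    return all(a[i]*b[j] - a[j]*b[i] == 0 for i in range(n) for j in range(i+1, n))

def det(m):
    # Determinant by cofactor expansion along the first row (fine for small d).
    if len(m) == 1:
        return m[0][0]
    return sum((-1)**j * m[0][j] * det([row[:j] + row[j+1:] for row in m[1:]])
               for j in range(len(m)))

d = 4
u = [[1 if k == i else 0 for k in range(d)] for i in range(d)]  # standard basis
S = [sum(col) for col in zip(*u)]                               # S = u_1 + ... + u_d
v = [[S[k] - u[j][k] for k in range(d)] for j in range(d)]      # v_j = S - u_j

assert det(v) != 0  # (v_1, ..., v_d) is again a basis
assert all(not collinear(u[i], v[j]) for i in range(d) for j in range(d))
print("all pairs (u_i, v_j) independent for d =", d)
```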
There is a unique bilinear map $h: V \times V \to {\mathbb R}$ such that $h(u_i,v_j)=f(u_i,v_j)$ for all $i,j$. So all we need to show is that $f$ coincides with $h$ everywhere on $A$. Let $(u,v)\in A$; we are going to show that $f(u,v)=h(u,v)$. We first treat the case where $u$ is not collinear to any of the $v_k$.
Lemma. Let $(u,v)\in A$ be such that $u$ is not collinear to any of the $v_k$. Decompose $v$ in $\cal V$: $v=\sum_{k=1}^d t_kv_k$, where $t_1,t_2, \ldots,t_d$ are real numbers. Then there is a permutation $\sigma$ of $\lbrace 1,2, \ldots, d\rbrace$ such that all the partial sums $\sum_{k=1}^{j}t_{\sigma (k)}v_{\sigma (k)}$ (for $1\leq j \leq d$) are either zero or not collinear to $u$.
Proof of lemma. If the identity permutation does not work, then $\sum_{k=1}^{j}t_{k}v_{k}$ is a nonzero multiple of $u$ for some $j$; let $j_0$ be the largest such index. Write $P_j=\sum_{k=1}^{j}t_kv_k$ (with $P_0=0$), so $P_{j_0}$ is a nonzero multiple of $u$. Since $v=P_d$ is not a multiple of $u$, we have $j_0<d$ and $P_d \neq P_{j_0}$, so there is a smallest index $j_1>j_0$ with $t_{j_1}\neq 0$. Let $\tau$ be the unique permutation of $\lbrace 1,2, \ldots, d\rbrace$ such that $\tau(1)=j_1$ and $\tau$ is increasing on $\lbrace 2,3, \ldots, d\rbrace$. We claim that $\sigma=\tau$ works. Put $Q_j=\sum_{k=1}^{j}t_{\tau(k)}v_{\tau(k)}$. For $j>j_1$ we have $Q_j=P_j$, which is not a nonzero multiple of $u$ by the maximality of $j_0$. For $j\leq j_1$ we have $Q_j=t_{j_1}v_{j_1}+P_{j-1}$. If $P_{j-1}$ is zero or a multiple of $u$ (as happens in particular for $j_0\leq j-1<j_1$, where $P_{j-1}=P_{j_0}$), then $Q_j$ is not collinear to $u$, because $t_{j_1}v_{j_1}$ is not. Otherwise $j-1<j_0$, and if we had $Q_j=cu$ with $c\neq 0$, then, since $u$ is a multiple of $P_{j_0}$, the vector $t_{j_1}v_{j_1}=cu-P_{j-1}$ would lie in the span of $v_1,\ldots,v_{j_0}$, contradicting the linear independence of the $v_k$ (recall that $j_1>j_0$ and $t_{j_1}\neq 0$).
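The reordering in the proof can be sketched in code; following the proof, we pick the largest failing index $j_0$ and then the first index $j_1>j_0$ with a nonzero coefficient (the helper names `reorder`, `collinear`, `bad` are ours, and indices are 0-based):

```python
def collinear(a, b):
    # Two vectors are collinear iff every 2x2 minor vanishes (exact for ints).
    n = len(a)
    return all(a[i]*b[j] - a[j]*b[i] == 0 for i in range(n) for j in range(i+1, n))

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def scale(t, a):
    return [t*x for x in a]

def bad(s, u):
    # A "bad" partial sum is a nonzero multiple of u.
    return any(s) and collinear(s, u)

def reorder(t, v, u):
    # Assumes u is not collinear to any v_k and v = sum t_k v_k is not
    # collinear to u.  Returns an order whose partial sums are all
    # either zero or not collinear to u.
    d = len(v)
    partial, sums = [0]*len(u), []
    for k in range(d):
        partial = add(partial, scale(t[k], v[k]))
        sums.append(partial[:])
    bad_js = [j for j in range(d) if bad(sums[j], u)]
    if not bad_js:
        return list(range(d))                              # identity already works
    j0 = max(bad_js)                                       # largest failure index
    j1 = min(j for j in range(j0+1, d) if t[j] != 0)       # first nonzero coeff after j0
    return [j1] + [j for j in range(d) if j != j1]

# Example where the identity order fails at index 1: t1*v1 + t2*v2 = u.
u = [1, 1, 0]
v = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # basis, none collinear to u
t = [1, 1, 1]
order = reorder(t, v, u)
s = [0, 0, 0]
for k in order:
    s = add(s, scale(t[k], v[k]))
    assert not bad(s, u)                # each partial sum is zero or not collinear to u
print("order:", order)
```

Here the identity partial sums hit $u$ itself after two terms, so the third vector is moved to the front.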
Now fix $i\in \lbrace 1,2, \ldots, d\rbrace$ and let $v$ be any vector with $(u_i,v)\in A$; since $u_i$ is not collinear to any of the $v_k$, the pair $(u_i,v)$ satisfies the hypotheses of the lemma. Let $\sigma$ be as in that lemma. By induction on $j$, using (1) and (3) (each nonzero partial sum forms a pair in $A$ with $u_i$, and when a partial sum is zero the corresponding right-hand side vanishes by bilinearity of $h$), we have for any $1 \leq j \leq d$, $$f\Big(u_i,\sum_{k=1}^{j}t_{\sigma(k)}v_{\sigma(k)}\Big)=\sum_{k=1}^{j} t_{\sigma(k)}\,h(u_i,v_{\sigma(k)}).$$ So for $j=d$, we deduce that $f(u_i,v)=h(u_i,v)$. By the symmetric argument, decomposing the first variable in $\cal U$ and using that no $v_j$ is collinear to any of the $u_k$, we also have $f(u,v_j)=h(u,v_j)$ whenever $(u,v_j)\in A$. We can then apply the lemma a second time, and deduce that $f(u,v)=h(u,v)$ whenever $u$ is not collinear to any of the $v_k$. By symmetry again, we also have $f(u,v)=h(u,v)$ whenever $v$ is not collinear to any of the $u_k$.
So the only case left is when $u$ is a multiple of some $v_{i}$ and $v$ is a multiple of some $u_{j}$; by (3), it remains to show that $f(v_i,u_j)=h(v_i,u_j)$. Choose $w\in V$ such that neither $w$ nor $u_j-w$ is collinear to any of the $u_k$ or to $v_i$; such a $w$ exists, because each of these finitely many conditions excludes only a line or a translate of a line, and $V$ is not covered by finitely many such sets. Then $(v_i,w),(v_i,u_j-w)\in A$, and by the above results $f$ coincides with $h$ at $(v_i,w)$ and at $(v_i,u_j-w)$; by (1), $f$ and $h$ then also coincide at $(v_i,u_j)$, as wished.
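As an end-to-end sanity check (ours; it only tests consistency, since any $f$ obtained by restricting a bilinear form automatically extends, which is precisely what the theorem says every $f$ does), one can take $f=B|_A$ for a known bilinear form $B(x,y)=x^{\mathsf T}My$, rebuild the bilinear map $h$ from the values $f(u_i,v_j)$ alone, and confirm that $h$ coincides with $B$. With $\cal U$ the standard basis and $v_j=S-u_j$, the matrix $V$ with columns $v_j$ is $J-I$ ($J$ all-ones), whose inverse is $J/(d-1)-I$, so $h$'s matrix is $H=FV^{-1}$ where $F_{ij}=f(u_i,v_j)$:

```python
from fractions import Fraction as Fr

d = 4
# An arbitrary integer matrix M defining the bilinear form B(x, y) = x^T M y.
M = [[Fr((i*7 + j*3 + 1) % 5 - 2) for j in range(d)] for i in range(d)]

def B(x, y):
    # The bilinear form whose restriction to A plays the role of f.
    return sum(M[i][j]*x[i]*y[j] for i in range(d) for j in range(d))

u = [[Fr(k == i) for k in range(d)] for i in range(d)]          # standard basis
v = [[Fr(1) - Fr(k == j) for k in range(d)] for j in range(d)]  # v_j = S - u_j

# h is the unique bilinear map with h(u_i, v_j) = f(u_i, v_j); in matrix form
# H V = F, and V = J - I has inverse J/(d-1) - I, so H is recovered exactly.
F = [[B(u[i], v[j]) for j in range(d)] for i in range(d)]
Vinv = [[Fr(1, d-1) - Fr(i == j) for j in range(d)] for i in range(d)]
H = [[sum(F[i][k]*Vinv[k][j] for k in range(d)) for j in range(d)] for i in range(d)]

assert H == M   # h agrees with B, hence extends f = B restricted to A
print("h coincides with the original bilinear form")
```

Exact `Fraction` arithmetic avoids any floating-point tolerance in the comparison.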