According to Wikipedia, a bilinear form $B$ on a (possibly infinite-dimensional) vector space $V$ is defined as nondegenerate if the map
$$B^\flat:V\ni x\mapsto B(x,\cdot)\in V^*$$
is an isomorphism onto its image, i.e., injective.$^\dagger$ It seems natural to also consider whether the map
$$B^\flat{'}:V\ni y\mapsto B(\cdot,y)\in V^*$$
is injective. If $B$ is not assumed to be symmetric, are these two conditions equivalent? Please give a proof or counterexample.
Edit: In this answer, it was originally claimed that the implication is false, but after an exchange in the comments here it was updated to reflect the fact that $B^\flat$ can never be a full isomorphism onto $V^*$ if $V$ is infinite-dimensional. I think this leaves the question I'm asking here still unanswered.
$^\dagger$Actually, the above-linked Wiki article is a bit vague and even suggests that $B^\flat$ must be an isomorphism onto all of $V^*$, which—as Marc van Leeuwen points out in the comments—is impossible whenever $V$ is infinite dimensional. However, this section of a different Wiki article suggests that the correct definition is as stated above.
I don't know exactly what the final definition of non-degeneracy is after the comments, so I will use the one given by the OP.
So, let $B$ be a bilinear form, let $(e_i)$ be a basis for your vector space, and set $a_{ij}=B(e_i,e_j)$. If $x=\sum_i x^ie_i$ and $y= \sum_j y^je_j$, where only finitely many of the $x^i$ and $y^j$ are nonzero, then $$B(x,y)=B \left( \sum_i x^i e_i , \sum_j y^j e_j\right)=\sum_{i,j}a_{ij}x^iy^j.$$ Note that the sum is finite because only finitely many of the $x^i$ and $y^j$ are nonzero.
Let's briefly jump back to the finite dimensional case. The equation above can be written in matrix form:
$$B(x,y) = \mathbf x \mathbf A \mathbf y^t,$$
where $\mathbf A$ is the matrix formed by the $a_{ij}$, and $\mathbf x$ and $\mathbf y$ are the (row) vectors formed by the $x^i$ and $y^j$ respectively, which will be identified with $x$ and $y$. Note that one can recover $B$ from the matrix $\mathbf A$. Then your maps are
$$ \flat : x \mapsto \text{the map sending } y \text{ to } \mathbf x \textbf{A} \mathbf y^t $$ $$ \flat' : y \mapsto \text{the map sending } x \text{ to } \mathbf x \textbf{A} \mathbf y^t $$
and my claim (which is more or less obvious) is the following:

$\flat$ is injective if and only if $x \textbf{A} \neq 0$ for all $x \neq 0$.

This is precisely the statement that $\mathbf A$ is regular on the left: a matrix $M$ is regular on the left if $NM=0$ implies $N=0$. In the same fashion one proves the analogous statement for $\flat'$: it is injective if and only if $\mathbf A$ is regular on the right.
Now, we have a theorem in linear algebra saying that a (square) matrix is regular on the left iff it is regular on the right iff it is invertible. Therefore, in finite dimensions, $\flat$ is injective iff $\flat'$ is injective.
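As a concrete sanity check, here is a small numerical sketch (using NumPy; the helper names `left_regular` and `right_regular` are mine) of the fact that, for a square matrix, regularity on the left, regularity on the right, and invertibility all coincide:

```python
import numpy as np

def left_regular(A):
    # x A = 0 only for x = 0  <=>  the rows of A are linearly independent
    return np.linalg.matrix_rank(A) == A.shape[0]

def right_regular(A):
    # A y^t = 0 only for y = 0  <=>  the columns of A are linearly independent
    return np.linalg.matrix_rank(A) == A.shape[1]

A = np.array([[1., 2.], [3., 4.]])   # invertible: det = -2
S = np.array([[0., 1.], [0., 0.]])   # singular: a truncated shift matrix
for M in (A, S):
    # rank(M) = rank(M^t), so for a square M the two regularity conditions
    # agree, and both hold exactly when M is invertible.
    invertible = abs(np.linalg.det(M)) > 1e-12
    assert left_regular(M) == right_regular(M) == invertible
```

Note that the truncated shift matrix `S` is singular: in finite dimensions the shift cannot separate the two conditions, which is why the counterexample below needs infinitely many coordinates.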
How do I carry out the same argument in infinite dimensions? Let $\{e_i\}_{i \in I}$ be a basis for your vector space, where $I$ is now an infinite set. If you want to visualize it, think of $I = \mathbb N$. A row vector will be a collection of scalars $(x^i)_{i \in I}$ such that only finitely many of them are nonzero, and it represents the vector $\sum_i x^i e_i$. A matrix will be any set of scalars $a_{ij}$ indexed by $(i,j) \in I \times I$. In this setting, expressions like $(x^i ) \cdot (a_{ij})$ or $(x^i) \cdot (a_{ij}) \cdot (y^j)^t$ can be given a meaning. For instance, the first one is a tuple indexed by $I$ whose $j$-th entry is $\sum_i x^ia_{ij}$ (a finite sum), but it need not be a row vector, because it may have infinitely many nonzero entries.
Anyway, the same ideas as before apply: $B$ can be associated with a matrix $\textbf{A}=(a_{ij})$, and $\flat$ (resp. $\flat'$) is injective iff $\bf A$ is regular on the left (resp. right).
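To sketch how this bookkeeping can be carried out concretely (the encoding below is my own choice: finitely supported dicts for row vectors, a function of two indices for a matrix), one can compute $x \mathbf A$ entry by entry; each entry is a finite sum, but the result may fail to be a row vector:

```python
def row_times_matrix(x, a):
    """x: a finitely supported row vector, as a dict {index: value}.
    a: a matrix over N x N, as a function (i, j) -> a_ij.
    Returns the function j -> (xA)_j = sum_i x^i a_ij. Each entry is a
    finite sum because x has finite support, but the result need not be
    finitely supported, i.e. it need not be a row vector itself."""
    return lambda j: sum(xi * a(i, j) for i, xi in x.items())

ones = lambda i, j: 1                    # the all-ones matrix
xa = row_times_matrix({0: 2, 3: 1}, ones)
assert xa(0) == 3 and xa(10**6) == 3     # nonzero at every index j
```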
However, in infinite dimensions the two kinds of regularity are not equivalent. This is analogous to the fact that in infinite dimensions one can have an endomorphism which is injective but not surjective. The classical example is the shift operator, defined by the (infinite) matrix over $\mathbb N$ having $0$ everywhere except for $1$'s just above the diagonal. You can try to formalize all of this, and I won't blame you for trying, but I think it is more important to convince yourself that it actually works.
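To make the shift example concrete, here is a minimal sketch (finitely supported sequences encoded as finite Python lists) of the map $x \mapsto x\mathbf S$, where $\mathbf S$ has $1$'s just above the diagonal; it is injective but not surjective:

```python
def right_shift(x):
    """x S for the matrix S with entries s_{i,i+1} = 1:
    (x_1, x_2, ...) |-> (0, x_1, x_2, ...)."""
    return [0] + list(x)

# Injective: x is recovered from right_shift(x) by dropping the leading 0.
assert right_shift([5, 7]) == [0, 5, 7]
# Not surjective: every image starts with 0, so e_1 = (1, 0, ...) is never hit.
assert all(right_shift(x)[0] == 0 for x in ([], [1], [2, 3]))
```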
However, as you asked, I will give you the explicit construction. Let $V$ be the vector space consisting of tuples $(a_1, a_2, \ldots)$ where $a_i \in k$ and only finitely many of the $a_i$ are nonzero, and let $B\left( (a_i), (b_i)\right) = \sum_{i=1}^\infty a_{i}b_{i+1}$ (a finite sum). You can easily check that this is a bilinear form, that $B(x,\cdot)$ is never $0$ if $x$ is nonzero (if $x_i \neq 0$, then $B(x,e_{i+1}) = x_i$), but that for $y=(1,0,0, \ldots)$ the form $B(\cdot , y)$ is identically $0$.
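Finally, the counterexample itself can be checked numerically. The following sketch (my own encoding: finite lists standing for finitely supported sequences, 0-indexed, so `e(0)` plays the role of $e_1$) verifies that $B(x,\cdot)$ picks out every coordinate of $x$, while $B(\cdot, e_1)$ vanishes identically:

```python
def B(a, b):
    """B((a_i), (b_i)) = sum_i a_i * b_{i+1} on finitely supported
    sequences, encoded as finite lists implicitly padded with zeros."""
    return sum(a[i] * b[i + 1] for i in range(min(len(a), len(b) - 1)))

def e(i, n=6):
    """The i-th standard basis vector (0-indexed), truncated to length n."""
    return [1 if j == i else 0 for j in range(n)]

x = [3, 0, -2]
# flat is injective: whenever x_i != 0, pairing with e_{i+1} recovers x_i.
assert B(x, e(1)) == 3 and B(x, e(3)) == -2
# flat' is not injective: B(., e_1) is identically zero.
assert all(B(a, e(0)) == 0 for a in ([1], [0, 5, 7], [1, 2, 3, 4]))
```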