If I have a set of $r$ functions (or distributions) in $n$ variables, $f_1(x_1, x_2, \ldots, x_n), f_2(x_1, x_2, \ldots, x_n), \ldots, f_r(x_1, x_2, \ldots, x_n)$, and I can prove linear independence of $f_1, \ldots, f_r$ when one of the variables (say $x_1$) is set to zero, can I then argue that the functions are linearly independent for ALL values of $x_1$? Does linear independence at a specialized value imply linear independence at all values?
More concretely in the simpler case of two variables:
If $$a_1 f_1(x,0) + b_2 f_2(x,0) = 0 \quad \forall x \in \mathbb{R} \implies a_1 = b_2 = 0,$$ does this imply $$a_1 f_1(x,y) + b_2 f_2(x,y) = 0 \quad \forall x,y \in \mathbb{R} \implies a_1 = b_2 = 0?$$
I would post this as a comment if I could. Also, I am not sure I understood the way you used the quantifier "$\forall$".
By the definition of linear independence of two real-valued functions $f_{1},f_{2}$ (whose domain I am ignoring), once you show that the coefficients $a_{1}$ and $b_{2}$ are both zero, it is irrelevant whether or not you used all the information about the domain of those functions. To illustrate, consider $f_{1}(x,y) = 1+x$ and $f_{2}(x,y)=x+y$. There is obviously no linear dependence here, and you can prove this directly by looking only at the two points $(0,1)$ and $(0,-1)$: $a_{1}f_{1}(0,1)+b_{2}f_{2}(0,1) = a_{1} + b_{2} = 0 = a_{1}-b_{2} =a_{1}f_{1}(0,-1)+b_{2}f_{2}(0,-1) \Rightarrow a_{1}=b_{2}=0$.
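The two-point argument can be checked mechanically: the displayed equations $a_1 + b_2 = 0$ and $a_1 - b_2 = 0$ form a homogeneous $2\times 2$ system whose coefficient matrix is invertible, so only the trivial solution exists. A small numpy sketch (not part of the original argument):

```python
import numpy as np

# Coefficient matrix of the system a1 + b2 = 0, a1 - b2 = 0
# (one row per evaluation point used above).
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])

# A nonzero determinant means the homogeneous system A @ c = 0
# has only the trivial solution c = (a1, b2) = (0, 0).
det = np.linalg.det(A)          # approximately -2, so nonzero
sol = np.linalg.solve(A, np.zeros(2))
print(det)
print(sol)                      # both coefficients are (numerically) zero
```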
But since you are interested in a different concept of linear independence, one that depends on which part of the domain you restrict to, the answer is no. Take for instance the real-valued functions on $\mathbb R^{2}$ given by $f_{1}(x,y) = 1+ x$ and $f_{2}(x,y)=x+y$. They satisfy your condition for "linear independence" given $x=0$ (the restrictions $f_{1}(0,y)=1$ and $f_{2}(0,y)=y$ are linearly independent), but not given $x=-1$: there $f_{1}(-1,y)=0$ identically, so $1\cdot f_{1}(-1,y)+0\cdot f_{2}(-1,y)=0$ for all $y$ with a nonzero coefficient.
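A quick numeric sanity check of this counterexample (a sketch, not part of the original answer): sampling each slice at a few values of $y$ and computing the rank of the evaluation matrix shows full rank on the $x=0$ slice (only the trivial combination vanishes) but rank one on the $x=-1$ slice, where $f_1$ vanishes identically.

```python
import numpy as np

def f1(x, y):
    return 1 + x

def f2(x, y):
    return x + y

def slice_rank(x, ys):
    """Rank of the evaluation matrix [f1(x, y_i), f2(x, y_i)] on the slice x = const.

    Full rank (2) proves the restricted functions are linearly independent;
    here rank 1 at x = -1 reflects that f1(-1, y) = 0 for every y.
    """
    M = np.array([[f1(x, y), f2(x, y)] for y in ys], dtype=float)
    return np.linalg.matrix_rank(M)

ys = [-1.0, 0.0, 1.0, 2.0]
print(slice_rank(0.0, ys))    # 2: independent on the slice x = 0
print(slice_rank(-1.0, ys))   # 1: dependent on the slice x = -1
```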