This is a homework question from the book *Probability and Computing*.
$13.9$: Suppose we are given $m$ vectors $\overrightarrow v_1, \overrightarrow v_2, \cdots, \overrightarrow v_m \in \{0,1\}^l$ such that any $k$ of the $m$ vectors are linearly independent modulo $2$. Let $\overrightarrow v_i = (v_{i,1}, v_{i,2}, \cdots, v_{i,l})$. Let $\overrightarrow u$ be chosen uniformly at random from $\{0,1\}^l$, and let $X_i = \sum_{j=1}^l v_{i,j} u_j \bmod 2$. Show that the $X_i$ are uniform, $k$-wise independent bits.
Firstly, I doubt whether $X_i$ is uniform. Suppose $l = 1$, so $X_i = v_{i,1}u_1$, and suppose $v_{i,1}$ has equal probability $1/2$ of being $0$ or $1$, and likewise $u_1$. $$ \begin{array}{c|cc} & 1 & 0\\ \hline 1 & 1 & 0 \\ 0 & 0 & 0 \\ \end{array} $$ The table tells us $X_i$ has probability $1/4$ of being $1$ and $3/4$ of being $0$, so it is not uniform, and the conclusion in the question would be wrong. I want to know whether my analysis is reasonable, and if I am wrong, how to derive the proof. Thanks!
After several days of trying, I worked it out ;-)
Firstly, I should clarify my misunderstanding of this problem: $\overrightarrow v_i$ is not a random variable, it is fixed and given in the problem statement, so my assumption that $v_{i,j}$ has probability $1\over2$ of being $0$ or $1$ was wrong.
Next I will post my solution to this problem.
(1) Uniform:
How do we show that $X_i = \sum_{j=1}^l v_{i,j}u_j \bmod 2$ is uniform? First, since any $k$ of the $\overrightarrow v_i$ are linearly independent, no $\overrightarrow v_i$ can be the all-zero vector $[0 \cdots 0]$; otherwise any set of $k$ vectors containing it would be linearly dependent, a contradiction. Here we use the principle of deferred decisions: suppose $v_{i,j} = 1$, and reveal every position of $\overrightarrow u$ except position $j$. Then whether $X_i$ is $0$ or $1$ is decided by the value of $u_j$, which is uniform over $\{0,1\}$.
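This can also be checked by brute force over all choices of $\overrightarrow u$. A minimal Python sketch (the vector `v` below is an arbitrary nonzero example I picked, not from the book):

```python
from itertools import product

def X(v, u):
    # X = <v, u> mod 2
    return sum(vj * uj for vj, uj in zip(v, u)) % 2

l = 4
v = (1, 0, 1, 1)  # any fixed nonzero vector

# enumerate all 2^l equally likely choices of u
counts = {0: 0, 1: 0}
for u in product((0, 1), repeat=l):
    counts[X(v, u)] += 1

print(counts)  # {0: 8, 1: 8}: X takes each value for exactly half the u's
```

Since `v` is nonzero, exactly half of the $2^l$ vectors $\overrightarrow u$ give each parity, matching the argument above.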
(2) $k$-wise independent: Here we prove it by induction.
Consider two vectors $\overrightarrow v_1, \overrightarrow v_2$ and the single random vector $\overrightarrow u$; then $X_1 = \sum_{j=1}^l v_{1,j}u_j$ and $X_2 = \sum_{j=1}^l v_{2,j}u_j$ (both mod $2$).
Let $X_1 = c$, $X_2 = d$ with $c, d \in \{0,1\}$; we should prove that $P(X_1 = c \cap X_2 = d) = 1/4$.
Since $\overrightarrow v_1$ and $\overrightarrow v_2$ are linearly independent mod $2$, $\overrightarrow v_1 + \overrightarrow v_2 \neq \overrightarrow 0$, so there is a position $m$ where they differ; say $v_{1,m} = 1$ and $v_{2,m} = 0$ (otherwise swap the roles of the two vectors). Now reveal every position of $\overrightarrow u$ except position $m$; this is again the principle of deferred decisions. This determines $X_2$, while the value of $X_1$ is still decided by $u_m$, which is uniform over $\{0,1\}$. So $P(X_1 = c \mid X_2 = d) = 1/2$, and because $P(X_2 = d) = 1/2$ by part (1), $P(X_1 = c \cap X_2 = d) = 1/4$.
The induction should prove that $P(\cap_{i=1}^k X_i = a_i) = {1\over 2^k}$. Suppose this holds for any $k-1$ of the vectors, and introduce a new vector $\overrightarrow v_k$. One caution: for $k \geq 3$ there need not be a single position where $\overrightarrow v_k$ is $1$ and all of $\overrightarrow v_1, \cdots, \overrightarrow v_{k-1}$ are $0$, so we cannot simply reveal all but one coordinate as in step (2). Instead, since $\overrightarrow v_1, \cdots, \overrightarrow v_k$ are linearly independent, the $k \times l$ matrix with these rows has rank $k$ over GF($2$), so there exists $\overrightarrow w \in \{0,1\}^l$ with $\overrightarrow v_i \cdot \overrightarrow w = 0 \bmod 2$ for $i < k$ and $\overrightarrow v_k \cdot \overrightarrow w = 1 \bmod 2$. The map $\overrightarrow u \mapsto \overrightarrow u + \overrightarrow w \bmod 2$ is a bijection on $\{0,1\}^l$ that leaves $X_1, \cdots, X_{k-1}$ unchanged and flips $X_k$, so conditioned on $X_1 = a_1, \cdots, X_{k-1} = a_{k-1}$, $X_k$ is uniform: $P(X_k = a_k \mid \cap_{i=1}^{k-1} X_i = a_i) = 1/2$. Combined with the induction hypothesis $P(\cap_{i=1}^{k-1} X_i = a_i) = {1\over 2^{k-1}}$, we get $P(\cap_{i=1}^k X_i = a_i) = {1\over 2^k}$. Then it gets proved (any subset of at most $k$ of the vectors is also linearly independent, so the same argument covers every such subset).
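The joint claim can likewise be verified exhaustively for a small case. A sketch with three vectors in $\{0,1\}^4$ that I checked are linearly independent mod $2$ (my own example, not from the book); every outcome in $\{0,1\}^3$ should occur for exactly $2^{l-3}$ of the $2^l$ choices of $\overrightarrow u$:

```python
from itertools import product
from collections import Counter

def X(v, u):
    # X = <v, u> mod 2
    return sum(vj * uj for vj, uj in zip(v, u)) % 2

l = 4
# three linearly independent vectors mod 2: no nonempty subset sums to 0
vs = [(1, 0, 0, 0), (0, 1, 0, 0), (1, 1, 1, 0)]

# joint distribution of (X_1, X_2, X_3) over all 2^l choices of u
joint = Counter(tuple(X(v, u) for v in vs)
                for u in product((0, 1), repeat=l))

# all 2^3 = 8 outcomes appear, each from 2^(4-3) = 2 preimages
print(sorted(joint.items()))
```

Each of the $8$ outcomes occurring equally often is exactly the statement that $(X_1, X_2, X_3)$ is uniform on $\{0,1\}^3$, i.e. the three bits are mutually independent and uniform.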
(Note: perhaps you will ask whether I actually used the hypothesis that any $k$ of the $\overrightarrow v_i$ are linearly independent. Indeed I took advantage of it: in step (1) it rules out the all-zero vector, in step (2) it guarantees a position where $\overrightarrow v_1$ and $\overrightarrow v_2$ differ, and in the induction step it guarantees the vector $\overrightarrow w$ exists. If the vectors were only $n$-wise independent for some $n < k$, then some $\overrightarrow v_k$ would be a linear combination of the others, so $X_k$ would be determined by the other $X_i$, and the joint distribution could not be uniform.)
end.