This question seems so obvious that I wonder if I am just having a brain-melt moment... if so, apologies in advance! :(
The difference between pairwise independence and mutual independence is well known, and is often demonstrated with this classic set of $3$ events: "$1$st coin = H", "$2$nd coin = H", and "even number of Hs". (All coins are fair and independent. $0$ is even.)
This can be generalized to $K > 2$ coins, with the $(K+1)$th event still being "even number of Hs". In this set of $K+1$ events, any subset of $K$ events is mutually independent, but the full set is not. (If you wish, read more here.)
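The classic construction is small enough to verify by brute force. A minimal sketch (the value $K=3$ is just an illustrative choice; any $K\ge 2$ works) that checks every sub-collection against the product rule:

```python
from itertools import product, combinations

K = 3  # illustrative choice; any K >= 2 works and keeps the brute force tiny

# Sample space: all 2^K outcomes of K fair, independent coins, equally likely.
outcomes = list(product([0, 1], repeat=K))

# Events: "coin i is heads" for each i, plus "even number of heads".
events = [{w for w in outcomes if w[i] == 1} for i in range(K)]
events.append({w for w in outcomes if sum(w) % 2 == 0})

def product_rule(evs):
    """Does P(intersection of evs) equal the product of the individual P(ev)?"""
    inter = set(outcomes)
    p = 1.0
    for e in evs:
        inter &= e
        p *= len(e) / len(outcomes)
    return abs(len(inter) / len(outcomes) - p) < 1e-12

def mutually_independent(evs):
    """Mutual independence: the product rule holds for every sub-collection."""
    return all(product_rule(sub)
               for r in range(2, len(evs) + 1)
               for sub in combinations(evs, r))

# Any K of the K+1 events are mutually independent...
assert all(mutually_independent(s) for s in combinations(events, K))
# ...but the full collection of K+1 events is not.
assert not mutually_independent(events)
```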
However, I have a hard time adding just one more event. Specifically, for some $K\ge 2$, I seek:
a set of $K+2$ events,
where any $K$-subset is mutually independent,
but any $(K+1)$-subset is not mutually independent.
I would prefer a small example using coins, but will accept any probability space. If this is impossible, a proof would also be great.
Attempts: If I add another independent coin, then some $(K+1)$-subset will be mutually independent. OTOH if I just add another Boolean formula (or $\mathbb{Z}_2$ formula) on the existing $K$ coins, then every formula that comes to mind leaves some $K$-subset that is not mutually independent.
I have also tried models where the actual coins are hidden (i.e. hidden variables), and the events (i.e. observables) are various combinations of the coins, but nothing I tried worked yet.
Further thoughts: In the classic example ($K\ge 2$ coins), the full set of $K+1$ events is not merely dependent: conditioned on knowledge of any $K$-subset, the remaining event is fully determined. This is not a requirement of my problem. However, adding it as a requirement (making the problem harder, but providing more structure that may inspire a construction) vaguely reminds me of various aspects of coding/decoding, which is why I am adding that tag.
I do not have a solution over $\mathbb Z_2$, but there is a solution with larger finite fields. More generally, given $n\ge k>0$, we can find $n$ random variables where any $k$ are independent, but any $k+1$ are dependent.
Let $P$ be a random polynomial of degree less than $k$ over the finite field of size $q$, where $q\ge n$. Specifically, $P$ is uniformly chosen among all $q^{k}$ such polynomials. Then writing the elements of this field as integers between $0$ and $q-1$, among the random variables $$ P(0),P(1),\dots, P(n-1), $$ any $k$ are independent, but any $k+1$ are dependent. This is the essence of the Reed-Solomon error correcting code.
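Both claims can be checked exhaustively for small parameters. A sketch with the illustrative values $q=5$, $k=2$, $n=4$ (a prime $q$ so that arithmetic mod $q$ gives the field), enumerating all $q^k$ polynomials; since each $P(x)$ has uniform marginals, a collection of evaluations is mutually independent exactly when its joint distribution is uniform:

```python
from itertools import product, combinations
from collections import Counter

q, k, n = 5, 2, 4  # illustrative values: prime field size q >= n, degree < k

def evaluate(coeffs, x):
    """Evaluate c_0 + c_1 x + ... + c_{k-1} x^{k-1} in GF(q), q prime."""
    return sum(c * pow(x, i, q) for i, c in enumerate(coeffs)) % q

# One sample (P(0), ..., P(n-1)) per polynomial, all q^k equally likely.
samples = [tuple(evaluate(c, x) for x in range(n))
           for c in product(range(q), repeat=k)]

def uniform_on(points):
    """Is the joint distribution of (P(x) for x in points) uniform?
    Each P(x) is uniform on GF(q), so joint uniformity at these points
    is equivalent to mutual independence of the chosen evaluations."""
    counts = Counter(tuple(s[x] for x in points) for s in samples)
    return len(counts) == q ** len(points) and len(set(counts.values())) == 1

# Any k evaluation points give mutually independent variables...
assert all(uniform_on(pts) for pts in combinations(range(n), k))
# ...but any k+1 points are dependent: only q^k of the q^(k+1) value
# combinations can occur, since k values already determine P.
assert all(not uniform_on(pts) for pts in combinations(range(n), k + 1))
```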
To show, for example, that $P(0),P(1),\dots,P(k-1)$ are independent, you need to show that the probability that $$P(0)=y_0,P(1)=y_1,\dots,P(k-1)=y_{k-1}$$ equals $q^{-k}$ for any $y_0,\dots,y_{k-1}$ in the field. Since there are $q^k$ possible polynomials, each equally likely, this is equivalent to showing that there is a unique polynomial of degree less than $k$ satisfying the above. This is indeed true, and the polynomial can be constructed explicitly by Lagrange interpolation.
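The interpolation step can be sketched as follows, assuming a prime $q$ (so that inverses come from Fermat's little theorem); the demo values $q=5$ and the target points are arbitrary illustrative choices:

```python
def poly_mul_linear(b, xm, q):
    """Multiply polynomial b (coefficients low to high) by (x - xm), mod q."""
    r = [0] * (len(b) + 1)
    for i, c in enumerate(b):
        r[i] = (r[i] - xm * c) % q
        r[i + 1] = (r[i + 1] + c) % q
    return r

def lagrange(points, values, q):
    """The unique polynomial of degree < k through k given points over GF(q)."""
    k = len(points)
    coeffs = [0] * k
    for xj, yj in zip(points, values):
        # Basis polynomial L_j: vanishes at every point except x_j.
        basis, denom = [1], 1
        for xm in points:
            if xm == xj:
                continue
            basis = poly_mul_linear(basis, xm, q)
            denom = denom * (xj - xm) % q
        # Scale L_j so it takes the value y_j at x_j; invert denom by Fermat.
        scale = yj * pow(denom, q - 2, q) % q
        for i, c in enumerate(basis):
            coeffs[i] = (coeffs[i] + scale * c) % q
    return coeffs

def eval_poly(coeffs, x, q):
    return sum(c * pow(x, i, q) for i, c in enumerate(coeffs)) % q

# Illustrative demo: q = 5, k = 3, arbitrary target values at points 0, 1, 2.
q, points, values = 5, [0, 1, 2], [3, 1, 4]
P = lagrange(points, values, q)
assert all(eval_poly(P, x, q) == y for x, y in zip(points, values))
```

Since this hits every target tuple $(y_0,\dots,y_{k-1})$ with exactly one of the $q^k$ polynomials, the joint distribution of the chosen $k$ evaluations is uniform, which is the independence claim above.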