In the proof that $d(d\omega)=0$ for every $k$-form $\omega=f_I\,dx^I=f_I \wedge dx^I$, we compute $$ d(d\omega)=d(df_I \wedge dx^I)=d(df_I)\wedge dx^I - df_I\wedge d(dx^I)=-\,df_I\wedge d(dx^I)=0, $$
where the third equality holds because the double exterior derivative of any $0$-form is zero, since mixed partials commute. Now for the last equality, the proof goes:
$$ d(dx^I)=d(1)\wedge dx^I=0 $$
But I don't understand why $d(dx^I)=d(1)\wedge dx^I$. I know that $dx^I$ is a wedge product of $k$ one-forms, but I don't think that is relevant here.
If I understand you correctly, you are using the following definition of exterior differentiation: the operator $$ d: \Omega^{q}\left(\mathbb{R}^{m}\right) \rightarrow \Omega^{q+1}\left(\mathbb{R}^{m}\right) $$ defined on a form $\omega=f\,dx^I$ by $$ d\omega=\sum_{i=1}^m\frac{\partial f}{\partial x_i}\,dx_i \wedge dx^I. $$
If this is the case, then we just note that $dx^I=1\cdot dx^I$, where $1\in C^{\infty}(\mathbb{R}^m)$ is the constant function. It follows from the definition of the partial derivative that $\frac{\partial 1 }{\partial x_i}=0$ for all $1\le i \le m$. Thus when computing $d(dx^I)$ we get $$d(dx^I)=\sum_{i=1}^m\frac{\partial 1 }{\partial x_i}\,dx_i \wedge dx^I=0.$$ This is what we wanted to prove.
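As a quick sanity check of the same mechanism (my own example, not taken from the question): for a smooth $f$ on $\mathbb{R}^2$, applying the definition above twice gives
$$ d(df)=d\!\left(\frac{\partial f}{\partial x}\,dx+\frac{\partial f}{\partial y}\,dy\right)=\frac{\partial^2 f}{\partial y\,\partial x}\,dy\wedge dx+\frac{\partial^2 f}{\partial x\,\partial y}\,dx\wedge dy $$
(the $dx\wedge dx$ and $dy\wedge dy$ terms vanish), and then
$$ d(df)=\left(\frac{\partial^2 f}{\partial x\,\partial y}-\frac{\partial^2 f}{\partial y\,\partial x}\right)dx\wedge dy=0, $$
using $dy\wedge dx=-dx\wedge dy$ and the equality of mixed partials. This is exactly the "$d\circ d=0$ on $0$-forms" step used in the third equality of the computation in the question.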