Let $M$ be a smooth manifold of dimension $m$ and let $\omega$ be a $1$-form on $M$. Prove that if $\omega$ is exact, i.e., there is some smooth function $f:M \to \mathbb{R}$ such that $$\omega = \sum_{i=1}^m \frac{\partial f}{\partial x_i} dx_i $$ in every coordinate chart of $M$ (with coordinates denoted $x_1, \dots, x_m$), then for every $X,Y \in \mathfrak{X}(M)$ we have $$\omega([X,Y]) = X(\omega(Y)) - Y(\omega(X)), $$ where $X(g)$ denotes the Lie derivative of a function $g$ along $X$.
I've tried proving this in a coordinate chart. I'll work with coordinates $x_1, \dots, x_m$. Then we have $$X = \sum_{i=1}^m X^i \frac{\partial}{\partial x_i}, \qquad Y = \sum_{i=1}^m Y^i \frac{\partial }{\partial x_i} $$ and $$[X,Y] = \sum_{i=1}^m(X(Y^i) - Y(X^i))\frac{\partial}{\partial x_i}. $$ We also have $$\omega([X,Y]) = \sum_{i=1}^m \frac{\partial f}{\partial x_i} \cdot (X(Y^i) - Y(X^i)) $$ and $$\omega(X) = \sum_{i=1}^m \frac{\partial f}{\partial x_i} X^i, \qquad \omega(Y) = \sum_{i=1}^m \frac{\partial f}{\partial x_i}Y^i,$$ and since the Lie derivative is linear and a derivation, we get $$X(\omega(Y)) = \sum_{i=1}^m \left( Y^i \cdot X \left(\frac{\partial f}{\partial x_i} \right) + \frac{\partial f}{\partial x_i} \cdot X(Y^i)\right), $$ and a similar formula holds for $Y(\omega(X))$. However, if I compute their difference, then to prove the claim we would need $$ \sum_{i=1}^m Y^i \cdot X \left( \frac{\partial f}{\partial x_i} \right) = \sum_{i=1}^m X^i \cdot Y\left( \frac{\partial f}{\partial x_i} \right), $$ which I cannot see how to prove by expanding $X$ and $Y$. How should I proceed from here? I remark that I have never used the fact that $\omega$ is exact, since $\displaystyle \frac{\partial f}{\partial x_i}$ could have been replaced by some function $f_i$ and nothing would change.
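As a sanity check, the last displayed equality can be tested symbolically. The SymPy sketch below (on $\mathbb{R}^2$, with an $f$, $X$ and $Y$ chosen arbitrarily just for this test) computes the difference of the two sums:

```python
import sympy as sp

x, y = sp.symbols('x y')
coords = (x, y)

# Sample smooth data on R^2, chosen arbitrarily for this check.
f = x * sp.sin(y)
Xc = (y, x**2)      # components X^1, X^2 of X
Yc = (x * y, y)     # components Y^1, Y^2 of Y

def apply_vf(V, g):
    """Apply the vector field with components V to the function g."""
    return sum(Vi * sp.diff(g, xi) for Vi, xi in zip(V, coords))

# sum_i Y^i X(df/dx_i)  minus  sum_i X^i Y(df/dx_i)
lhs = sum(Yi * apply_vf(Xc, sp.diff(f, xi)) for Yi, xi in zip(Yc, coords))
rhs = sum(Xi * apply_vf(Yc, sp.diff(f, xi)) for Xi, xi in zip(Xc, coords))

print(sp.simplify(lhs - rhs))  # prints 0 for this sample f, X, Y
```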
Since both sides of the equality $$\omega([X,Y])=X\omega(Y)-Y\omega(X)$$ are $\mathbb{R}$-linear w.r.t. $X, Y$, it suffices to prove the result for $X=\alpha \partial_i$ and $Y=\beta \partial_j$, where $\alpha, \beta$ are real functions.
We have that $\omega= df$ for some $f$. Therefore, the left side is \begin{align*} df(\alpha\partial_i(\beta \partial_j)-\beta\partial_j(\alpha \partial_i))&=df(\alpha (\partial_i \beta) \partial_j+\alpha\beta\,\partial_i\partial_j-\beta(\partial_j \alpha)\partial_i-\beta\alpha\,\partial_j\partial_i) \\ &= \alpha(\partial_i\beta)(\partial_j f)-\beta(\partial_j\alpha)(\partial_i f), \end{align*} since the second-order terms $\alpha\beta\,\partial_i\partial_j$ and $\beta\alpha\,\partial_j\partial_i$ cancel. The right side is \begin{align*} \alpha\partial_i(df(\beta\partial_j))-\beta\partial_j(df(\alpha\partial_i))&=\alpha\partial_i(\beta\,\partial_j f)-\beta\partial_j(\alpha\,\partial_if) \\ &=\alpha(\partial_i\beta)(\partial_jf)+\alpha\beta\,\partial_i\partial_jf-\beta(\partial_j\alpha)(\partial_i f)-\beta\alpha\,\partial_j\partial_if \\ &=\alpha(\partial_i\beta)(\partial_jf)-\beta(\partial_j\alpha)(\partial_if), \end{align*} where the second-order terms cancel by the symmetry of mixed partial derivatives. Thus both sides coincide.
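The computation above can be checked symbolically. The SymPy sketch below works on $\mathbb{R}^2$ with $i=1$, $j=2$, and sample functions $f$, $\alpha$, $\beta$ chosen only for this test; it verifies $\omega([X,Y])=X\omega(Y)-Y\omega(X)$ for $\omega=df$:

```python
import sympy as sp

x, y = sp.symbols('x y')

# Sample data on R^2: omega = df, X = alpha*d/dx, Y = beta*d/dy.
f = x**2 * y + sp.cos(x * y)   # any smooth f should work here
alpha = x * y                  # sample coefficient functions
beta = x + y**2

def vf(V1, V2, g):
    """Apply the vector field V1*d/dx + V2*d/dy to the function g."""
    return V1 * sp.diff(g, x) + V2 * sp.diff(g, y)

# [X, Y] has components (X(Y^i) - Y(X^i)); here X = (alpha, 0), Y = (0, beta).
bracket1 = vf(alpha, 0, 0) - vf(0, beta, alpha)     # i = 1 component
bracket2 = vf(alpha, 0, beta) - vf(0, beta, 0)      # i = 2 component

lhs = vf(bracket1, bracket2, f)                     # omega([X, Y]) = df([X, Y])
rhs = vf(alpha, 0, vf(0, beta, f)) - vf(0, beta, vf(alpha, 0, f))

print(sp.simplify(lhs - rhs))  # prints 0
```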
This result also follows from the more general equality $$d\omega(X,Y)=X\omega(Y)-Y\omega(X)-\omega([X,Y]),$$ since an exact form is in particular a closed one, and thus we have that $0=X\omega(Y)-Y\omega(X)-\omega([X,Y])$ in your case.
Since both sides are $\mathbb{R}$-linear w.r.t. $\omega, X$ and $Y$, it suffices to show that the relation holds when $\omega=fdg$, $X=\alpha \partial_i$ and $Y=\beta \partial_j$, where $f,g,\alpha,\beta$ are real functions.
Now, since $d(f\,dg)=df\wedge dg$, the left side is \begin{align*} df\wedge dg (\alpha \partial_i, \beta \partial_j)&=df(\alpha \partial_i)dg(\beta \partial_j)-dg(\alpha \partial_i)df(\beta \partial_j) \\ &=\alpha\beta(\partial_i f \partial_j g-\partial_i g\partial_j f). \end{align*} The right side is \begin{align*} &\alpha \partial_i(fdg(\beta \partial_j))-\beta\partial_j(fdg(\alpha \partial_i))-fdg(\alpha \partial_i(\beta \partial_j)-\beta\partial_j(\alpha \partial_i)) \\ &=\alpha \partial_i(f \beta \partial_jg)-\beta\partial_j(f\alpha\partial_i g)-fdg(\alpha (\partial_i\beta\partial_j+\beta\partial_i\partial_j)-\beta(\partial_j\alpha\partial_i+\alpha \partial_j\partial_i)) \\ &=\alpha \partial_i(f \beta \partial_jg)-\beta\partial_j(f\alpha\partial_i g)-fdg(\alpha \partial_i \beta \partial_j-\beta \partial_j \alpha \partial_i) \\ &= \alpha \partial_i(f \beta \partial_jg)-\beta\partial_j(f\alpha\partial_i g)-f (\alpha \partial_i \beta \partial_j g-\beta \partial_j \alpha \partial_ig) \\ &=\alpha \partial_i (f\beta) \partial_jg +\alpha\beta f\partial_i\partial_jg- \beta \partial_j(f\alpha)\partial_i g - \alpha \beta f \partial_j\partial_ig -f\alpha\partial_i\beta\partial_j g+f\beta\partial_j\alpha\partial_i g\\ &=\alpha \partial_i (f\beta) \partial_jg - \beta \partial_j(f\alpha)\partial_i g -f\alpha\partial_i\beta\partial_j g+f\beta\partial_j\alpha\partial_i g\\ &=\alpha \beta\partial_i f\partial_jg+f\alpha\partial_i\beta\partial_jg-\alpha\beta\partial_jf\partial_i g - f\beta\partial_j\alpha\partial_i g -f\alpha\partial_i\beta\partial_j g+f\beta\partial_j\alpha\partial_i g \\ &=\alpha\beta(\partial_i f\partial_j g-\partial_i g\partial_j f), \end{align*} and thus both are equal.
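The same reduction can be checked symbolically. The SymPy sketch below works on $\mathbb{R}^2$ with $i=1$, $j=2$, and sample functions $f$, $g$, $\alpha$, $\beta$ chosen only for this test; it compares $df\wedge dg(X,Y)$ against $X\omega(Y)-Y\omega(X)-\omega([X,Y])$ for $\omega=f\,dg$:

```python
import sympy as sp

x, y = sp.symbols('x y')

# Sample data on R^2 for omega = f dg, X = alpha*d/dx, Y = beta*d/dy.
f, g = x * y, sp.exp(x) + y**2
alpha, beta = sp.sin(y), x**2

def vf(V1, V2, h):
    """Apply the vector field V1*d/dx + V2*d/dy to the function h."""
    return V1 * sp.diff(h, x) + V2 * sp.diff(h, y)

def omega(V1, V2):
    """omega(V) = f * dg(V) = f * V(g)."""
    return f * vf(V1, V2, g)

# Left side: d(omega)(X, Y) = (df wedge dg)(X, Y).
lhs = (vf(alpha, 0, f) * vf(0, beta, g)
       - vf(0, beta, f) * vf(alpha, 0, g))

# Right side: X omega(Y) - Y omega(X) - omega([X, Y]).
br1 = -vf(0, beta, alpha)       # [X,Y]^1 = X(0) - Y(alpha)
br2 = vf(alpha, 0, beta)        # [X,Y]^2 = X(beta) - Y(0)
rhs = (vf(alpha, 0, omega(0, beta))
       - vf(0, beta, omega(alpha, 0))
       - omega(br1, br2))

print(sp.simplify(lhs - rhs))  # prints 0
```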
You can cut out a lot of the manual labor in the proof above by showing that both sides are also $C^{\infty}$-linear w.r.t. $X$ and $Y$. For the left side this is immediate, but the right side requires some work, albeit not much. Indeed, \begin{align*} fX\omega(Y)-Y\omega(fX)-\omega([fX,Y]) &= fX\omega(Y)-(Yf)\omega(X)-fY\omega(X) \\ &~~~~-\omega(fXY-(Yf)X-fYX) \\ &=fX\omega(Y)-fY\omega(X)-\omega(f(XY-YX)) \\ &=f(X\omega(Y)-Y\omega(X)-\omega([X,Y])). \end{align*} A similar computation works for $Y$.
Now we can let $\omega=fdg$ and $X=\partial_i, Y=\partial_j$, and the work amounts to showing that the left side, which is $$\partial_i f\,\partial_jg-\partial_j f\,\partial_i g, $$ equals the right side, which this time is just $$X\omega(Y)-Y\omega(X)=\partial_i(f\partial_jg)-\partial_j(f\partial_i g), $$ since $[X,Y]=0$. Expanding the right side, \begin{align*} \partial_i(f\partial_jg)-\partial_j(f\partial_i g) &=\partial_i f\,\partial_jg+f\partial_i\partial_j g-\partial_j f\, \partial_i g-f\partial_j\partial_i g \\ &= \partial_i f\,\partial_jg-\partial_j f\, \partial_i g, \end{align*} yields the result.