I'm reading a book that discusses Rodrigues' rotation formula, which says that to rotate a vector $\mathbf{v}$ about a rotation axis $\mathbf{n}$ through an angle $\theta$, you multiply $\mathbf{v}$ by the matrix $\mathbf{R}(\mathbf{n}, \theta)$ given below, where $[\mathbf{n}]_\times$ is the matrix representing the linear transformation $\mathbf{w} \to \mathbf{n} \times \mathbf{w}$, i.e. $$[\mathbf{n}]_\times = \begin{bmatrix} 0 & -n_z& n_y \\ n_z & 0 & -n_x \\ -n_y & n_x & 0\end{bmatrix}.$$
My issue is with the last part above. I am not able to derive the Jacobian formula $\frac{\partial (\mathbf{R}\mathbf{v})}{\partial \boldsymbol{\omega}^T}$ on my own. Can you kindly give me a more direct derivation of it from the formula? This is from page 47 of Szeliski's book on Computer Vision, which you can find legally online on the author's website here.
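As a quick sanity check on the notation, here is a small sketch (plain Python, no external libraries; the helper names `skew`, `matvec`, and `cross` are my own) verifying that multiplying by $[\mathbf{n}]_\times$ really does compute $\mathbf{n} \times \mathbf{w}$:

```python
def skew(n):
    """Cross-product matrix [n]_x for a 3-vector n."""
    nx, ny, nz = n
    return [[0.0, -nz,  ny],
            [ nz, 0.0, -nx],
            [-ny,  nx, 0.0]]

def matvec(M, w):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(M[i][j] * w[j] for j in range(3)) for i in range(3)]

def cross(a, b):
    """Standard cross product a x b."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

n = [0.1, -0.4, 0.9]
w = [2.0, 3.0, -1.0]
# [n]_x w should agree with n x w componentwise.
assert all(abs(x - y) < 1e-12
           for x, y in zip(matvec(skew(n), w), cross(n, w)))
```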

For typing convenience, define the matrices $(V,W)$ as $$\eqalign{ V &= [v]_{\times} &\implies Vw &= v\times w \\ W &= [w]_{\times} &\implies Wv &= w\times v \;=\; -v\times w \;=\; -Vw \\ }$$ Then what the last paragraph is saying is $$\eqalign{ \def\a{\approx} \def\p{\partial} &R \a (I + W) \\ &Rv \a (v + Wv) \;=\; (v - Vw) \\ &\frac{\p(Rv)}{\p w} \a \;-V \\ }$$
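You can also confirm this numerically. The sketch below (my own check, not from the book; `skew` and `rotate` are assumed helper names) applies the exact Rodrigues formula $R = I + \sin\theta\,K + (1-\cos\theta)K^2$ for a rotation vector $w$ and estimates the Jacobian of $Rv$ at $w = 0$ by finite differences; it should match $-[v]_\times = -V$:

```python
import math

def skew(a):
    """Cross-product matrix [a]_x."""
    ax, ay, az = a
    return [[0.0, -az,  ay],
            [ az, 0.0, -ax],
            [-ay,  ax, 0.0]]

def rotate(w, v):
    """Rotate v by the rotation vector w via Rodrigues' formula."""
    theta = math.sqrt(sum(c * c for c in w))
    if theta < 1e-12:
        return list(v)
    n = [c / theta for c in w]       # unit axis
    K = skew(n)
    Kv = [sum(K[i][j] * v[j] for j in range(3)) for i in range(3)]
    KKv = [sum(K[i][j] * Kv[j] for j in range(3)) for i in range(3)]
    return [v[i] + math.sin(theta) * Kv[i] + (1 - math.cos(theta)) * KKv[i]
            for i in range(3)]

v = [1.0, 2.0, 3.0]
h = 1e-6
# Finite-difference Jacobian d(Rv)/dw at w = 0, built column by column.
J = [[0.0] * 3 for _ in range(3)]
for j in range(3):
    w = [0.0, 0.0, 0.0]
    w[j] = h
    col = rotate(w, v)
    for i in range(3):
        J[i][j] = (col[i] - v[i]) / h

minusV = [[-x for x in row] for row in skew(v)]
assert all(abs(J[i][j] - minusV[i][j]) < 1e-5
           for i in range(3) for j in range(3))
```

The assertion passing at $w = 0$ reflects exactly the first-order statement above: $Rv \approx v - Vw$, so the Jacobian is $-V$.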