In Hamilton's $1986$ paper "Four-manifolds with positive curvature operator", the setting is as follows: We identify the two-forms $\Lambda^2$ with the Lie algebra $\mathfrak{so}(n)$. Let $\{\phi^\alpha\}_{\alpha = 1}^{n(n-1)/2}$ be an orthonormal basis for $\mathfrak{so}(n)$. Then the Lie structure constants are the coefficients $c_\gamma^{\alpha \beta}$ for the Lie bracket: $$[\phi^\alpha, \phi^\beta] = c_\gamma^{\alpha \beta}\phi^\gamma. $$
When $n=3$, the matter is simpler since we have another identification $\mathfrak{so}(3) \simeq \mathbb{R}^3$, so that the Lie bracket corresponds to the cross product, and hence (with Hamilton's definition of inner product) $c_\gamma^{\alpha \beta} = \frac{1}{\sqrt{2}} \epsilon_{\gamma \alpha \beta},$ where $\epsilon_{\gamma \alpha \beta}$ are the elementary alternating 3-tensors. Then for the curvature tensor $R$, regarded as a symmetric bilinear form, we can show that $$(R^\#)_{\alpha \beta} := c_\alpha^{\gamma \eta} c_\beta^{\delta \theta} R_{\gamma \delta} R_{\eta \theta} = \frac{1}{2}\epsilon_{\alpha \gamma \eta}\epsilon_{\beta \delta \theta} R_{\gamma \delta} R_{\eta \theta} = (\mathrm{adj} \: R)_{\beta \alpha} = (\mathrm{adj} \: R)_{\alpha \beta}$$ since $R$ is symmetric.
Explicitly,
$$\begin{bmatrix} a & b & c \\ b & d & e \\ c & e & f \end{bmatrix}^\#=\begin{bmatrix} df-e^2 & ce-bf & be-cd \\ ce-bf & af-c^2 & bc-ae \\ be-cd & bc-ae & ad-b^2 \end{bmatrix}.$$
Can you work out $R^\#$ explicitly when $n = 4$ and perhaps $n = 5$? Or at least show me how to calculate the Lie structure constants in these cases? Thank you.
After putting in more effort, I now have a clearer understanding, and I'd like to share the details here.
First let's revisit the $n=3$ case, this time without using the identification $\mathfrak{so}(3) \simeq \mathbb{R}^3$. Throughout I'll use the definitions and notations as in Hamilton's paper. Note that in the literature some of them differ by a constant scaling factor.
Let $\{e_1, e_2, e_3\}$ be an orthonormal basis for the 1-forms. Then $\{ \phi^1 , \phi^2, \phi^3\} := \{\frac{1}{\sqrt{2}}e_1 \wedge e_2, \frac{1}{\sqrt{2}}e_1 \wedge e_3, \frac{1}{\sqrt{2}}e_2 \wedge e_3\}$ is an orthonormal basis for the 2-forms $\Lambda^2$, with inner product defined by $$ \langle \phi, \psi \rangle := \phi_{ab}\psi_{ab}, $$ so that $$\langle e_i \wedge e_j, e_k \wedge e_l \rangle = 2\big(\delta_{ik}\delta_{jl}-\delta_{il}\delta_{jk}\big).$$
The identification $\Lambda^2 \simeq \mathfrak{so}(3)$ is described by $$\phi^1 \leftrightarrow (\phi_{ab}^1) = \frac{1}{\sqrt{2}}\scriptstyle \begin{bmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \displaystyle \; \phi^2 \leftrightarrow (\phi_{ab}^2) = \frac{1}{\sqrt{2}} \scriptstyle \begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ -1 & 0 & 0 \end{bmatrix},\displaystyle \; \phi^3 \leftrightarrow (\phi_{ab}^3) =\frac{1}{\sqrt{2}} \scriptstyle \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & -1 & 0 \end{bmatrix}.$$ The inner product we just stated transfers to $\mathfrak{so}(n)$ as $$\langle A, B \rangle = \mathrm{tr}(AB^T) = -\mathrm{tr}(AB).$$ (Recall that $\mathfrak{so}(n)$ consists of skew-symmetric matrices.)
Hamilton defined the Lie bracket on $\Lambda^2$ as $$ [\phi, \psi]_{ab} := \phi_{ac}\psi_{bc} - \psi_{ac}\phi_{bc},$$
which transfers to $\mathfrak{so}(n)$ as $$ [A, B] = AB^T - BA^T = BA - AB.$$
With all these definitions and identifications in place, we can now work out the Lie structure constants $c_{\gamma}^{\alpha \beta}$:
$[\phi^1, \phi^2] \leftrightarrow \displaystyle \frac{1}{\sqrt{2}} \cdot \frac{1}{\sqrt{2}} \scriptstyle\begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ -1 & 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} - \displaystyle \frac{1}{\sqrt{2}} \cdot \frac{1}{\sqrt{2}} \scriptstyle\begin{bmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}\begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ -1 & 0 & 0 \end{bmatrix} $
$=\frac{1}{2} \scriptstyle\begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & -1 & 0 \end{bmatrix} \displaystyle \leftrightarrow \frac{1}{\sqrt{2}} \phi^3$, and hence $\displaystyle c_{3}^{12} = \frac{1}{\sqrt{2}}$.
Recall (in fact it is not hard to prove, using the skew-symmetry of the bracket and the invariance of the inner product) that for an orthonormal basis the Lie structure constants are antisymmetric in any pair of indices. Hence, $c_{3}^{12} = c_{1}^{23} = c_{2}^{31} = \frac{1}{\sqrt{2}}$ and $c_{3}^{21} = c_{1}^{32} = c_{2}^{13} = -\frac{1}{\sqrt{2}}$. All the others have a repeated index and thus are $0$. Therefore, we recover the conclusion that $c_{\gamma}^{\alpha \beta} = \frac{1}{\sqrt{2}} \epsilon_{\gamma \alpha \beta}$.
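The computation above is easy to automate. Here is a short NumPy sketch (my own check, using exactly the basis matrices, bracket, and inner product defined above) that recovers $c_\gamma^{\alpha\beta} = \frac{1}{\sqrt{2}}\epsilon_{\gamma\alpha\beta}$ numerically:

```python
import numpy as np

s = 1.0 / np.sqrt(2.0)

# The orthonormal basis phi^1, phi^2, phi^3 of so(3) defined above.
phi = [
    s * np.array([[0.0, 1, 0], [-1, 0, 0], [0, 0, 0]]),
    s * np.array([[0.0, 0, 1], [0, 0, 0], [-1, 0, 0]]),
    s * np.array([[0.0, 0, 0], [0, 0, 1], [0, -1, 0]]),
]

def bracket(A, B):
    # Hamilton's convention: [A, B] = A B^T - B A^T = BA - AB for skew A, B.
    return B @ A - A @ B

def inner(A, B):
    # <A, B> = tr(A B^T), which equals -tr(AB) on skew-symmetric matrices.
    return float(np.trace(A @ B.T))

# c[g, a, b] = c_{g+1}^{(a+1)(b+1)} = <[phi^a, phi^b], phi^g>   (0-indexed)
c = np.array([[[inner(bracket(phi[a], phi[b]), phi[g])
                for b in range(3)]
               for a in range(3)]
              for g in range(3)])

print(c[2, 0, 1])   # c_3^{12} = 1/sqrt(2) ≈ 0.7071...
```

The same loop works verbatim for any orthonormal basis of $\mathfrak{so}(n)$, which is why I set it up this way.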
Now let's move on to the case $n = 4$.
Let $\{e_1, e_2, e_3, e_4\}$ be an orthonormal basis for the 1-forms. We could choose ${\{\phi^\alpha\}}_{\alpha = 1}^6$ as we did before, but then the final expression for $R^\#$ would be very cumbersome. Instead, Hamilton used the special fact that $\mathfrak{so}(4) = \mathfrak{so}(3) \oplus \mathfrak{so}(3)$ to choose the basis wisely. I'll skip the details about the Hodge star operator and so on; the takeaway is that we instead choose the following orthonormal basis ${\{\phi^\alpha\}}_{\alpha = 1}^6$ for the 2-forms $\Lambda^2$: $$\phi^1 = \frac{1}{2} (e_1 \wedge e_2 + e_3 \wedge e_4), \;\;\;\; \phi^2 = \frac{1}{2} (e_1 \wedge e_3 + e_4 \wedge e_2), \;\;\;\; \phi^3 = \frac{1}{2} (e_1 \wedge e_4 + e_2 \wedge e_3),$$ $$\phi^4 = \frac{1}{2} (e_2 \wedge e_1 + e_3 \wedge e_4), \;\;\;\; \phi^5 = \frac{1}{2} (e_3 \wedge e_1 + e_4 \wedge e_2), \;\;\;\; \phi^6 = \frac{1}{2} (e_4 \wedge e_1 + e_2 \wedge e_3).$$
The identification with $\mathfrak{so}(4)$ is similar. For instance, $$\phi^1 \leftrightarrow (\phi_{ab}^1) = \frac{1}{2}\scriptstyle \begin{bmatrix} 0 & 1 & 0 & 0 \\ -1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & -1 & 0 \end{bmatrix}, \;\;\;\; \displaystyle \phi^2 \leftrightarrow (\phi_{ab}^2) = \frac{1}{2}\scriptstyle \begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & -1 \\ -1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}. $$
One can check, using either one of the above two equivalent definitions of inner product, that indeed we have $\langle \phi^\alpha, \phi^\beta \rangle = \delta_{\alpha \beta}$.
Now for the Lie structure constants. The remarkable thing is that, because of our choice of the basis for $\Lambda^2$, most Lie brackets are $0$: $\{\phi^1, \phi^2, \phi^3\}$ spans the self-dual forms and $\{\phi^4, \phi^5, \phi^6\}$ the anti-self-dual forms, and these spans are exactly the two commuting $\mathfrak{so}(3)$ factors, so every mixed bracket $[\phi^\alpha, \phi^\beta]$ with $\alpha \leq 3 < \beta$ vanishes. Thus, the only nonzero Lie structure constants are $c_{3}^{12} = 1$, $c_{6}^{45} = 1$, and those resulting from permutations of indices in each of them.
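As a numerical sanity check (again my own, with the same conventions as above), the following sketch verifies that this basis is orthonormal, that all mixed brackets vanish, and that $c_3^{12} = c_6^{45} = 1$:

```python
import numpy as np

def wedge(i, j, n=4):
    # Matrix of e_i ^ e_j: (e_i ^ e_j)_{ab} = delta_{ia}delta_{jb} - delta_{ja}delta_{ib}  (0-indexed)
    M = np.zeros((n, n))
    M[i, j], M[j, i] = 1.0, -1.0
    return M

# Hamilton's basis of so(4): phi^1..phi^3 self-dual, phi^4..phi^6 anti-self-dual.
phi = [
    0.5 * (wedge(0, 1) + wedge(2, 3)),
    0.5 * (wedge(0, 2) + wedge(3, 1)),
    0.5 * (wedge(0, 3) + wedge(1, 2)),
    0.5 * (wedge(1, 0) + wedge(2, 3)),
    0.5 * (wedge(2, 0) + wedge(3, 1)),
    0.5 * (wedge(3, 0) + wedge(1, 2)),
]

def bracket(A, B):
    return B @ A - A @ B          # Hamilton's sign convention

def inner(A, B):
    return float(np.trace(A @ B.T))

# Orthonormality: <phi^a, phi^b> = delta_{ab}.
gram = np.array([[inner(phi[a], phi[b]) for b in range(6)] for a in range(6)])
print(np.allclose(gram, np.eye(6)))

# Mixed brackets between the two so(3) factors vanish identically.
mixed = max(np.abs(bracket(phi[a], phi[b])).max()
            for a in range(3) for b in range(3, 6))
print(mixed == 0.0)

# Structure constants within each factor: c_3^{12} = c_6^{45} = 1.
print(inner(bracket(phi[0], phi[1]), phi[2]),
      inner(bracket(phi[3], phi[4]), phi[5]))
```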
Thus, let's partition $R$ as $\begin{bmatrix} A & B \\ B^T & C \end{bmatrix}$, where $A, B, C$ are $3 \times 3$ matrices, with $A$ and $C$ symmetric.
If $\alpha, \beta \in \{1, 2, 3\}$, then $$(R^\#)_{\alpha \beta} = \sum\limits_{\gamma, \eta, \delta, \theta = 1}^6 c_\alpha^{\gamma \eta} c_\beta^{\delta \theta} R_{\gamma \delta} R_{\eta \theta} = \sum\limits_{\gamma, \eta, \delta, \theta = 1}^3 c_\alpha^{\gamma \eta} c_\beta^{\delta \theta} R_{\gamma \delta} R_{\eta \theta} = \epsilon_{\alpha \gamma \eta} \epsilon_{\beta \delta \theta} A_{\gamma \delta} A_{\eta \theta} = 2(\mathrm{adj} \: A)_{\alpha \beta}.$$
Similarly, if $\alpha, \beta \in \{4, 5, 6\}$, then letting $\bar{\alpha} = \alpha - 3$, $\bar{\beta} = \beta - 3$, and likewise for the summation indices, we have $$(R^\#)_{\alpha \beta} = \sum\limits_{\gamma, \eta, \delta, \theta = 4}^6 c_\alpha^{\gamma \eta} c_\beta^{\delta \theta} R_{\gamma \delta} R_{\eta \theta} = \epsilon_{\bar{\alpha} \bar{\gamma} \bar{\eta}} \epsilon_{\bar{\beta} \bar{\delta} \bar{\theta}} C_{\bar{\gamma} \bar{\delta}} C_{\bar{\eta} \bar{\theta}} = 2(\mathrm{adj} \: C)_{\bar{\alpha} \bar{\beta}}.$$
If $\alpha \in \{1, 2, 3\}$ and $\beta \in \{4, 5, 6\}$, then letting $\bar{\beta} = \beta - 3$ (and barring the summation indices $\delta, \theta$ in the same way), we have (noting that $B$ is not symmetric, in general) $$(R^\#)_{\alpha \beta} = \sum\limits_{\gamma, \eta = 1}^3 \sum\limits_{\delta, \theta = 4}^6 c_\alpha^{\gamma \eta} c_\beta^{\delta \theta} R_{\gamma \delta} R_{\eta \theta} = \epsilon_{\alpha \gamma \eta} \epsilon_{\bar{\beta} \bar{\delta} \bar{\theta}} B_{\gamma \bar{\delta}} B_{\eta \bar{\theta}} = 2(\mathrm{adj} \: B)_{\bar{\beta} \alpha}.$$
Similarly, if $\beta \in \{1, 2, 3\}$ and $\alpha \in \{4, 5, 6\}$, then letting $\bar{\alpha} = \alpha - 3$, we have $$(R^\#)_{\alpha \beta} = \sum\limits_{\gamma, \eta = 4}^6 \sum\limits_{\delta, \theta = 1}^3 c_\alpha^{\gamma \eta} c_\beta^{\delta \theta} R_{\gamma \delta} R_{\eta \theta} = \epsilon_{\bar{\alpha} \bar{\gamma} \bar{\eta}} \epsilon_{\beta \delta \theta} (B^T)_{\bar{\gamma} \delta} (B^T)_{\bar{\eta} \theta} = 2(\mathrm{adj} \: B)_{\bar{\alpha} \beta}.$$
Therefore, depending on which form you like best, $$\begin{bmatrix} A & B \\ B^T & C \end{bmatrix}^\# = 2\begin{bmatrix} \mathrm{adj} \: A & (\mathrm{adj} \: B)^T \\ \mathrm{adj} \: B & \mathrm{adj} \: C \end{bmatrix} = 2\begin{bmatrix} A^\# & B^\# \\ (B^\#)^T & C^\# \end{bmatrix},$$ or if you will,
$\scriptstyle \begin{bmatrix}\begin{array}{@{}c|c@{}} \begin{matrix} a & b & c \\ b & g & h \\ c & h & l \end{matrix} & \begin{matrix} d & e & f \\ i & j & k \\ m & n & o \end{matrix} \\ \hline \begin{matrix} d & i & m \\ e & j & n \\ f & k & o \end{matrix} & \begin{matrix} p & q & r \\ q & s & t \\ r & t & u \end{matrix} \end{array} \end{bmatrix}^\# = \; 2\begin{bmatrix}\begin{array}{@{}c|c@{}} \begin{matrix} gl-h^2 & ch-bl & bh-cg \\ ch-bl & al-c^2 & bc-ah \\ bh-cg & bc-ah & ag-b^2 \end{matrix} & \begin{matrix} jo-kn & km-io & in-jm \\ fn-eo & do-fm & em-dn \\ ek-fj & fi-dk & dj-ei \end{matrix} \\ \hline \begin{matrix} jo-kn & fn-eo & ek-fj \\ km-io & do-fm & fi-dk \\ in-jm & em-dn & dj-ei \end{matrix} & \begin{matrix} su-t^2 & rt-qu & qt-rs \\ rt-qu & pu-r^2 & qr-pt \\ qt-rs & qr-pt & ps-q^2 \end{matrix} \end{array} \end{bmatrix}$.
As a reminder, we have $M^\# = (\mathrm{adj} \: M)^T$ for a general $3 \times 3$ matrix $M$, but when $M$ is symmetric, we also have $M^\# = \mathrm{adj} \: M$. By $\mathrm{adj} \: M$, I mean the adjugate/adjunct matrix, which is the transpose of the cofactor matrix.
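The block formula can also be checked numerically. The following NumPy sketch (my own verification, not from Hamilton's paper) computes $R^\#$ directly from the defining sum over structure constants and compares it with $2\begin{bmatrix} \mathrm{adj}\, A & (\mathrm{adj}\, B)^T \\ \mathrm{adj}\, B & \mathrm{adj}\, C \end{bmatrix}$ for a random symmetric $6 \times 6$ matrix:

```python
import numpy as np

def wedge(i, j, n=4):
    M = np.zeros((n, n))
    M[i, j], M[j, i] = 1.0, -1.0
    return M

# Hamilton's basis of so(4), as above.
phi = [0.5 * (wedge(0, 1) + wedge(2, 3)), 0.5 * (wedge(0, 2) + wedge(3, 1)),
       0.5 * (wedge(0, 3) + wedge(1, 2)), 0.5 * (wedge(1, 0) + wedge(2, 3)),
       0.5 * (wedge(2, 0) + wedge(3, 1)), 0.5 * (wedge(3, 0) + wedge(1, 2))]

def inner(A, B):
    return np.trace(A @ B.T)

# c[g, a, b] = c_g^{ab} = <[phi^a, phi^b], phi^g>, with [A, B] = BA - AB.
c = np.array([[[inner(phi[b] @ phi[a] - phi[a] @ phi[b], phi[g])
                for b in range(6)] for a in range(6)] for g in range(6)])

def adj(M):
    # Adjugate of an invertible matrix: adj(M) = det(M) M^{-1}.
    return np.linalg.det(M) * np.linalg.inv(M)

rng = np.random.default_rng(0)
S = rng.standard_normal((6, 6))
R = (S + S.T) / 2          # random symmetric bilinear form on Lambda^2

# (R^#)_{ab} = c_a^{ge} c_b^{dt} R_{gd} R_{et}
Rsharp = np.einsum('age,bdt,gd,et->ab', c, c, R, R)

A, B, C = R[:3, :3], R[:3, 3:], R[3:, 3:]
block = 2 * np.block([[adj(A), adj(B).T], [adj(B), adj(C)]])
print(np.allclose(Rsharp, block))
```

(The fixed seed just makes the check reproducible; the blocks $A$, $B$, $C$ are generically invertible, which the `adj` shortcut assumes.)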
For $n \geq 5$, the Lie algebra $\mathfrak{so}(n)$ is simple, so there is no splitting analogous to $\mathfrak{so}(4) = \mathfrak{so}(3) \oplus \mathfrak{so}(3)$ to take advantage of, and the explicit expression for $R^\#$ will in general be very messy; still, the steps I provided here should indicate how one would proceed.
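Even so, everything can be computed numerically for any $n$. Here is a sketch (again my own, assuming NumPy; the orthonormal basis $\frac{1}{\sqrt{2}} e_i \wedge e_j$ for $i < j$ is the obvious generalization of the $n = 3$ choice) that builds the structure constants of $\mathfrak{so}(5)$ and evaluates $R^\#$ on a given symmetric $10 \times 10$ matrix:

```python
import numpy as np
from itertools import combinations

def so_basis(n):
    # Orthonormal basis (1/sqrt 2) e_i ^ e_j, i < j, of so(n) under <A,B> = tr(A B^T).
    basis = []
    for i, j in combinations(range(n), 2):
        M = np.zeros((n, n))
        M[i, j], M[j, i] = 1.0, -1.0
        basis.append(M / np.sqrt(2.0))
    return basis

def structure_constants(basis):
    # c[g, a, b] = <[phi^a, phi^b], phi^g>, with Hamilton's bracket [A, B] = BA - AB.
    N = len(basis)
    c = np.zeros((N, N, N))
    for a in range(N):
        for b in range(N):
            comm = basis[b] @ basis[a] - basis[a] @ basis[b]
            for g in range(N):
                c[g, a, b] = np.trace(comm @ basis[g].T)
    return c

def sharp(R, c):
    # (R^#)_{ab} = c_a^{ge} c_b^{dt} R_{gd} R_{et}
    return np.einsum('age,bdt,gd,et->ab', c, c, R, R)

c5 = structure_constants(so_basis(5))    # a 10 x 10 x 10 array for n = 5
rng = np.random.default_rng(1)
S = rng.standard_normal((10, 10))
R = (S + S.T) / 2
Rsharp = sharp(R, c5)
print(np.allclose(Rsharp, Rsharp.T))     # R^# inherits the symmetry of R
```

For $n = 3$ the same functions reproduce the adjugate formula from the beginning of the post, which is a convenient consistency check on the conventions.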