I was reading this article on the flipout method https://arxiv.org/pdf/1803.04386.pdf, and on page $4$ the authors provide an equation describing the activations in one layer of a neural network.
$r_n$ and $s_n$ are two random vectors of $\pm 1$ entries whose outer product forms a random sign matrix, giving: $$y_n= \phi (W^T x_n)= \phi \biggl(\bigl(\bar{W}+ \widehat{\Delta W}\circ r_n s_n^T\bigr)^T x_n\biggr)= \phi \biggl(\bar{W}^T x_n+ \bigl( \widehat{\Delta W}^T (x_n \circ s_n)\bigr)\circ r_n \biggr) $$
where $\circ$ denotes the element-wise (Hadamard) product and the subscript $n$ denotes the $n^{\text{th}}$ example of the mini-batch. I do not understand which property is used on the right-hand side to go from the second expression to the last one.
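
To make the question concrete, here is a small NumPy sketch that checks the equality numerically. The shapes are my own assumption (not stated explicitly above): I take $\widehat{\Delta W}$ and $\bar{W}$ to have shape (input $\times$ output), $s_n$ to have the input dimension, and $r_n$ the output dimension, so that the rank-one sign matrix and the products typecheck. The dimensions and variable names are just for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out = 5, 3                             # arbitrary layer sizes, for illustration
W_bar  = rng.normal(size=(d_in, d_out))        # shared mean weights  \bar{W}
dW_hat = rng.normal(size=(d_in, d_out))        # base perturbation    \widehat{\Delta W}
x      = rng.normal(size=d_in)                 # one input            x_n
s      = rng.choice([-1.0, 1.0], size=d_in)    # sign vector on the input side   s_n
r      = rng.choice([-1.0, 1.0], size=d_out)   # sign vector on the output side  r_n

# second expression: materialise the full per-example weight matrix
W_pert = W_bar + dW_hat * np.outer(s, r)       # rank-one sign matrix with entries s_i r_j
lhs = W_pert.T @ x

# last expression: no per-example weight matrix is ever built
rhs = W_bar.T @ x + (dW_hat.T @ (x * s)) * r

print(np.allclose(lhs, rhs))                   # prints True
```

The two sides agree numerically, so my question is which algebraic identity for the Hadamard product with a rank-one matrix justifies this rewriting.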
