I'm reading an article and I'm having some trouble understanding the basics of its notation.
The article is about detecting changepoints with a Bayesian approach. We have a data vector $Y = \{y_1, y_2,\ldots, y_n\}$, and if there are $k$ changes at times $t_1,\ldots, t_k$, we split $Y$ into segments $(y_1,\ldots,y_{t_1-1}), (y_{t_1},\ldots,y_{t_2-1}),\ldots, (y_{t_k},\ldots,y_n)$. This is where my problem begins. In the article we read:
In a Bayesian formulation the joint posterior distribution for the latent changepoint indicator vector $z$ and segment parameters $\theta = \{\theta_1,\ldots,\theta_{k+1}\}$ can be written as a product of the full segment likelihood $(2.2)$ and the priors for $z$ and $\theta$, (...)
where $\theta$ is the parameter vector of the segment likelihoods.
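To make my question concrete: I believe the equation that follows that sentence has roughly this general shape (this is my paraphrase of a standard Bayesian factorization, not a verbatim copy from the article, so the notation may differ):

```latex
% My paraphrase of the factorization described in the quoted sentence
% (NOT the article's exact equation):
%   joint posterior  \propto  segment likelihood (2.2)  x  priors
\pi(z, \theta \mid Y) \;\propto\; f(Y \mid z, \theta)\,\pi(z)\,\pi(\theta)
% The \pi's and the \propto are exactly the pieces I don't understand;
% \propto may be the symbol I am misreading in the article.
```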
I'm not a native English speaker, so I don't really understand that sentence, nor the equation that follows it: the $\pi$ function comes out of nowhere (for me), and what does the "$\alpha$" symbol mean?
Greetings