I have a 3D half-plane defined using a line segment and a point (as shown in the picture taken from here).
I am wondering how I can detect whether a point belongs to the half-plane. Is there any way to calculate the minimum distance between a point and the half-plane?
Let $p$ and $v$ be the points determining the initial line segment, and let $w$ be the third point, as suggested by your image. With this notation, $w-p$ and $v-p$ represent the top and bottom arrows in the image, respectively.
A normal vector to your plane is $$n = (v-p)\times(w-p),$$ (where $\times$ stands for the vector cross product), and the correct half of the plane is determined by the component of $(w-p)$ perpendicular to your segment direction $(v-p)$. The latter can be computed in terms of the unit vector $\widehat{ v-p } := \frac{v-p}{\| v-p\|}$, namely $$d = (w-p) - \langle w-p, \widehat{v-p} \rangle \widehat{v-p},$$ where $\langle \cdot\,, \cdot \rangle$ stands for the scalar (dot) product.
In terms of these you can check the two conditions that $x$ needs to satisfy in order to belong to your half-plane: it must lie in the plane, $$\langle x - p, n \rangle = 0,$$ and it must lie on the correct side of the boundary line, $$\langle x - p, d \rangle \ge 0.$$ A code sketch of this test follows below.
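Here is a minimal sketch of the membership test in Python/NumPy, assuming $p$, $v$, $w$, and the query point $x$ are given as 3D coordinate arrays; the function name `in_half_plane` and the tolerance `eps` for floating-point comparison are my own choices, not from the original post.

```python
import numpy as np

def in_half_plane(x, p, v, w, eps=1e-9):
    """Check whether x lies in the half-plane defined by the
    segment p-v and the point w (all 3D numpy arrays)."""
    seg = v - p                          # direction of the boundary segment
    seg_hat = seg / np.linalg.norm(seg)  # unit vector along the segment
    n = np.cross(seg, w - p)             # normal to the containing plane
    # component of (w - p) perpendicular to the segment; points into the half-plane
    d = (w - p) - np.dot(w - p, seg_hat) * seg_hat

    r = x - p
    in_plane = abs(np.dot(r, n)) <= eps * np.linalg.norm(n) * np.linalg.norm(r)
    correct_side = np.dot(r, d) >= -eps
    return in_plane and correct_side

# example: boundary line along the x-axis, half-plane on the +y side
p = np.array([0.0, 0.0, 0.0])
v = np.array([1.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0])
print(in_half_plane(np.array([0.5, 2.0, 0.0]), p, v, w))   # True
print(in_half_plane(np.array([0.5, -1.0, 0.0]), p, v, w))  # False
```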
Note (after the OP's edit).
I would expect calculating the minimum distance from a point to the half-plane to be computationally more expensive than verifying its membership; please correct me if I'm wrong.
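If the distance is nevertheless needed, one way to compute it is the following sketch, under the assumption that the half-plane is the closed half of the plane bounded by the full line through $p$ and $v$: project $x$ onto the plane; if the projection falls on the correct side of the boundary line, the distance is just the out-of-plane component, otherwise the closest point lies on the boundary line itself.

```python
import numpy as np

def dist_to_half_plane(x, p, v, w):
    """Minimum distance from x to the closed half-plane bounded by the
    line through p and v, on the side of w (illustrative sketch)."""
    seg = v - p
    seg_hat = seg / np.linalg.norm(seg)
    n = np.cross(seg, w - p)
    n_hat = n / np.linalg.norm(n)
    d = (w - p) - np.dot(w - p, seg_hat) * seg_hat

    r = x - p
    out_of_plane = np.dot(r, n_hat)       # signed distance to the plane
    proj = r - out_of_plane * n_hat       # projection of r into the plane

    if np.dot(proj, d) >= 0:
        # projection lands inside the half-plane: distance is purely out-of-plane
        return abs(out_of_plane)
    # otherwise the closest point is on the boundary line through p and v
    along = np.dot(r, seg_hat) * seg_hat
    return np.linalg.norm(r - along)
```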