Why, in logistic regression, is the decision boundary a hyperplane for every threshold?


I'm a beginner in machine learning and I can't figure out an exercise:

The function h(x) = θ(w̃ᵀx), where θ is the sigmoid function, is used to approximate P(y = +1 | x). We predict y = +1 if h(x) > T and y = -1 if h(x) < T, for a certain threshold T ∈ [0, 1]. If h(x) = T, then x lies on the decision boundary. The question is: how can I mathematically prove that for any threshold T the decision boundary is a hyperplane?
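To make the question concrete, here is a small numerical sketch I put together, assuming θ is the standard logistic sigmoid and that x is 2-dimensional (the weights `w` and threshold `T` below are made-up values). Since the sigmoid is strictly increasing and invertible, h(x) = T should be equivalent to the linear condition w̃ᵀx = θ⁻¹(T) = log(T / (1 − T)), which in 2D is a line:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(t):
    # Inverse of the sigmoid on (0, 1).
    return np.log(t / (1.0 - t))

# Hypothetical 2D weight vector and threshold for illustration.
w = np.array([2.0, -1.0])
T = 0.7

# Because sigmoid is strictly increasing, h(x) = T  <=>  w.x = logit(T),
# a linear equation in x, i.e. a hyperplane (a line in 2D).
c = logit(T)

# Parametrize that line: choose x1 freely, solve w[0]*x0 + w[1]*x1 = c for x0.
x1 = np.linspace(-5, 5, 11)
x0 = (c - w[1] * x1) / w[0]
points = np.stack([x0, x1], axis=1)

# Every point on the line maps to exactly T under h.
h = sigmoid(points @ w)
print(np.allclose(h, T))  # True
```

This checks numerically that the level set {x : h(x) = T} coincides with a line, but I'm looking for the general mathematical proof.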