For logistic regression, $h_\theta(x)=\frac{1}{1 + e^{-\theta^{T}x}}$ is the sigmoid function, so $1 - h_\theta(x) = \frac{1}{1 + e^{\theta^{T}x}}$ and $-\log(1 - h_\theta(x)) = \log(1 + e^{\theta^{T}x})$. As $h_\theta(x) \to 0$, we have $\theta^{T}x \to -\infty$, so $e^{\theta^{T}x} \to 0$ and the cost tends to $\log 1 = 0$, not $-\infty$. This makes sense: the cost should be at its minimum for a correct prediction, i.e. when $h_\theta(x)$ matches the actual $y$ value (here $y = 0$).
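The limit above can be checked numerically. This is a minimal sketch (the helper names `sigmoid` and `cost_y0` are illustrative, not from any particular library) showing that the $y=0$ cost $-\log(1 - h_\theta(x)) = \log(1 + e^{\theta^{T}x})$ shrinks toward $0$ as $\theta^{T}x \to -\infty$:

```python
import math

def sigmoid(z):
    # h_theta(x) = 1 / (1 + e^{-z}), where z = theta^T x
    return 1.0 / (1.0 + math.exp(-z))

def cost_y0(z):
    # Cost for a y = 0 example: -log(1 - h) = log(1 + e^z).
    # log1p keeps this numerically stable when e^z is tiny.
    return math.log1p(math.exp(z))

# As z -> -infinity, h -> 0 and the cost -> 0, not -infinity.
for z in [0.0, -2.0, -5.0, -10.0]:
    print(f"z = {z:6.1f}  h = {sigmoid(z):.6f}  cost = {cost_y0(z):.6f}")
```

Both `sigmoid(z)` and `cost_y0(z)` decay toward $0$ together, which matches the argument: a confident, correct prediction of $y = 0$ incurs essentially no cost.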