Here's an image taken from the article: Frequentism and Bayesianism IV: How to be a Bayesian in Python.
Since I can't add images, here's the link:
It depicts lines generated with slopes between 0 and 10 in steps of 0.1.
What accounts for the bunching of the lines with higher slopes?
Background: In the article, the author provides this as a pedagogical example of why one should not automatically reach for flat priors.
Thank you.
I'll use $\alpha_n$ to denote the angle that the $n$-th line makes with the $x$-axis. The fact that the slopes are equally spaced from $0$ to $10$ means that the tangents of the angles are equally spaced, because the slope of the $n$-th line is $\tan\alpha_n$. But equally spaced tangents don't mean equally spaced angles.

If you look at the graph of the tangent function, $x\mapsto\tan x$, you'll see that it gets extremely steep when $x$ is slightly less than $\pi/2$. This means that, when the angles are just slightly below a right angle (i.e., when the lines are almost vertical), a very small change in angle makes a huge difference to the tangent of that angle. Equivalently, ordinary-sized changes in the tangent come from really tiny changes in the angle.

That's what you're seeing in the picture. For large $n$, the difference of tangents, $\tan\alpha_{n+1}-\tan\alpha_n$, is always $0.1$, but the corresponding difference of angles, $\alpha_{n+1}-\alpha_n$, is tiny. So the lines look bunched together.
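You can check this numerically. The sketch below (my own, not from the article) takes the same equally spaced slopes and computes the angles $\alpha_n = \arctan(\text{slope}_n)$ with NumPy; the gaps between consecutive angles shrink steadily as the slope grows, which is exactly the bunching in the figure.

```python
import numpy as np

# Slopes equally spaced from 0 to 10 in steps of 0.1, as in the figure.
slopes = np.arange(0, 10.1, 0.1)

# Angle each line makes with the x-axis: alpha_n = arctan(slope_n).
angles = np.degrees(np.arctan(slopes))

# Gaps between consecutive angles shrink as the slopes grow,
# even though the gaps between consecutive slopes are constant (0.1).
gaps = np.diff(angles)
print(f"angle gap near slope 0:  {gaps[0]:.3f} degrees")
print(f"angle gap near slope 10: {gaps[-1]:.4f} degrees")
```

The first gap is about $5.7^\circ$ (since $\arctan 0.1 \approx 0.0997$ rad), while the last is under $0.06^\circ$: near-vertical lines are packed roughly a hundred times more densely in angle than near-horizontal ones.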