I have a family of lines $y_i= a_ix+b_i$ where the pairs $(a_i,b_i)$ are generated by a random process.
I can fix $x$ and generate a family of points $y_i$ for which I can estimate the mean and the confidence bound. If I repeat this for all $x$, I can generate a function $\bar{y} = f(x)$ describing the mean value for each $x$, and another function $\delta_{y,c} = g(x)$ where $c$ is a confidence bound (e.g. 95%), so that for any value of $x$ I have a probability equal to $c$ that $a_ix+b_i$ falls within $\bar{y}(x) \pm \delta_{y,c}(x)$.
As an engineer, I'm currently solving this problem by computing the mean and the confidence interval for a large enough set of $x$ values and then interpolating between them.
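For concreteness, here is a minimal sketch of that numerical approach: draw samples $(a_i, b_i)$, evaluate $y_i = a_i x + b_i$ on a grid of $x$ values, estimate the mean and the empirical 95% band at each grid point, and interpolate in between. The mean and covariance values are illustrative assumptions, not part of the question.

```python
# Sketch of the numerical (grid + interpolation) approach described above.
# The parameters of (a, b) below are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(0)
mean = [1.0, 0.5]                    # assumed E[a], E[b]
cov = [[0.04, 0.01], [0.01, 0.09]]   # assumed covariance of (a, b)
a, b = rng.multivariate_normal(mean, cov, size=100_000).T

xs = np.linspace(-3, 3, 61)          # grid of x values
Y = np.outer(a, xs) + b[:, None]     # y_i(x) for every sample and grid point
y_bar = Y.mean(axis=0)               # estimated mean function on the grid
lo, hi = np.percentile(Y, [2.5, 97.5], axis=0)  # empirical 95% band

# Interpolate between grid points for an arbitrary x
x_query = 1.234
print(np.interp(x_query, xs, y_bar))
```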
However, is there a way to compute the functions $f(x)$ and $g(x)$ analytically, without having to iterate over all possible values of $x$ (which are in theory infinite)?
For the sake of simplicity, let's assume $a_i$ and $b_i$ are jointly normally distributed (but possibly correlated).
I'm posting here the answer that was provided by @fedja in a comment above.
Just use the fact that in the jointly Gaussian case, for any fixed $x$, the corresponding $y$ is normal with mean $xE[a]+E[b]$ and variance $x^2V[a]+V[b]+2x\,\mathrm{Cov}[a,b]$, and then use the standard table of z-values to determine the confidence interval for $y$.
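This answer translates directly into closed-form $f(x)$ and $g(x)$. A minimal sketch, using the same illustrative (assumed) parameters as above and the standard-normal quantile in place of a z-table:

```python
# Analytic mean and confidence half-width for y = a*x + b with (a, b)
# jointly Gaussian. Parameter values are illustrative assumptions.
from statistics import NormalDist

mu_a, mu_b = 1.0, 0.5     # assumed E[a], E[b]
var_a, var_b = 0.04, 0.09  # assumed V[a], V[b]
cov_ab = 0.01              # assumed Cov[a, b] (correlation allowed)

def f(x):
    """Mean of y at x: E[y] = x*E[a] + E[b]."""
    return x * mu_a + mu_b

def g(x, c=0.95):
    """Half-width of the two-sided level-c band: z * sqrt(Var[y](x))."""
    z = NormalDist().inv_cdf(0.5 + c / 2)   # ~1.96 for c = 0.95
    var_y = x**2 * var_a + var_b + 2 * x * cov_ab
    return z * var_y**0.5

# Band at x = 2: f(x) - g(x) <= y <= f(x) + g(x) with probability c
x = 2.0
print(f(x) - g(x), f(x) + g(x))
```

No iteration over $x$ is needed: $f$ and $g$ are evaluated exactly at any query point.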