Given $n$ points in $\mathbb{R}^2$ (I suspect the general case of $\mathbb{R}^n$ is not essentially different), find parameters $A$ and $v$ such that the sum of distances from the points to the line $A+tv$ is minimized.
I know the classic problem of finding the line that minimizes the sum of the *squares* of the distances, but for this problem I could not find a solution.
Given the points $\{(t_i,y_i)\}$, your cost function (the sum of squares of the perpendicular distances) is $$C=\sum_{i=1}^n\frac{(y_i-vt_i-A)^2}{1+v^2}=\frac{1}{1+v^2}\sum_{i=1}^n(y_i-vt_i-A)^2.$$ For reference, see the Wikipedia article on the distance from a point to a line, applied to the line $y=vt+A$ rearranged as $y-vt-A=0$. This cost is nonlinear in the coefficients (in $v$, at least), which makes it different from ordinary linear least-squares regression, where you minimize vertical distances. Let's compute the partial derivatives: \begin{align*} \frac{\partial C}{\partial A}&=-\frac{2}{1+v^2}\sum_{i=1}^n(y_i-vt_i-A), \\ \frac{\partial C}{\partial v}&=\frac{(1+v^2)(-2)\sum_{i=1}^n(t_i)(y_i-vt_i-A)-(2v)\sum_{i=1}^n(y_i-vt_i-A)^2}{(1+v^2)^2}. \end{align*} Setting these equal to zero produces a few simplifications: \begin{align*} 0&=\frac{\partial C}{\partial A}=-\frac{2}{1+v^2}\sum_{i=1}^n(y_i-vt_i-A) \\ 0&=\frac{\partial C}{\partial v}=-2\,\frac{(1+v^2)\sum_{i=1}^n[t_i(y_i-vt_i-A)]+v\sum_{i=1}^n(y_i-vt_i-A)^2}{(1+v^2)^2} \\ \\ 0&=\sum_{i=1}^n(y_i-vt_i-A) \\ 0&=(1+v^2)\sum_{i=1}^n[t_i(y_i-vt_i-A)]+v\sum_{i=1}^n(y_i-vt_i-A)^2. \end{align*} You can then solve this system numerically, or minimize $C$ directly with gradient descent (or stochastic gradient descent).
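To make the gradient-descent suggestion concrete, here is a minimal pure-Python sketch that minimizes $C(A,v)$ using exactly the two partial derivatives above. The sample data, step size, and iteration count are illustrative choices on my part, not prescribed by the derivation.

```python
def grad_C(A, v, pts):
    """Gradient of C = sum((y - v*t - A)^2) / (1 + v^2)."""
    r = [y - v * t - A for t, y in pts]          # residuals y_i - v t_i - A
    s = 1.0 + v * v                              # the factor 1 + v^2
    dA = -2.0 / s * sum(r)
    dv = -2.0 * (s * sum(t * ri for (t, _), ri in zip(pts, r))
                 + v * sum(ri * ri for ri in r)) / (s * s)
    return dA, dv

def fit_line(pts, A=0.0, v=0.0, lr=0.01, steps=20000):
    """Plain gradient descent on C starting from (A, v)."""
    for _ in range(steps):
        dA, dv = grad_C(A, v, pts)
        A, v = A - lr * dA, v - lr * dv
    return A, v

# Points lying exactly on y = 2t + 1, so the minimizer should recover A=1, v=2.
pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
A, v = fit_line(pts)
```

Note that $C$ need not be convex in $(A,v)$, so in general the starting point and step size matter; for this toy data the descent settles at the perfect fit.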
It's highly doubtful that you could get an analytic solution as simple as the normal equations for linear regression. However, the first equation is linear in $A$, and it's straightforward to solve: \begin{align*} 0&=\sum_{i=1}^n(y_i-vt_i-A) \\ 0&=\sum_{i=1}^n(y_i-vt_i)-nA \\ nA&=\sum_{i=1}^n(y_i-vt_i) \\ A&=\frac1n \sum_{i=1}^n(y_i-vt_i). \end{align*} With a view toward plugging into the other equation, we need to rename the summation variable: $$A=\frac1n \sum_{j=1}^n(y_j-vt_j). $$ Substituting into the other equation yields $$0=(1+v^2)\sum_{i=1}^n\left[t_i\left(y_i-vt_i-\frac1n \sum_{j=1}^n(y_j-vt_j)\right)\right]+v\sum_{i=1}^n\left[y_i-vt_i-\frac1n \sum_{j=1}^n(y_j-vt_j)\right]^{\!2}. $$ You may or may not have gained anything by the substitution.
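One thing the substitution does buy you is a one-variable problem: with $A$ eliminated, the remaining equation in $v$ can be handled by any scalar root-finder. As a hypothetical sketch, here is a simple bisection on that equation in pure Python; the bracket $[0,3]$ and the sample points are illustrative assumptions (bisection requires a sign change on the bracket).

```python
def g(v, pts):
    """LHS of the substituted equation:
    (1 + v^2) * sum t_i r_i + v * sum r_i^2, with A eliminated."""
    n = len(pts)
    A = sum(y - v * t for t, y in pts) / n       # A = (1/n) sum_j (y_j - v t_j)
    r = [y - v * t - A for t, y in pts]
    return ((1 + v * v) * sum(t * ri for (t, _), ri in zip(pts, r))
            + v * sum(ri * ri for ri in r))

def solve_v(pts, lo, hi, iters=100):
    """Bisection; assumes g changes sign on [lo, hi]."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if (g(lo, pts) > 0) == (g(mid, pts) > 0):
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Points on y = 2t + 1: the root in [0, 3] should be v = 2, and then A = 1.
pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
v = solve_v(pts, 0.0, 3.0)
A = sum(y - v * t for t, y in pts) / len(pts)
```

Since stationary points of $C$ include both the best-fit and other directions, it is worth checking that the root found actually minimizes $C$ (e.g. by comparing cost values) rather than trusting a single bracket.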