IPOPT uses alphaMin = 1.0e-8, i.e., IPOPT gives up its backtracking line-search procedure whenever alpha falls below alphaMin. I'd like a parameter that is less sensitive to the scaling of the problem. I thought of $$ \alpha_{\min} = 10^{-6} \cdot \frac{ |\phi'| }{ \|x\| }$$ or $$ \alpha_{\min} = 10^{-6} \cdot \frac{ |\phi'| }{ |\phi| }$$ or $$ \alpha_{\min} = 10^{-6} \cdot \frac{ |\phi'| }{ \|\Delta x\| }$$ where $\phi$ is the cost or merit function (whatever context you want to consider) and $\phi'$ is its directional derivative along the step (taken in magnitude, since it is negative for a descent direction). The $10^{-6}$ is of course an arbitrary, reasonably small constant.
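To make the idea concrete, here is a minimal sketch of an Armijo backtracking line search using the third proposal, $\alpha_{\min} = 10^{-6} \cdot |\phi'| / \|\Delta x\|$. The function names, signature, and constants are illustrative only (this is not IPOPT's actual interface):

```python
import numpy as np

def backtracking(phi, grad_phi, x, dx, c=1e-4, rho=0.5, scale=1e-6):
    """Armijo backtracking with a scale-aware minimum step size.

    alpha_min = scale * |phi'(0)| / ||dx||, where phi'(0) is the
    directional derivative of phi along dx at x (third proposal above).
    Returns the accepted step length, or None if the search gives up.
    """
    slope = grad_phi(x) @ dx                     # phi'(0), negative for a descent direction
    alpha_min = scale * abs(slope) / np.linalg.norm(dx)
    phi0 = phi(x)
    alpha = 1.0                                  # full Newton step first
    while alpha >= alpha_min:
        if phi(x + alpha * dx) <= phi0 + c * alpha * slope:
            return alpha                         # sufficient decrease achieved
        alpha *= rho                             # backtrack
    return None                                  # alpha fell below alpha_min
```

Note that for a well-scaled Newton step on a nearly quadratic function, the full step alpha = 1 is accepted immediately, so the cutoff only matters in the badly scaled or ill-conditioned regime the question is concerned with.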
The algorithms my question focuses on are Newton-type methods. Clearly, basing alphaMin on the alpha accepted in the previous step is futile: successive steps are approximately conjugate, so information from previous steps is useless for the present one.