Let $0<\alpha\le \beta$. I want to solve the minimax problem $$\min_{x\in\mathbb{R}\setminus\{0\}}\max_{t\in[\alpha,\beta]}\left | \prod_{i=1}^{n}(1-xt)\right |$$
Please note that I have no experience in optimization.
My attempt:
Since $$\left | \prod_{i=1}^{n}(1-xt)\right | = \prod_{i=1}^{n}|1-xt|=|1-xt|^n$$ and $y\mapsto y^n$ is strictly increasing on the nonnegative reals (so it does not change the minimizer), I can instead consider $$\min_{x\in\mathbb{R}\setminus\{0\}}\max_{t\in[\alpha,\beta]}\left | 1-xt\right |$$
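This reduction can be sanity-checked numerically. The following is a minimal sketch with sample values $\alpha=1$, $\beta=3$, $n=3$ (the grids and sample values are my own choice, not part of the question): the grid minimizer of $\max_t|1-xt|^n$ coincides with that of $\max_t|1-xt|$.

```python
# Sanity check of the reduction |prod (1 - x t)| = |1 - x t|^n:
# since y -> y^n is increasing on [0, inf), both objectives share a minimizer.
alpha, beta, n = 1.0, 3.0, 3

ts = [alpha + (beta - alpha) * i / 200 for i in range(201)]      # grid on [alpha, beta]
xs = [-1.0 + 3.0 * i / 600 for i in range(601) if i != 200]      # grid on [-1, 2], skip x = 0

def inner_max(x, power):
    # max over the t-grid of |1 - x t|^power
    return max(abs(1 - x * t) ** power for t in ts)

x_n = min(xs, key=lambda x: inner_max(x, n))   # minimizer for the n-fold product
x_1 = min(xs, key=lambda x: inner_max(x, 1))   # minimizer for |1 - x t|
print(x_n, x_1)  # both near 2/(alpha+beta) = 0.5
```

Both grid searches land on the same point, as the monotonicity argument predicts.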
Now intuitively, I would argue that $x_*=\frac{1}{\operatorname{avg}(\alpha,\beta)}=\frac{2}{\alpha+\beta}$ is the solution.
Is this correct? How would one solve this problem using optimization theory, and how could I prove it without any theory?
EDIT: I got the hint (from the comments) to instead optimize $$\min_{x\in\mathbb{R}\setminus\{0\}}|x|\max_{t\in[\alpha,\beta]}\left | 1/x-t\right |$$ and first find a closed form for the inner maximum.
I did not use any analysis for this, but used graphs to determine that the maximizer $t_*$ of the inner term is given by $$t_*=\begin{cases}\alpha, & \text{if } \frac{1}{x}>\frac{\alpha+\beta}{2},\\ \beta, & \text{if } \frac{1}{x}\le\frac{\alpha+\beta}{2},\end{cases}$$ i.e. the endpoint of $[\alpha,\beta]$ farther from $1/x$.
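As a quick check of this inner maximization (a sketch with sample values $\alpha=1$, $\beta=3$ of my own choosing), a grid search confirms that $\max_{t\in[\alpha,\beta]}|1/x-t|$ is attained at whichever endpoint lies farther from $1/x$:

```python
# Check: the maximizer of |1/x - t| over t in [alpha, beta] is the endpoint
# farther from 1/x, i.e. alpha when 1/x > (alpha+beta)/2, else beta.
alpha, beta = 1.0, 3.0
ts = [alpha + (beta - alpha) * i / 400 for i in range(401)]  # grid on [alpha, beta]

for x in [0.3, 0.4, 0.45, 0.55, 0.7, 5.0, -2.0]:             # sample nonzero x values
    t_star = max(ts, key=lambda t: abs(1.0 / x - t))
    expected = alpha if 1.0 / x > (alpha + beta) / 2 else beta
    assert t_star == expected
print("inner maximizer matches the farther endpoint")
```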
I would then get that the minimizer is $x_*=\alpha\beta$, contrary to my initial guess. Can anyone confirm this?
Since $1-xt$ is linear in $t$, the inner maximum is attained at an extreme point (hence at $t\in\{\alpha,\beta\}$), so that \begin{align*} \min_{x\in\mathbb{R}\setminus\{0\}}\max_{t\in[\alpha,\beta]}\left | 1-xt\right |&= \min_{x\in\mathbb{R}\setminus\{0\}}\max\{\left | 1-x\alpha\right |,\left | 1-x\beta\right | \}. \end{align*} Now you can separate the three following cases, according to the signs of $1-\alpha x$ and $1-\beta x$: the interval $\left[\frac{1}{\alpha},\infty\right)$ (both nonpositive), the interval $\left[\frac{1}{\beta},\frac{1}{\alpha}\right]$ (where $1-\alpha x\ge 0\ge 1-\beta x$), and the interval $\left(-\infty,\frac{1}{\beta}\right]\setminus\{0\}$ (both nonnegative).
Now you can take the $\min$ over $x$ on each interval and then the $\min$ of the three results. The case $x\ge\frac{1}{\alpha}$ is trivial: the maximum there is $\beta x-1$, so $x=\frac{1}{\alpha}$ is the best choice and the minimum is $\frac{\beta}{\alpha}-1$. For the middle case $\frac{1}{\beta}\le x\le\frac{1}{\alpha}$, the idea is to take the intersection of the two lines given by $1-\alpha x$ and $\beta x-1$, which yields $x=\frac{2}{\alpha+\beta}$ and the associated value $\frac{\beta-\alpha}{\alpha+\beta}$.
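The middle case can be verified numerically. A sketch with sample values $\alpha=1$, $\beta=3$ (my own choice): a grid search over $\left[\frac{1}{\beta},\frac{1}{\alpha}\right]$ recovers the crossing point and its value.

```python
# On [1/beta, 1/alpha] the objective is max{1 - alpha*x, beta*x - 1}; its
# minimum is at the crossing x = 2/(alpha+beta), value (beta-alpha)/(alpha+beta).
alpha, beta = 1.0, 3.0
lo, hi = 1.0 / beta, 1.0 / alpha
xs = [lo + (hi - lo) * i / 1000 for i in range(1001)]  # grid on [1/beta, 1/alpha]

g = lambda x: max(1 - alpha * x, beta * x - 1)
x_best = min(xs, key=g)
print(x_best, g(x_best))  # approx 0.5 and 0.5 = (3-1)/(3+1)
```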
The remaining interval $x\le\frac{1}{\beta}$ (with $x\ne 0$) needs to be further decomposed into positive versus negative values of $x$: there the maximum is $1-\min(\alpha x,\beta x)$, which equals $1-\beta x$ for negative $x$ and $1-\alpha x$ for positive $x$. Taking the minimum over $x$ of each, the first infimum is approached as $x\to 0^-$ with value $1$ (not attained), and the second is attained at $x=\frac{1}{\beta}$ with value $1-\frac{\alpha}{\beta}$.
So in the end you get \begin{align*} \min_{x\in\mathbb{R}\setminus\{0\}}\max_{t\in[\alpha,\beta]}\left | 1-xt\right | &= \min \left\{ \frac{\beta}{\alpha}-1,\frac{\beta-\alpha}{\beta+\alpha},1, 1-\frac{\alpha}{\beta} \right\}\\ &=\min \left\{ \frac{\beta-\alpha}{\alpha},\frac{\beta-\alpha}{\beta+\alpha},\frac{\beta+\alpha}{\beta+\alpha}, \frac{\beta-\alpha}{\beta} \right\}\\ &=\frac{\beta-\alpha}{\beta+\alpha}, \end{align*} which is attained at $x=\frac{2}{\alpha+\beta}$, where both $t=\alpha$ and $t=\beta$ give the same value.
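The final formula can be grid-checked end to end. A sketch with two sample pairs $(\alpha,\beta)$ of my own choosing:

```python
# Grid-check: min_x max{|1 - alpha*x|, |1 - beta*x|} = (beta-alpha)/(beta+alpha)
# attained near x = 2/(alpha+beta), for a couple of sample pairs.
results = []
for alpha, beta in [(1.0, 3.0), (2.0, 5.0)]:
    f = lambda x: max(abs(1 - alpha * x), abs(1 - beta * x))
    xs = [-1.0 + 3.0 * i / 6000 for i in range(6001) if i != 2000]  # skip x = 0
    x_best = min(xs, key=f)
    results.append((alpha, beta, x_best, f(x_best)))
    print(alpha, beta, x_best, f(x_best))
```

For $(\alpha,\beta)=(1,3)$ this gives $x_\ast\approx\tfrac12$ with value $\tfrac12$, and for $(2,5)$ it gives $x_\ast\approx\tfrac27$ with value $\approx\tfrac37$, matching $\frac{2}{\alpha+\beta}$ and $\frac{\beta-\alpha}{\beta+\alpha}$.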
Second (and better) proof:
It is known that a $\max$ of convex functions is convex. Hence $\max_{t\in[\alpha,\beta]}\left | 1-xt\right |$ is a convex function of $x$ (if we include $0$ in the domain). It suffices to prove that $0$ is a subderivative of $\max_{t\in[\alpha,\beta]}\left | 1-xt\right |$ at $x=\frac{2}{\alpha+\beta}$. The subderivatives at this point form the interval $[-\alpha,\beta]\ni 0$. This can be seen because the subdifferential of a $\max$ of convex functions is contained in the convex hull of the union of the subdifferentials of the functions that attain the $\max$ at that point, and here two functions attain the maximum: $1-\alpha x$ (slope $-\alpha$) and $\beta x-1$ (slope $\beta$).
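The one-sided slopes at the optimum can be seen with finite differences. A sketch with sample values $\alpha=1$, $\beta=3$ (my own choice), checking that the left and right derivatives of the objective at $x_\ast=\frac{2}{\alpha+\beta}$ are $-\alpha$ and $\beta$, so the subdifferential $[-\alpha,\beta]$ indeed contains $0$:

```python
# Finite differences of f(x) = max{|1 - alpha*x|, |1 - beta*x|} at the
# optimum x* = 2/(alpha+beta): left slope -alpha, right slope beta.
alpha, beta = 1.0, 3.0
f = lambda x: max(abs(1 - alpha * x), abs(1 - beta * x))
x_star, h = 2 / (alpha + beta), 1e-6

left_slope = (f(x_star) - f(x_star - h)) / h
right_slope = (f(x_star + h) - f(x_star)) / h
print(left_slope, right_slope)  # approx -1.0 and 3.0
```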