For $x,y\in\mathbb R^n$ define $P(z) = \sum_{i=1}^n z^i x_i y_i$. I want to minimize the function $$ \frac{1-P(z)^2}{P'(z)^2} =\frac{1-(\sum_{i=1}^n z^{i} x_i y_i)^2}{(\sum_{i=1}^n i z^{i-1} x_i y_i)^2} $$ for a fixed $z\in[-1,1]$ over all $x$ and $y$ subject to $\|x\|_2=\|y\|_2=1$.
Conjecture: For any $z\in[-1,1]$ there is some $i\in\{1,\dots,n\}$ such that an optimal solution puts all the weight of $x$ and $y$ on coordinate $i$. That is, $x_i=y_i=\pm 1$ and $0$ on the remaining coordinates. I believe this is known as a "bang-bang" solution.
I can prove the conjecture for $z$ close to $0$: Since $P(0)=0$ and $P'(0)=x_1y_1$, a Taylor expansion gives $$ \frac{1-P(z)^2}{P'(z)^2} =\frac{1-P(0)^2}{P'(0)^2} + O(z) =\frac{1}{(x_1y_1)^2} + O(z), $$ so in the case $z=0$ we should put all the weight on the first coordinate of $x$ and $y$.
Numerically I have verified the conjecture for many other choices of $z$ and $n$, but I barely know where to start for the proof.
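For reproducibility, here is a minimal NumPy sketch of the kind of check I mean (the choices $z=0.7$, $n=4$, the sample count, and the helper names are mine): random unit vectors never beat the best bang-bang point.

```python
import numpy as np

def objective(z, x, y):
    """(1 - P(z)^2) / P'(z)^2 with P(z) = sum_{i=1}^n z^i x_i y_i."""
    i = np.arange(1, len(x) + 1)
    w = x * y
    P = np.sum(z**i * w)
    dP = np.sum(i * z**(i - 1) * w)
    return (1 - P**2) / dP**2

def bang_bang_min(z, n):
    """Best value over the conjectured solutions x = y = e_i
    (signs don't matter: the objective depends only on (x_i y_i)^2 patterns)."""
    return min(objective(z, e, e) for e in np.eye(n))

# random search never beats the best bang-bang point
z, n = 0.7, 4
best = bang_bang_min(z, n)
rng = np.random.default_rng(0)
for _ in range(2000):
    x = rng.standard_normal(n); x /= np.linalg.norm(x)
    y = rng.standard_normal(n); y /= np.linalg.norm(y)
    assert objective(z, x, y) >= best - 1e-9
```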
Update: By the suggestion of Michael Grant we can define $$ f_z(w) = \sum_{i=1}^n z^i w_i \quad\text{and}\quad \bar{f_z}(w) = \sum_{i=1}^n i z^{i-1} w_i, $$ and the problem then reduces to showing that the function $$\frac{\bar{f_z}(w)^2}{1-f_z(w)^2}$$ is convex in $w$ over the $\ell_1$ ball $\{w\in\mathbb R^n : \|w\|_1\le 1\}$. Numerically I have verified this is true for $n$ up to $4$.
Update 2: For any $\alpha\ge 0$, the set of $w$ with $\frac{\bar{f_z}(w)^2}{1-f_z(w)^2}\le \alpha$ is (using $1-f_z(w)^2>0$ on the domain) the same as the set with $\bar{f_z}(w)^2+\alpha f_z(w)^2\le \alpha$, which is clearly convex, as $\bar{f_z}(w)^2+\alpha f_z(w)^2$ is a nonnegative combination of convex functions.
That shows $\frac{\bar{f_z}(w)^2}{1-f_z(w)^2}$ is quasi-convex (all its sublevel sets are convex). Since minimizing the original objective is the same as maximizing this function, and a continuous quasi-convex function on a compact convex set attains its maximum at an extreme point, the maximum over the $\ell_1$ ball is attained at some $\pm e_i$, which is exactly a bang-bang solution. So this should be enough to prove the original conjecture!
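A quick numerical sanity check of both claims (a sketch only; the values $z=0.6$, $n=5$, the sampling, and the function name `g` are my own choices): quasi-convexity along random segments in the $\ell_1$ ball, and that random points never beat the extreme points $\pm e_i$.

```python
import numpy as np

def g(z, w):
    """g(w) = fbar_z(w)^2 / (1 - f_z(w)^2); minimizing the original
    objective over unit x, y corresponds to maximizing g over the l1 ball."""
    i = np.arange(1, len(w) + 1)
    f = np.sum(z**i * w)
    fbar = np.sum(i * z**(i - 1) * w)
    return fbar**2 / (1 - f**2)

z, n = 0.6, 5
rng = np.random.default_rng(0)

# 1) quasi-convexity: g(t*u + (1-t)*v) <= max(g(u), g(v)) on the l1 ball
for _ in range(1000):
    u = rng.standard_normal(n); u /= np.abs(u).sum()
    v = rng.standard_normal(n); v /= np.abs(v).sum()
    t = rng.uniform()
    assert g(z, t * u + (1 - t) * v) <= max(g(z, u), g(z, v)) + 1e-9

# 2) the maximum over the l1 ball is attained at an extreme point ±e_i
ext = max(g(z, s * e) for s in (1.0, -1.0) for e in np.eye(n))
for _ in range(1000):
    w = rng.standard_normal(n); w /= np.abs(w).sum()
    assert g(z, w) <= ext + 1e-9
```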
Proof: We prove the following lemma by induction on $n$: if $\sum_{i=1}^n |v_i|\le 1$, then there exist unit vectors $x,y\in\mathbb R^n$ with $x_i y_i=v_i$ for all $i$. For $n=2$ let $x=(\alpha,\sqrt{1-\alpha^2})$ and $y=(v_1/\alpha, \text{sign}(v_2)\sqrt{1-(v_1/\alpha)^2})$. For $|v_1|\le|\alpha|\le 1$ these are unit vectors with $x_1 y_1=v_1$. We further need $(1-\alpha^2)(1-(v_1/\alpha)^2)=v_2^2$, i.e. $t=\alpha^2$ must solve $t^2-(1+v_1^2-v_2^2)t+v_1^2=0$. This has a real solution exactly when the discriminant $d=(1+v_1^2-v_2^2)^2-4v_1^2$ is nonnegative, which is equivalent to $|v_1|+|v_2|\le 1$. Take $\alpha^2 = ((1+v_1^2-v_2^2)-\sqrt{d})/2$. Since $\sqrt{d}\le 1-v_1^2+v_2^2$ (the squares differ by $4v_2^2$), the larger root is at most $1$; and since the two roots multiply to $v_1^2$, the smaller root is at least $v_1^2$. Hence $v_1^2\le\alpha^2\le 1$, which is what we needed.
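The $n=2$ step can be sketched in code (my own helper name; note either root of the quadratic works since both lie in $[v_1^2,1]$, and taking the larger one avoids $\alpha=0$ when $v_1=0$):

```python
import numpy as np

def base_case(v1, v2):
    """Return unit x, y in R^2 with x1*y1 = v1, x2*y2 = v2 (needs |v1|+|v2| <= 1)."""
    c = 1 + v1**2 - v2**2
    d = c**2 - 4 * v1**2                # discriminant; >= 0 iff |v1| + |v2| <= 1
    a = np.sqrt((c + np.sqrt(d)) / 2)   # larger root; avoids a = 0 when v1 = 0
    sgn = 1.0 if v2 >= 0 else -1.0
    x = np.array([a, np.sqrt(max(1 - a**2, 0.0))])
    y = np.array([v1 / a, sgn * np.sqrt(max(1 - (v1 / a)**2, 0.0))])
    return x, y

x, y = base_case(0.3, -0.6)
assert abs(np.linalg.norm(x) - 1) < 1e-9 and abs(np.linalg.norm(y) - 1) < 1e-9
assert np.allclose(x * y, [0.3, -0.6])
```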
For $n>2$ let $x_n=\sqrt{|v_n|}$ and $y_n=\text{sign}(v_n)\sqrt{|v_n|}$. Let $s=\sqrt{1-|v_n|}$. By induction (and scaling) there exist vectors $\bar{x},\bar{y}\in\mathbb R^{n-1}$ with $\|\bar{x}\|_2=\|\bar{y}\|_2=s$ such that $\bar{x}_i \bar{y}_i=v_i$ for all $i\in [n-1]$, as long as $\sum_{i=1}^{n-1} |v_i|\le \|\bar{x}\|_2\|\bar{y}\|_2=s^2$. In our case $\sum_{i=1}^n |v_i|\le 1$, so $s^2=1-|v_n|\ge \sum_{i=1}^{n-1} |v_i|$. The concatenated vectors $x=(\bar{x},x_n)$ and $y=(\bar{y},y_n)$ are unit vectors, since $\|x\|_2^2=s^2+|v_n|=1$, which completes the proof.
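The whole induction translates into a short recursive construction (a self-contained sketch, so the $n=2$ base case from the proof is repeated; function names are mine, and the snippet assumes $|v_n|<1$ at every level so the rescaling is defined):

```python
import numpy as np

def base2(v1, v2):
    """n = 2 base case: alpha^2 is a root of t^2 - (1+v1^2-v2^2) t + v1^2 = 0."""
    c = 1 + v1**2 - v2**2
    d = c**2 - 4 * v1**2                        # >= 0 exactly when |v1| + |v2| <= 1
    a = np.sqrt((c + np.sqrt(max(d, 0.0))) / 2) # larger root avoids a = 0
    sgn = 1.0 if v2 >= 0 else -1.0
    x = np.array([a, np.sqrt(max(1 - a**2, 0.0))])
    y = np.array([v1 / a, sgn * np.sqrt(max(1 - (v1 / a)**2, 0.0))])
    return x, y

def construct(v):
    """Given sum |v_i| <= 1 (n >= 2), return unit x, y with x_i y_i = v_i."""
    v = np.asarray(v, dtype=float)
    if len(v) == 2:
        return base2(v[0], v[1])
    s2 = 1 - abs(v[-1])                  # squared norm left for the first n-1 coords
    xb, yb = construct(v[:-1] / s2)      # solve the rescaled subproblem...
    xb, yb = np.sqrt(s2) * xb, np.sqrt(s2) * yb   # ...then scale back to norm s
    xn = np.sqrt(abs(v[-1]))
    yn = np.sign(v[-1]) * xn
    return np.append(xb, xn), np.append(yb, yn)
```

For example, `construct([0.3, -0.2, 0.1, 0.25])` returns a pair of unit vectors whose coordinatewise product is exactly that vector.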
Combined with Cauchy-Schwarz, which gives the reverse inclusion $\sum_i |x_i y_i|\le\|x\|_2\|y\|_2=1$, this proves Michael Grant's conjecture that $\{v : v_i=x_i y_i,\ i=1,\dots,n,\ \|x\|_2=\|y\|_2=1\}$ is exactly the unit $\ell_1$ ball.
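The Cauchy-Schwarz direction is also trivial to confirm numerically (sketch; dimensions and sample counts arbitrary):

```python
import numpy as np

# Cauchy-Schwarz: for unit x, y the vector (x_i y_i) has l1 norm at most 1,
# so the attainable set is contained in the l1 ball.
rng = np.random.default_rng(1)
for n in (2, 3, 6):
    for _ in range(500):
        x = rng.standard_normal(n); x /= np.linalg.norm(x)
        y = rng.standard_normal(n); y /= np.linalg.norm(y)
        assert np.abs(x * y).sum() <= 1 + 1e-12
```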
Combined with the proof of quasi-convexity given in the question updates, we have a complete proof.