My question is: given an ellipse with equation $$\frac{x^2}{a^2}+\frac{y^2}{b^2} =1$$ where $a>b$, how can we find the coordinates of the foci? I want to find those coordinates without presuming that the foci exist, because most proofs I found online assume the defining property of the foci to be true and then use some special case to find their coordinates.
So to rephrase my question: Given a closed curve with equation $$\frac{x^2}{a^2}+\frac{y^2}{b^2} =1$$ prove that there exist two points inside the curve such that, for any point on the boundary of the curve, the sum of its distances to those two points is a constant depending only on $a$ and $b$ (assume $a>b$).
Here's my attempt which gave me nothing:
Let's take a point $P$ on the curve as $\left(x,\ b\sqrt{1-\frac{x^2}{a^2}}\right)$ (the upper half; the lower half follows by symmetry).
Let those two points be $(-f,0)$ and $(f,0)$; then the sum of distances from $P$ is
$$S = \sqrt{(f+x)^2+b^2\left(1-\frac{x^2}{a^2}\right)}+\sqrt{(f-x)^2+b^2\left(1-\frac{x^2}{a^2}\right)}$$
Differentiating this with respect to $f$ and equating to $0$ to find the stationary case:
$$\dfrac{f+x}{\sqrt{\left(f+x\right)^2+b^2\left(1-\frac{x^2}{a^2}\right)}}+\dfrac{f-x}{\sqrt{\left(f-x\right)^2+b^2\left(1-\frac{x^2}{a^2}\right)}}=0$$
Squaring and simplifying
$$4b^2fx\left(1-\frac{x^2}{a^2}\right)=0 \implies f =0$$
which is obviously wrong...
Note that in my attempt I still assumed two points, and that the foci are symmetric and lie on the major axis. If we can remove even these assumptions, that would be amazing; it's just that with them I was at least able to start somewhere.
P.S.
$\textbf{Thanks to the comments by @Blue:}$ differentiating $S$ with respect to $x$ and equating to zero,
$$\dfrac{2\left(x+f\right)-\frac{2b^2x}{a^2}}{2\sqrt{\left(x+f\right)^2+b^2\cdot\left(1-\frac{x^2}{a^2}\right)}}+\dfrac{-\frac{2b^2x}{a^2}-2\left(f-x\right)}{2\sqrt{b^2\cdot\left(1-\frac{x^2}{a^2}\right)+\left(f-x\right)^2}}=0$$ Squaring and simplifying
$$ a^4b^2fx(f^2-(a^2-b^2))=0$$
Since we want $S$ to be independent of $x$, this must hold for every $x$ on the curve, which forces $f^2 = a^2-b^2$; taking $f>0$, $$f= \sqrt{a^2-b^2}$$
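Substituting this value back confirms that $S$ is genuinely constant (stationarity alone does not guarantee independence of $x$): with $f^2=a^2-b^2$, each radicand becomes a perfect square,
$$(x\pm f)^2+b^2\left(1-\frac{x^2}{a^2}\right)=\frac{f^2}{a^2}x^2\pm 2fx+a^2=\left(a\pm\frac{f}{a}x\right)^2,$$
and since $|x|\le a$ and $0<f<a$, both quantities $a\pm\frac{f}{a}x$ are positive, giving
$$S=\left(a+\frac{f}{a}x\right)+\left(a-\frac{f}{a}x\right)=2a.$$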
The sum of distances from point $P$ to $(-f, 0)$ and $(f, 0)$ is $$S = \sqrt{(f+x)^2+b^2\left(1-\frac{x^2}{a^2}\right)}+\sqrt{(f-x)^2+b^2\left(1-\frac{x^2}{a^2}\right)}$$ Your approach was correct up to this step. However, you need to realize what happens when you differentiate $S$ with respect to $f$.
The condition $\frac{{\partial}S}{{\partial}f} = 0$ considers the points $(-f, 0)$ and $(f, 0)$ to be variable, and finds the minimum value of the sum of distances $S$. Intuitively, the sum of distances is minimum when both the points coincide with the center of the ellipse. Hence, you get the value $f = 0$.
What you really need to do to find the focal points is to find a value of $f$ such that the expression for $S$ is independent of the parameter $x$.
With a little bit of algebraic manipulation you get $$f = a\sqrt{1-\dfrac{b^2}{a^2}} = \sqrt{a^2-b^2}$$
And this is how you get the coordinates of the focal points.
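As a quick numerical sanity check (a sketch, not part of the proof; the function name is my own choosing), one can confirm that with $f=\sqrt{a^2-b^2}$ the sum $S$ equals $2a$ at every sampled point of the ellipse:

```python
import math

def focal_distance_sum(a, b, x):
    """Sum of distances from the point (x, y) on the upper half of the
    ellipse x^2/a^2 + y^2/b^2 = 1 to the candidate foci (-f, 0) and (f, 0)."""
    f = math.sqrt(a * a - b * b)          # candidate focal distance
    y2 = b * b * (1 - x * x / (a * a))    # y^2 for a point on the ellipse
    return math.sqrt((x + f) ** 2 + y2) + math.sqrt((x - f) ** 2 + y2)

a, b = 5.0, 3.0
# S should equal 2a = 10 for every x in [-a, a]
for x in [-5.0, -2.5, 0.0, 1.7, 5.0]:
    assert abs(focal_distance_sum(a, b, x) - 2 * a) < 1e-9
```

Any other choice of $f$ makes the loop's assertion fail at some $x$, which is exactly the statement that the foci are the unique symmetric pair on the major axis with this property.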
If you want to solve the problem without assuming that the foci are symmetrically placed on the major axis, you can give them arbitrary coordinates $(x_1, y_1)$ and $(x_2, y_2)$, derive a general expression for $S$ at a point $P = (h, k)$ on the ellipse, and then require that $S$ not change as $P$ moves along the curve, i.e. impose $\frac{{\partial}S}{{\partial}h} = 0$ and $\frac{{\partial}S}{{\partial}k} = 0$ subject to the ellipse constraint.