I'm having trouble understanding this problem because I'm stuck on some purely mathematical definitions and don't know how to proceed. I would appreciate any advice on the best way to approach it. The problem is as follows:

Given two conditions that define an ellipse:
A. By the condition that the distances $a'$, $a''$ from any point on the ellipse to the two focal points add up to a constant $$ a' + a'' = 2a, $$ equal to the length $2a$ of the long axis, and
B. by the equation
$$ 1 = \frac{x^{2}}{a^{2}} +\frac{y^{2}}{b^{2}}$$
in Cartesian coordinates, with the $x$- and $y$-axes coinciding with the long and the short axis of the ellipse, respectively.
- Show $A$ is a sufficient condition for $B$.
- Show that $A$ is also a necessary condition for $B$.
I appreciate your comments and insights.
This is not a complete solution but a guide to solving your problem.
With the foci placed at $(\pm c, 0)$, condition A reads
$$\sqrt{(x+c)^2 + y^2} + \sqrt{(x-c)^2 + y^2} = 2a,$$
where $c$ is the distance of each focus from the origin.
Simplify this equation, using $c^2 = a^2 - b^2$, to obtain equation $B$. For the necessity direction, note that each squaring step can be reversed since both sides stay nonnegative.
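In case the algebra is the sticking point, here is one way the simplification can go (a sketch; the sign choices use $0 < c < a$ and $|x| \le a$, so every quantity being squared is nonnegative):

$$
\begin{align}
\sqrt{(x+c)^2+y^2} &= 2a-\sqrt{(x-c)^2+y^2} &&\text{(isolate one root)}\\
(x+c)^2+y^2 &= 4a^2-4a\sqrt{(x-c)^2+y^2}+(x-c)^2+y^2 &&\text{(square)}\\
a\sqrt{(x-c)^2+y^2} &= a^2-cx &&\text{(cancel, divide by }4\text{)}\\
a^2\bigl((x-c)^2+y^2\bigr) &= a^4-2a^2cx+c^2x^2 &&\text{(square again)}\\
(a^2-c^2)x^2+a^2y^2 &= a^2(a^2-c^2) &&\text{(collect terms)}\\
\frac{x^2}{a^2}+\frac{y^2}{b^2} &= 1 &&\text{(set }b^2=a^2-c^2\text{)}
\end{align}
$$

Read top to bottom this shows A $\Rightarrow$ B; since $2a-\sqrt{(x-c)^2+y^2} \ge 0$ and $a^2-cx \ge 0$ on the ellipse, the squarings are reversible and the same chain read bottom to top gives B $\Rightarrow$ A.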
Hope this helps.
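Not a proof, but as a quick numerical sanity check of the equivalence you can sample points from equation $B$ and confirm that condition A holds for them (the values $a=5$, $b=3$ below are my own illustrative choices, not from the problem):

```python
import math

def distance_sum(x, y, c):
    """Sum of distances from the point (x, y) to the two foci (+c, 0) and (-c, 0)."""
    return math.hypot(x + c, y) + math.hypot(x - c, y)

# Illustrative ellipse: a = 5, b = 3, hence c = sqrt(a^2 - b^2) = 4
a, b = 5.0, 3.0
c = math.sqrt(a * a - b * b)

# Points satisfying equation B, generated via x = a*cos(t), y = b*sin(t)
for t in [0.0, 0.7, 1.5, 3.0]:
    x, y = a * math.cos(t), b * math.sin(t)
    # Condition A should hold: the distance sum equals 2a
    assert abs(distance_sum(x, y, c) - 2 * a) < 1e-9
```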