Develop the Cartesian equation of a plane with $x$-intercept $a$, $y$-intercept $b$ and $z$-intercept $c$. Show that the distance $d$ from the origin to this plane is given by $$\frac{1}{d^2}=\frac{1}{a^2}+\frac{1}{b^2}+\frac{1}{c^2}$$
The picture below shows what I have done so far: I was able to set up my variables and begin defining them.

Another attempt, since the OP finds my first answer confusing ...
Let $A = (a,0,0)$, $B = (0,b,0)$, $C = (0,0,c)$. The normal to the plane is in the direction $$N = (B-A)\times(C-A) = (bc, ca, ab),$$ so the Cartesian equation of the plane, $N \cdot (X - A) = 0$, is $$bcx + cay + abz = abc, \quad\text{or equivalently}\quad \frac{x}{a} + \frac{y}{b} + \frac{z}{c} = 1.$$ A unit vector in the direction of $N$ is $$ U = \frac{N}{\|N\|} = \frac{(bc,ca,ab)}{\sqrt{b^2c^2 + c^2a^2 + a^2b^2}}. $$

The distance $d$ from the origin to the plane is the length of the projection of $\vec{OA}$ onto $U$. Since $U$ is a unit vector, this projected length is just $A \cdot U$. So, we have $$ d = A \cdot U = \frac{abc}{\sqrt{b^2c^2 + c^2a^2 + a^2b^2}}. $$

This formula is correct even if one of $a$, $b$, $c$ is zero and the other two are non-zero: in that case the plane passes through the origin, and the formula gives $d = 0$, as you would expect.
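As a quick numerical sanity check (just a sketch; the function names here are my own), you can compare the closed-form distance against the projection computed directly from the cross product:

```python
import math

def distance_formula(a, b, c):
    """d = |abc| / sqrt(b^2 c^2 + c^2 a^2 + a^2 b^2)."""
    return abs(a * b * c) / math.sqrt(b*b*c*c + c*c*a*a + a*a*b*b)

def distance_projection(a, b, c):
    """Distance from the origin as |A . N| / ||N||, N = (B-A) x (C-A)."""
    A = (a, 0.0, 0.0)
    B = (0.0, b, 0.0)
    C = (0.0, 0.0, c)
    AB = tuple(B[i] - A[i] for i in range(3))
    AC = tuple(C[i] - A[i] for i in range(3))
    # Cross product N = AB x AC; works out to (bc, ca, ab).
    N = (AB[1]*AC[2] - AB[2]*AC[1],
         AB[2]*AC[0] - AB[0]*AC[2],
         AB[0]*AC[1] - AB[1]*AC[0])
    norm = math.sqrt(sum(n * n for n in N))
    return abs(sum(A[i] * N[i] for i in range(3))) / norm

a, b, c = 2.0, 3.0, 6.0
d = distance_formula(a, b, c)
assert abs(d - distance_projection(a, b, c)) < 1e-12
# 1/d^2 = 1/a^2 + 1/b^2 + 1/c^2 holds for non-zero intercepts.
assert abs(1/d**2 - (1/a**2 + 1/b**2 + 1/c**2)) < 1e-12
```

With the intercepts $2, 3, 6$, the check recovers $1/d^2 = 1/4 + 1/9 + 1/36 = 7/18$.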
If $a$, $b$, $c$ are all non-zero, this formula can be rearranged to give $$ \frac{1}{d^2} = \frac{1}{a^2} + \frac{1}{b^2} + \frac{1}{c^2} $$
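Spelling out the algebra: squaring the distance formula and taking reciprocals gives
$$
d^2 = \frac{a^2 b^2 c^2}{b^2 c^2 + c^2 a^2 + a^2 b^2}
\quad\Longrightarrow\quad
\frac{1}{d^2} = \frac{b^2 c^2 + c^2 a^2 + a^2 b^2}{a^2 b^2 c^2}
= \frac{1}{a^2} + \frac{1}{b^2} + \frac{1}{c^2}.
$$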