Let's say we have an object that in reality has a size $A$ (in metres), and it appears on the image plane with a size $A'$. We want to know the distance between the optical center (the lens) and the object. What we have is the thin lens formula
$$\frac{1}{f} = \frac{1}{|OA|} + \frac{1}{|OA'|}$$
with $|OA|$ the distance between the lens and the object and $|OA'|$ the distance between the lens and the image plane. Furthermore, we know the magnification, the ratio of image size to object size, which by similar triangles equals the ratio of the two distances:
$$\lambda = \frac{|OA'|}{|OA|}$$
Now we want to know $|OA|$. How to do this?
What I tried is $|OA'| = \lambda |OA|$, substituting this into the thin lens formula. This gives $$\frac{1}{f} = \frac{1}{|OA|} + \frac{1}{\lambda|OA|}$$ and hence $$|OA| = \frac{f(\lambda+1)}{\lambda}$$
However, the resulting distances are way too large. What is the right way to do this, and what am I doing wrong?
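For concreteness, here is how I evaluate my formula numerically (the focal length and magnification values below are made up for illustration):

```python
def object_distance_unsigned(f, lam):
    """My attempt: |OA| = f * (lam + 1) / lam, all quantities positive."""
    if lam <= 0:
        raise ValueError("magnification ratio must be positive")
    return f * (lam + 1) / lam

# Made-up example: f = 50 mm, image 100x smaller than the object.
print(object_distance_unsigned(0.050, 0.01))  # ≈ 5.05 m for these values
```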
Lens formula (take the origin at the optical center and use sign conventions: leftward distances negative and the other side positive, or the reverse, as per your taste): $$\frac 1v-\frac 1u=\frac 1f\Leftrightarrow v=\frac{uf}{u+f}$$ Also (using the same sign conventions): $$m=\frac vu\implies v=mu$$ So: $$\frac{uf}{u+f}=mu\implies f=mu+mf\quad(u\neq0)\implies u=\frac{(1-m)}{m}f$$
Note: $\quad\lambda\mapsto m,\quad OA\mapsto u, \quad OA'\mapsto v$
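A minimal numeric sketch of the final formula in Python, assuming SI units and a made-up focal length; note that under this sign convention a real, inverted image has negative $m$, so the magnitude agrees with your expression $f(\lambda+1)/\lambda$:

```python
def object_distance(f, m):
    """Signed object distance u from u = (1 - m) * f / m."""
    if m == 0:
        raise ValueError("magnification must be non-zero")
    return (1 - m) * f / m

# Made-up example: f = 50 mm, real inverted image 100x smaller
# than the object, i.e. m = -0.01 under this sign convention.
u = object_distance(f=0.050, m=-0.01)
print(u)  # ≈ -5.05, i.e. about 5.05 m on the object side of the lens
```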