So imagine that I have a ray parameterized as $\vec{R} = \vec{O} + t\vec{D}$, where $\vec{O}$ = origin, $t$ = parameter and $\vec{D}$ = direction vector.
I also have a spherical lens with aperture $A$, radius of curvature $r$, and center of curvature $C$.
Assume that the lens is centered on the $z$-axis and that the ray always comes in from "one side" of the lens. So if it's a convex lens, the rays will always come from its convex direction and not from its backside.
So this is what I'm doing to solve this:
1) Treat the lens as a full sphere with center $C$ and radius $r$.
2) Do a normal ray-sphere intersection algorithm and return the closer of the two intersection points (the smaller positive $t$).
3) Once I have an intersection point $(x,y,z)$, I check to see if $ |x| \le A/2$ and $|y| \le A/2$.
The reason for step 3 is that the aperture $A$ refers to the width of the lens, so if the lens is centered on the $z$-axis, the $x$ values on its surface satisfy $-A/2 \le x \le A/2$ (and the same for $y$), even though the $x,y$ values on the full sphere range over $-r \le x,y \le r$.
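The three steps above can be sketched roughly like this (a minimal Python sketch, assuming vectors are plain 3-tuples and using the square aperture test exactly as described; the function name and signature are my own invention):

```python
import math

def ray_lens_hit(O, D, C, r, A):
    """Intersect ray O + t*D with a spherical lens surface of
    radius r, center of curvature C, and aperture width A."""
    # Step 1-2: standard ray-sphere intersection.
    # Solve |O + t*D - C|^2 = r^2, a quadratic in t.
    L = [O[i] - C[i] for i in range(3)]
    a = sum(d * d for d in D)
    b = 2.0 * sum(D[i] * L[i] for i in range(3))
    c = sum(l * l for l in L) - r * r
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # ray misses the full sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # closer of the two roots
    if t < 0.0:
        return None  # intersection lies behind the ray origin
    P = [O[i] + t * D[i] for i in range(3)]
    # Step 3 as described in the question: |x| <= A/2 and |y| <= A/2.
    if abs(P[0]) <= A / 2 and abs(P[1]) <= A / 2:
        return P
    return None
```

For example, a ray from the origin along $+z$ toward a sphere centered at $(0,0,5)$ with $r=1$ hits the front surface at $(0,0,4)$.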
So my question is: Is this a reasonable way to solve this problem? I've more or less convinced myself that it's correct, but if there's a flaw in my reasoning I'd really appreciate the feedback!
It looks OK to me, almost.
One remark about step #3. I don't really know what "aperture" means, but it sounds like the diameter of a circular "hole" in front of the lens. If so, I think the check in step #3 should be $\sqrt{x^2 + y^2} < A/2$. The test you're doing would correspond to a square aperture, rather than a circular one.
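Concretely, the corrected test would look something like this (a small sketch; the helper name is mine):

```python
import math

def inside_circular_aperture(x, y, A):
    # A circular aperture of diameter A centered on the z-axis:
    # the hit point must lie within distance A/2 of the axis.
    return math.hypot(x, y) < A / 2
```

A corner point such as $(A/2, A/2)$ passes the square test from step #3 but lies a distance $A/\sqrt{2} > A/2$ from the axis, so it correctly fails the circular test.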