I am interested in an algorithm to compute the area of a compact, convex body $K \subset \mathbb{R}^2$, up to an error $\epsilon > 0$. I am curious about the optimal running time of such an algorithm, but I would be happy with a simple algorithm that is provably close to the optimum.
Specifically, you are given the set $K$ in the form of a 'membership oracle,' which, for any $x \in \mathbb{R}^2$, tells you whether or not $x \in K$ at a cost of $c$ per query. Also, assume for simplicity that $K$ is contained in a fixed compact set, say the unit disk. Now fix an error $\epsilon > 0$: what is the running time (in terms of $\epsilon$ and $c$) needed to find an approximate answer $A$, i.e. one with $|A - \operatorname{area}(K)| < \epsilon$?
Here are some simple ideas, ranked according to how fast I believe them to be:
1) Generate a lattice with sufficiently small mesh size, and count how many of its points belong to $K$.
2) Approximate the set by a convex polygon, and compute the polygon's area. I think the last step is (in theory) reasonably quick: triangulate the polygon and sum the areas of the triangles.
3) Determine an approximate boundary of $K$, then numerically integrate over the boundary and use Green's theorem to get the area.
Also, I am OK with probabilistic algorithms: for example, one can run a Monte Carlo simulation, i.e. sample points randomly according to the area measure on the unit disk, and record how many land inside $K$. This should (I think) take time $O(c/\epsilon)$ with probability at least $1-\epsilon$. (This is essentially parameter estimation for a Bernoulli r.v.) I suspect this is actually optimal.
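For concreteness, here is a minimal sketch of the Monte Carlo idea; `oracle` stands for the membership oracle (a hypothetical callable, not part of the problem statement), and the number of samples is left as a free parameter rather than tied to any particular claimed bound:

```python
import math
import random

def mc_area(oracle, n_samples, seed=0):
    """Estimate area(K) for K contained in the unit disk.

    oracle(x, y) -> bool is the membership oracle (each call costs c).
    """
    rng = random.Random(seed)
    hits = 0
    drawn = 0
    while drawn < n_samples:
        # Draw uniformly from the unit disk by rejection from [-1, 1]^2.
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y > 1.0:
            continue
        drawn += 1
        if oracle(x, y):
            hits += 1
    # hits / n_samples estimates area(K) / area(unit disk).
    return math.pi * hits / n_samples
```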
I have found papers on this topic in arbitrary dimension, and some ideas/results for 2-D, but I have been unable to find the state of the art for 2-D. I would also be interested in a proof of any lower bound on the running time.
Edit 1: I have done the analysis for method 2 above, namely approximating with a convex polygon and finding the area. The running time comes out to $O(\frac{c}{\epsilon} \log \frac{1}{\epsilon})$, which is pretty close to (what I expect to be) the optimum of $O(c/\epsilon)$.
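One plausible way to implement the convex-polygon method (a sketch only, under the extra assumption that the origin is known to lie in the interior of $K$): binary-search along $n$ equally spaced rays for the boundary, at $O(\log \frac{1}{\epsilon})$ oracle calls per ray, then apply the shoelace formula to the inscribed polygon:

```python
import math

def polygon_area_estimate(oracle, n_rays, tol):
    """Inscribed-polygon estimate of area(K), assuming 0 is interior to K.

    For each of n_rays directions, binary-search along the ray for the
    boundary of K (each search costs O(log(1/tol)) oracle calls), then
    return the shoelace area of the resulting inscribed polygon.
    """
    pts = []
    for k in range(n_rays):
        theta = 2.0 * math.pi * k / n_rays
        lo, hi = 0.0, 1.0  # K lies inside the unit disk
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if oracle(mid * math.cos(theta), mid * math.sin(theta)):
                lo = mid
            else:
                hi = mid
        pts.append((lo * math.cos(theta), lo * math.sin(theta)))
    # Shoelace formula on the inscribed polygon.
    area = 0.0
    for i in range(n_rays):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n_rays]
        area += x1 * y2 - x2 * y1
    return 0.5 * area
```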
Edit 2: I have also done the analysis for method 1, i.e. just using a square mesh. I believe the running time comes out to $O(c \epsilon^{-2})$, which is worse than the convex polygon method. Basically this is because one has to check every square in the mesh (of which there are $\Theta(\epsilon^{-2})$), while the other method 'finds' the boundary of $K$ in $O(\log \frac{1}{\epsilon})$ time per boundary point and then does the area calculation only on that boundary set, which has $\Theta(\epsilon^{-1})$ points.
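For comparison, the square-mesh method from 1) can be sketched as follows; with mesh size $h = \Theta(\epsilon)$ it makes $\Theta(\epsilon^{-2})$ oracle calls, matching the count above (a naive sketch, counting only cell centers):

```python
import math

def mesh_area_estimate(oracle, h):
    """Count mesh cells whose center lies in K; makes ~(2/h)^2 oracle calls."""
    n = int(math.ceil(2.0 / h))  # cells covering [-1, 1]^2
    count = 0
    for i in range(n):
        for j in range(n):
            x = -1.0 + (i + 0.5) * h
            y = -1.0 + (j + 0.5) * h
            if oracle(x, y):
                count += 1
    # Each counted cell contributes h^2 of area; the error is on the order
    # of h times the perimeter of K.
    return count * h * h
```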
One can answer the original question, and possibly better understand the costs involved, by splitting it into two parts.
The first is: how do we find any one point of a convex set $K$ in the unit disk, given a membership oracle and knowing that the convex set has area at least $\epsilon$ (otherwise, $\epsilon$ itself is an $\epsilon$-additive approximation of the area)? It's easy to see that taking $O\left(\frac{1}{\epsilon}\log\frac{1}{\epsilon}\right)$ samples uniformly and independently at random and querying the oracle for each yields a point of $K$ with probability at least $1-\epsilon$. It's harder to prove that any algorithm requires $\Omega\left(\frac{1}{\epsilon}\log\frac{1}{\epsilon}\right)$ queries to guarantee the same probability, but I believe it's true ($\Omega(\frac{1}{\epsilon})$ is, of course, trivial).
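This first part can be sketched as follows (the constant $4$ below comes from sampling from the bounding square $[-1,1]^2$ rather than the disk; the helper name is mine, not from the original):

```python
import math
import random

def find_point(oracle, eps, seed=0):
    """Look for a point of K inside the unit disk, assuming area(K) >= eps.

    Takes O((1/eps) log(1/eps)) uniform samples; if area(K) >= eps, this
    succeeds with probability at least 1 - eps.
    """
    rng = random.Random(seed)
    # Each sample from the square [-1, 1]^2 lands in K with prob >= eps / 4,
    # so n >= (4 / eps) * log(1 / eps) samples fail with prob <= eps.
    n = int(math.ceil((4.0 / eps) * math.log(1.0 / eps))) + 1
    for _ in range(n):
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if oracle(x, y):
            return (x, y)
    return None  # with high probability, K has area < eps
```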
The second part is: how can we find an (additive) $\epsilon$-approximation of the area of a convex set $K$ in the unit disk, given a membership oracle and a point belonging to $K$? This is slightly more complicated, but essentially it can be carried out by finding two sequences of convex polygons $P_1,\dots,P_n$ and $P'_1,\dots,P'_n$, the former containing $K$ and the latter contained in it, such that the distance between $P_i$ and $P'_i$ is sufficiently small. I believe the cost can be pushed down to $O(\epsilon^{-2/5})$ for a Monte Carlo algorithm with, say, $2/3$ probability of success. It's easy to see that one always needs at least $\Omega(\epsilon^{-1/3})$ queries: take a regular $2n$-gon inscribed in the unit circle, and consider the $n$ triangles each formed by two consecutive even-indexed vertices and the odd-indexed vertex between them. Each such triangle has base $\Theta(n^{-1})$ and height $\Theta(n^{-2})$, hence area $\Theta(n^{-3})$, and each can be independently "snipped" without compromising the convexity of the polygon. So for a suitable $n=\Theta(\epsilon^{-1/3})$ the area of each triangle exceeds $\epsilon$, and one must check with a distinct query the presence or absence of each triangle (save possibly one) to determine the area within an additive error $\epsilon$.
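A quick numerical check of the triangle asymptotics in this lower-bound construction (vertices of a regular $2n$-gon on the unit circle; the $\Theta(n^{-1})$ base and $\Theta(n^{-2})$ height give area $\sim \pi^3/(2n^3)$):

```python
import math

def snip_triangle_dims(n):
    """Base, height, and area of one 'snippable' triangle of a regular
    2n-gon inscribed in the unit circle: two consecutive even-indexed
    vertices plus the odd-indexed vertex between them."""
    base = 2.0 * math.sin(math.pi / n)        # chord between even vertices, Theta(1/n)
    height = 1.0 - math.cos(math.pi / n)      # bulge of the odd vertex, Theta(1/n^2)
    return base, height, 0.5 * base * height  # area, Theta(1/n^3)
```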
For both parts, it's not difficult to prove that the upper bound on the number of queries is also asymptotically an upper bound on the number of all other operations involved.