I am a beginner in sheaf theory, so please bear with me.
Consider the sheaf $\mathcal A$ of differentiable functions on $\mathbb R^2$, and the subsheaf $\mathcal F$ consisting of functions vanishing at $0$.
Question 1:
How would you show that $\mathcal F$ is not a locally free sheaf over $\mathcal A$?
Question 2:
If we consider $\mathbb R$ instead of $\mathbb R^2$, is the ideal sheaf at $0$ locally free?
Let $\mathcal{F}$ denote the subsheaf in question and $\mathcal A$ the sheaf of differentiable functions. If $\mathcal F$ were locally free, there would be an open set $U$ containing $0$ and an isomorphism $\mathcal{F}\vert_U \cong \mathcal{A}^{J}\vert_U$ for some index set $J$. Isomorphisms of sheaves induce isomorphisms of stalks, so we can compare stalks. At any point $p \neq 0$ of $U$, a bump-function argument shows $\mathcal F_p = \mathcal A_p$ (every germ at $p$ is represented by a function vanishing near $0$), which is free of rank $1$; this forces $|J| = 1$. But at $0$ the stalk $\mathcal F_0$ is the maximal ideal $\mathfrak m_0 = (x, y)$ of the local ring $\mathcal A_0$ (by Hadamard's lemma, taking "differentiable" to mean $C^\infty$), and $\mathfrak m_0/\mathfrak m_0^2 \cong \mathbb R^2$, so $\mathcal F_0$ cannot be generated by a single element; in particular it is not free of rank $1$. This contradiction shows $\mathcal F$ is not locally free. On $\mathbb R$, by contrast, Hadamard's lemma factors every smooth function vanishing at $0$ as $f(x) = x\,g(x)$ with $g$ smooth, so multiplication by $x$ gives an isomorphism $\mathcal A_{\mathbb R} \cong \mathcal F$: there the ideal sheaf at $0$ is free of rank $1$.
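In case it helps, the key stalk computation at $0$ can be made explicit via Hadamard's lemma (a sketch, assuming the functions are $C^\infty$):

```latex
% Hadamard's lemma on R^2: for smooth f with f(0,0) = 0, the functions
%   g_i(x, y) = \int_0^1 (\partial_i f)(tx, ty)\, dt   (i = 1, 2)
% are smooth, and by the fundamental theorem of calculus
\begin{align*}
  f(x, y) &= \int_0^1 \frac{d}{dt}\, f(tx, ty)\, dt \\
          &= x \int_0^1 (\partial_1 f)(tx, ty)\, dt
           + y \int_0^1 (\partial_2 f)(tx, ty)\, dt \\
          &= x\, g_1(x, y) + y\, g_2(x, y).
\end{align*}
% Hence the stalk of the ideal sheaf at 0 is
%   \mathcal F_0 = (x, y) = \mathfrak m_0 \subset \mathcal A_0,
% and f \mapsto (\partial_1 f(0), \partial_2 f(0)) induces
%   \mathfrak m_0 / \mathfrak m_0^2 \cong \mathbb R^2,
% so \mathfrak m_0 needs two generators and is not free of rank 1.
```

Applying the same identity twice shows that any $f \in \mathfrak m_0$ with vanishing first derivatives at $0$ lies in $\mathfrak m_0^2$, which is what makes the quotient exactly $\mathbb R^2$.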