Suppose I have $A\subseteq\mathbb{R}^n$ with $|A|<+\infty$, where $|\cdot|$ denotes the Lebesgue measure. I define $$|\partial A|:=\liminf_{\epsilon\to0}\frac{|(A+\epsilon K)\setminus A|}{\epsilon},$$ where $K$ is a compact, convex, symmetric set with non-empty interior (that is, $K$ is the unit ball of a norm on $\mathbb{R}^n$). Now, by the Brunn–Minkowski inequality, if $|A|=|rK|$ then $$|A+\epsilon K|^{1/n}\ge|A|^{1/n}+|\epsilon K|^{1/n}=|rK+\epsilon K|^{1/n},$$ where the last equality holds because $|A|=|rK|$ and $K$ is convex, so that $rK+\epsilon K=(r+\epsilon)K$. By my definition of $|\partial A|$, this yields $$|\partial A|\ge|\partial(rK)|.$$ But this should only be valid when $K=\bar{\mathbb{B}^n_2}$, the closed Euclidean ball. What is wrong here?
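To make the inequality concrete, here is a quick numerical sanity check in $\mathbb{R}^2$ with $K=[-1,1]^2$ (the unit ball of the sup norm) and $A$ a Euclidean disk; the Minkowski-sum area is computed via the standard Steiner formula for a convex set plus a disk. The specific choices of $K$, $A$, and the variable names are mine, for illustration only.

```python
import math

# Sanity check in R^2 with K = [-1, 1]^2 (unit ball of the sup norm).
# A is a Euclidean disk of radius rho; pick r so that |A| = |rK| = 4 r^2.
rho = 1.0
area_A = math.pi * rho**2          # |A| = pi rho^2
r = math.sqrt(area_A) / 2.0        # r = sqrt(pi)/2 * rho

def excess(eps):
    # Steiner formula for a convex set S plus a disk of radius rho:
    # |S + D_rho| = |S| + Per(S) * rho + pi * rho^2.
    # Applied with S = eps K (a square of side 2 eps, perimeter 8 eps):
    vol_sum = 4 * eps**2 + 8 * eps * rho + math.pi * rho**2
    # Difference quotient defining |∂A| (note A ⊆ A + eps K since 0 ∈ K,
    # so |(A + eps K) \ A| = |A + eps K| - |A|):
    return (vol_sum - area_A) / eps

# As eps -> 0 the quotient tends to 8*rho, the K-anisotropic perimeter of A;
# for rK itself, (|(r+eps)K| - |rK|)/eps tends to 8*r.
print(excess(1e-6))   # ≈ 8*rho = 8.0
print(8 * r)          # ≈ 7.09
```

Here $8\rho\ge 8r$ (since $r=\tfrac{\sqrt\pi}{2}\rho<\rho$), consistent with the conclusion $|\partial A|\ge|\partial(rK)|$ drawn from Brunn–Minkowski above.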
Remark. The definition of $|\partial A|$ should not depend on the choice of $K$, since all norms on $\mathbb{R}^n$ are equivalent (is that enough?).