I have a question about sufficient conditions for a submodular function to be (globally) minimized. Let $f: 2^{\{1,...,n\}} \to \mathbb{R}$ be a submodular function, and assume that $f$ has a unique minimizer. My question is:
Is it sufficient for a set $S^* \subseteq \{1,...,n\}$ to be the minimizer of $f$ that all the discrete derivatives of $f$ at $S^*$ satisfy
$$\partial_i^+(S^*) := f(S^* \cup \{i\}) - f(S^*) > 0 \text{ for all } i \not \in S^*$$ $$\partial_i^-(S^*) := f(S^* ) - f(S^* \setminus \{i\}) < 0 \text{ for all } i \in S^*$$
I believe that this should be true, but I have not seen it anywhere (I am reading Fujishige's book on submodular optimization and Francis Bach's monograph http://arxiv.org/pdf/1111.6453v2.pdf), and I am afraid of using something that is false. The reason I suspect it is true is that we can extend $f$ to a convex function $f^L:[0,1]^n \to \mathbb{R}$ (the Lovasz extension) as follows. For any $x \in [0,1]^n$, let $\sigma$ be a permutation of $\{1,...,n\}$ such that $x_{\sigma(1)} \geq x_{\sigma(2)} \geq ... \geq x_{\sigma(n)}$, and let $S_i = \{\sigma(1),...,\sigma(i)\}$. Then (assuming $f(\emptyset) = 0$, so that $f^L$ agrees with $f$ at the vertices of the cube) we can define the Lovasz extension at $x$ as $$f^L(x) = \sum_{i=1}^{n-1} (x_{\sigma(i)} - x_{\sigma(i+1)}) f(S_i) + x_{\sigma(n)} f(S_n)$$
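To make the definition concrete, here is a small Python sketch of the Lovasz extension as written above (the name `lovasz_extension` and the 0-based indexing are my own choices for illustration):

```python
def lovasz_extension(f, x):
    """Evaluate the Lovasz extension f^L at a point x in [0,1]^n.

    f takes a frozenset of 0-based indices; we assume f(emptyset) = 0,
    so that f^L agrees with f at every vertex of the cube.
    """
    n = len(x)
    # sigma orders the coordinates decreasingly: x[sigma[0]] >= ... >= x[sigma[n-1]]
    sigma = sorted(range(n), key=lambda i: -x[i])
    xs = [x[i] for i in sigma]
    total, prefix = 0.0, set()
    for i in range(n):
        prefix.add(sigma[i])                    # S_i = {sigma(1), ..., sigma(i)}
        nxt = xs[i + 1] if i + 1 < n else 0.0   # treat x_{sigma(n+1)} as 0
        total += (xs[i] - nxt) * f(frozenset(prefix))
    return total
```

At a vertex $x = 1_S$ this returns $f(S)$, which is a quick sanity check on the formula.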
The Lovasz extension is a convex function, and the indicator vector $x^*$ of $S^*$, defined by $x^*_i = \begin{cases}1 \text{ if } i \in S^* \\ 0 \text{ otherwise} \end{cases},$ is a minimizer of $f^L$ over $[0,1]^n$. Furthermore, since $x^*$ lies on the boundary of $[0,1]^n$, the first-order conditions for $x^*$ to be a minimizer of $f^L$ involve one-sided directional derivatives: $$\lim_{h_i \to 0^+} \frac{f^L(x^*+ (0,...,0,h_i,0,...,0)) - f^L(x^*)}{h_i} \geq 0 \text{ for all } i \not \in S^*$$ $$\lim_{h_i \to 0^-} \frac{f^L(x^*+ (0,...,0,h_i,0,...,0)) - f^L(x^*)}{h_i} \leq 0 \text{ for all } i \in S^*$$
These conditions work out to be the same as the discrete derivative conditions I stated at the beginning of the question.
No, unfortunately it's not sufficient. Consider the function when $n=2$:$$ f(S) = \begin{cases} 0 &\text{ if } |S| = 0 \\ 1 &\text{ if } |S| = 1 \\ -2 &\text{ if } |S| = 2 \end{cases} $$ The empty set satisfies your condition, but obviously it is not a minimum of $f$.
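This counterexample is easy to verify by brute force; a short sketch (0-based ground set $\{0,1\}$, with $f$ depending only on $|S|$ as above):

```python
from itertools import combinations

def f(S):
    # the counterexample: f(S) depends only on |S|
    return {0: 0, 1: 1, 2: -2}[len(S)]

ground = [0, 1]
subsets = [frozenset(c) for r in range(3) for c in combinations(ground, r)]

# submodularity: f(A) + f(B) >= f(A | B) + f(A & B) for all A, B
assert all(f(A) + f(B) >= f(A | B) + f(A & B) for A in subsets for B in subsets)

# the empty set satisfies the strict discrete-derivative conditions
# (the condition on elements of S* is vacuous for S* = emptyset) ...
assert all(f(frozenset({i})) - f(frozenset()) > 0 for i in ground)

# ... yet the unique minimizer is the full set
minimizer = min(subsets, key=f)
print(sorted(minimizer), f(minimizer))  # [0, 1] -2
```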
To see why your first-order conditions don't work, consider the corresponding Lovasz extension: $$ f^L(x)=2|x_1-x_2|-x_1-x_2 $$ We have $f^L(x_1,0) \ge f^L(0,0)$ for $x_1 \ge 0$, and $f^L(0,x_2)\ge f^L(0,0)$ for $x_2 \ge 0$, but $(0,0)$ is not a minimizer over the unit square (e.g. $f^L(1,1) = -2 < 0 = f^L(0,0)$).
If we wanted to prove that $(0,0)$ is a minimizer of the convex function $f^L(x_1,x_2)$ over the unit square, the correct first-order condition is that there is a single subgradient $(g_1,g_2) \in \partial f^L(0,0)$ satisfying both $g_1 \ge 0$ and $g_2 \ge 0$. Your condition only implies that there is one subgradient with $g_1 \ge 0$ and another one with $g_2 \ge 0$. (For differentiable convex functions the subgradient is unique, so there this would be a sufficient condition, but in general Lovasz extensions are not differentiable.)
In the above example, the subdifferential is: $$ \partial f^L(x_1,x_2) = \begin{cases} \{(1,-3)\} &\text{ if } x_1 > x_2 \\ \text{conv}\{(1,-3),(-3,1)\} &\text{ if } x_1=x_2 \\ \{(-3,1)\} &\text{ if } x_1 < x_2 \end{cases} $$ So to certify minimality with a first-order condition, we would use the fact that $(-1,-1)\in \partial f^L(1,1)$: both components are nonpositive, which implies that $\{1,2\}$ is a minimizer of $f$. Of course it's obvious here what the minimum is, but I think it's helpful to look at simple examples like this.
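To spell out the gap numerically (a grid scan for illustration, not a proof): on the diagonal the subgradients are $g(\lambda) = \lambda(1,-3) + (1-\lambda)(-3,1) = (4\lambda-3,\ 1-4\lambda)$ for $\lambda \in [0,1]$, and we can scan over $\lambda$:

```python
def g(lam):
    # subgradient lam*(1,-3) + (1-lam)*(-3,1) on the diagonal x1 == x2
    return (4 * lam - 3, 1 - 4 * lam)

lams = [k / 1000 for k in range(1001)]

# separately, some subgradient has g1 >= 0 (lam = 1) and some has g2 >= 0
# (lam = 0) -- that is all the discrete-derivative condition gives us ...
assert any(g(l)[0] >= 0 for l in lams) and any(g(l)[1] >= 0 for l in lams)

# ... but no SINGLE subgradient at (0,0) has both components >= 0,
# so minimality of (0,0) cannot be certified
assert not any(g(l)[0] >= 0 and g(l)[1] >= 0 for l in lams)

# at (1,1), taking lam = 1/2 gives the subgradient (-1,-1) <= 0,
# which certifies that (1,1) minimizes f^L over the unit square
print(g(0.5))  # (-1.0, -1.0)
```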