I have two functions $f(x)=(x-K)_+$ and $g(x) = \max\{x,K\}$ for $x \geq 0$, where $K \geq 0$ is a constant.
I was told that
$$f'(x) = \mathbb{I}_{[K,+\infty)}(x)$$ and $$f''(x)=\delta_K(x),$$
because $(\mathbb{I}_{[K,+\infty)}(x))'_x=\delta_K(x)$ and $f(x)=(x-K)_+ = (x-K)\,\mathbb{I}_{[K,+\infty)}(x)$, where $\delta_K(x)$ is the Dirac delta centered at $K$.
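One can test the claim $f' = \mathbb{I}_{[K,+\infty)}$ numerically through the defining identity $\langle f', \varphi\rangle = -\langle f, \varphi'\rangle$; here is a minimal NumPy sketch (the grid, the value of $K$, and the Gaussian test function are arbitrary choices of mine):

```python
import numpy as np

# check <f', phi> := -<f, phi'> against <indicator, phi> numerically
K = 1.0
x = np.linspace(-5.0, 10.0, 200_001)
dx = x[1] - x[0]

f = np.maximum(x - K, 0.0)            # f(x) = (x - K)_+
ind = (x >= K).astype(float)          # claimed derivative: indicator of [K, +inf)

phi = np.exp(-(x - 2.0) ** 2)         # smooth, rapidly decaying test function
dphi = np.gradient(phi, dx)           # phi'

lhs = -(f * dphi).sum() * dx          # -<f, phi'>, i.e. <f', phi>
rhs = (ind * phi).sum() * dx          # <indicator, phi>
print(abs(lhs - rhs))                 # should be close to 0
```

The two integrals agree up to discretization error, consistent with the stated derivative.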
Now I want to get the derivatives of $g(x)$, and I was told they are the same, but I can't understand why. My idea is that there should be something like this (for $x\geq 0$):
$$g(x) = \max\{x,K\} = x\mathbb{I}_{[K,+\infty)}(x)+K\mathbb{I}_{[0,K)}(x) = x\mathbb{I}_{[K,+\infty)}(x) + K(1 - \mathbb{I}_{[K,+\infty)}(x)) $$
so the first derivative is
$$g'(x) = \mathbb{I}_{[K,+\infty)}(x) + x\delta_K(x) - K\delta_K(x)$$
Here I'm unsure whether I'm doing everything correctly by representing the indicator as $\mathbb{I}_{[0,K)}(x) = 1 - \mathbb{I}_{[K,+\infty)}(x)$, but in any case I don't clearly understand how these derivatives were taken so as to obtain the same result as for $f$.
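For what it's worth, the decomposition itself checks out numerically; a small NumPy sketch (the value of $K$ and the grid are arbitrary):

```python
import numpy as np

# check g(x) = max(x, K) against the indicator decomposition on x >= 0
K = 2.0
x = np.linspace(0.0, 10.0, 1001)

ind = (x >= K).astype(float)          # I_{[K, +inf)}(x)
g_split = x * ind + K * (1.0 - ind)   # x*I_{[K,inf)} + K*I_{[0,K)}
g_max = np.maximum(x, K)

print(np.allclose(g_split, g_max))    # True
```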
Edit: here is more information on what's going on. I'm studying the optimal stopping problem for the Black–Scholes model $$dx(t) = x(t)(\mu\, dt + \sigma\, dW(t)),$$ with value function $$v(x) = \sup_{\tau \geq 0} \mathbb{E}[e^{-\beta \tau}g(x(\tau))],$$ where $\tau$ is the stopping time we seek.
$f$ and $g$ from above are call-option payoffs. Here I need to study whether the expression (variational inequality) $$\beta g - \mathcal{L}g - \hat{f}$$ is $\leq 0$ and not $\equiv 0$, where $g$ is my option payoff and the operator $\mathcal{L}$ is $$\mathcal{L}\varphi(x)=\mu x\varphi'(x)+\frac{1}{2}\sigma^2x^2\varphi''(x)$$ ($\hat{f}$ is not the function above and is $0$ in the example from class).
In class we did the following for the option $f$ from above:
$$ \beta f - \mathcal{L}f = \beta(x - K)_+ - \mu x \mathbb{I}_{[K,+\infty)}(x) - \frac{1}{2}\sigma^2x^2 \delta_K(x) \leq 0. $$
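If I read this correctly, the expression splits into a regular part and a singular part, using that $x^2\delta_K(x) = K^2\delta_K(x)$ (multiplying $\delta_K$ by a continuous function just evaluates that function at $K$):

$$\beta f - \mathcal{L}f = \begin{cases} 0, & 0 \le x < K,\\ \beta(x-K) - \mu x, & x > K, \end{cases} \qquad \text{plus the singular part } -\frac{1}{2}\sigma^2 K^2\,\delta_K(x),$$

and the singular part is $\leq 0$ as a measure.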
To begin with, you are speaking of distributions; that would be a good tag to add to your question. In the classical calculus sense, $f(x)=(x-K)_+ = \max(x-K,0)$ is differentiable for $x\neq K$ and has no derivative at $x=K$, but the theory of distributions gives a meaning to the derivative of $f$ as a distribution. An example of a book that covers distributions is Rudin's Functional Analysis, Chapter 6.
$f$ and $g$ define valid distributions because they are locally integrable, i.e., the integrals of $|f|$ and $|g|$ over compact sets are finite.
$g(x)=\max(x,K)=K+(x-K)_+=K+f(x)$, as noted in a comment while I was composing this answer. Since $g$ and $f$ differ by a constant, their distributional derivatives coincide.

You have now asked a new question: why $f'(x) = \mathbb{I}_{[K,+\infty)}(x)$ in the distribution sense. The short answer is that $f$ is absolutely continuous and differentiable almost everywhere, so its distributional derivative equals (the distribution corresponding to) its pointwise derivative; the distributional framework lets us ignore the single point where $f$ is not differentiable. See Rudin 6.14. The pointwise derivative of $f$ is $0$ for $x<K$, because $f$ is identically zero there, and $1$ for $x>K$, because $f(x)=x-K$ there.
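This also resolves the computation attempted in the question: since multiplying $\delta_K$ by a continuous function just evaluates it at $K$, the two delta terms cancel,

$$g'(x) = \mathbb{I}_{[K,+\infty)}(x) + x\delta_K(x) - K\delta_K(x) = \mathbb{I}_{[K,+\infty)}(x) + (x-K)\delta_K(x) = \mathbb{I}_{[K,+\infty)}(x),$$

because $x-K$ vanishes at $x=K$. Differentiating once more gives $g''(x)=\delta_K(x)$, matching the derivatives of $f$.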