I am trying to solve the following differential equation:
$$ u^{\prime \prime} = -\frac{F}{EA}\delta(x-L) $$
subject to the boundary conditions:
$$ u(0) = 0 \qquad \frac{du}{dx} \biggr\vert_{(x=L)} = 0 $$
I use the following famous sifting identity:
$$ \int_{-\infty}^\infty f(x)\delta(x-x_i)dx = f(x_i)$$
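As a quick sanity check of this identity, one can replace $\delta(x-x_i)$ by a narrow unit-area Gaussian and integrate numerically. This is only a sketch; the test function $f = \cos$, the location $x_i = 0.5$, and the width $10^{-3}$ are arbitrary choices, not part of the problem:

```python
import numpy as np

# Illustrate the sifting identity numerically: a narrow Gaussian stands
# in for delta(x - x_i). f, x_i, and eps are arbitrary test choices.
def delta_approx(x, xi, eps):
    """Unit-area Gaussian of width eps centered at xi."""
    return np.exp(-0.5 * ((x - xi) / eps) ** 2) / (eps * np.sqrt(2 * np.pi))

f = np.cos                          # sample integrand
xi, eps = 0.5, 1e-3                 # spike location and width
x = np.linspace(xi - 50 * eps, xi + 50 * eps, 200001)
dx = x[1] - x[0]
integral = np.sum(f(x) * delta_approx(x, xi, eps)) * dx   # Riemann sum
print(integral, f(xi))              # both close to cos(0.5)
```

As the width shrinks, the quadrature result converges to $f(x_i)$, exactly as the identity states.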
Hence, first integration gives:
$$ u^\prime (x) - u^\prime (0) = -\int_0^L \frac{F}{EA}\delta(x-L) dx $$
Then,
$$ u(x) - u(0) - u^\prime (0) x = -\int_0^L \int_0^L \frac{F}{EA}\delta(x-L) dx dx$$
Applying the boundary conditions:
$$ u^\prime (L) - u^\prime(0) = -\frac{F}{EA} $$
gives
$$ u^\prime(0) = \frac{F}{EA} $$
Substituting into the equation gives the total solution:
$$ u(x) -\frac{F}{EA} x = -\int_0^L \int_0^L \frac{F}{EA}\delta(x-L) dx dx $$
which, at $x = L$, gives
$$ u(L) = 0 $$
Yet I know that this is certainly not the solution. I also run into the same difficulty when the domain of integration is $(0,L)$ but the Dirac delta sits somewhere in the interior, say $\delta(x-0.5L)$. What am I missing here? Can someone help? Thanks.
Assume that the boundary condition $u'(L) = 0$ is meant as the one-sided limit $u'(L^+) = 0$.
Integrating the equation twice, where $H$ is the Heaviside step function and $C_1$, $C_2$ are integration constants (kept distinct from the cross-sectional area $A$): $$\begin{align} u'' &= -\frac{F}{EA} \delta(x-L), \\ u' &= -\frac{F}{EA} H(x-L) + C_1, \\ u &= -\frac{F}{EA} (x-L) \, H(x-L) + C_1 x + C_2. \end{align}$$
Using the boundary conditions: $$ 0 = u(0) = -\frac{F}{EA} (0-L) \, H(0-L) + C_1 \cdot 0 + C_2 = C_2, \\ 0 = u'(L^+) = -\frac{F}{EA} H(L^+ - L) + C_1 = -\frac{F}{EA} + C_1, $$ i.e. $$C_1 = \frac{F}{EA}, \quad C_2 = 0.$$
Thus, $$ u(x) = -\frac{F}{EA} (x-L) \, H(x-L) + \frac{F}{EA}x = \frac{F}{EA} \left( x - (x-L) \, H(x-L) \right) = \begin{cases} \frac{F}{EA} x, & x \leq L, \\ \frac{F}{EA} L, & x \geq L. \end{cases} $$
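This closed-form result can be cross-checked numerically. The sketch below uses assumed sample values $EA = 1$, $F = 2$, $L = 1$ and discretizes the bar with linear finite elements; a point load $F$ at the tip node $x = L$ is the weak-form counterpart of the $\delta(x-L)$ forcing, so the nodal solution should reproduce $u(x) = Fx/(EA)$ on $[0, L]$:

```python
import numpy as np

# Cross-check u(x) = F x / (EA) on [0, L] with a linear finite-element
# model of the bar: EA u'' = -F delta(x - L) describes a bar fixed at
# x = 0 with a point load F applied at the tip x = L.
EA, F, L, n = 1.0, 2.0, 1.0, 10    # assumed sample values; n elements
h = L / n
x = np.linspace(0.0, L, n + 1)

# Assemble the stiffness matrix over free nodes 1..n only, which
# enforces u(0) = 0, and put the point load F at the tip node.
K = np.zeros((n, n))
for e in range(n):
    ke = EA / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
    idx = [e - 1, e]               # free-node indices; -1 is the fixed node
    for a in range(2):
        for b in range(2):
            if idx[a] >= 0 and idx[b] >= 0:
                K[idx[a], idx[b]] += ke[a, b]
f = np.zeros(n)
f[-1] = F                          # point load at x = L

u = np.concatenate(([0.0], np.linalg.solve(K, f)))
print(np.max(np.abs(u - F * x / EA)))   # ~0: matches u(x) = F x / (EA)
```

For this problem the linear-element solution is nodally exact, so the discrepancy is at the level of floating-point roundoff regardless of the mesh size.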