I want to solve the following ODE:
$$\frac{d^2}{dx^2}X(x) + \alpha^2 X(x) = 0$$ $$ X(0) = X(L) $$ $$ \frac{d}{dx} X(0) = \frac{d}{dx}X(L) $$
So I know the general solution is given by:
$$ X(x) = c \cdot \cos(\alpha x) + d \cdot \sin(\alpha x) $$
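As a quick sanity check (my addition, not part of the original question), this general solution can be verified symbolically, e.g. with SymPy:

```python
import sympy as sp

x, alpha, c, d = sp.symbols('x alpha c d', real=True)

# Proposed general solution X(x) = c*cos(alpha*x) + d*sin(alpha*x)
X = c * sp.cos(alpha * x) + d * sp.sin(alpha * x)

# Substitute into X'' + alpha^2 * X; the residual should vanish identically
residual = sp.simplify(sp.diff(X, x, 2) + alpha**2 * X)
print(residual)  # 0
```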
But I am having trouble with the boundary conditions. The conditions imply:
- $ c = c \cdot \cos(\alpha L) + d \cdot \sin(\alpha L)$
- $ \alpha d = \alpha d \cdot \cos(\alpha L) - c \alpha \cdot \sin(\alpha L)$
If I add these equations, I get:
$$ c + \alpha d = (c + \alpha d) \cdot \cos(\alpha L) + (d - \alpha c) \cdot \sin(\alpha L) $$
Now, I can divide by $c + \alpha d$, and if I force $\frac{d - \alpha c}{c + \alpha d} = 1$, I get $1 = \cos(\alpha L) + \sin(\alpha L)$, which definitely has periodic solutions $\alpha_n$. But somehow this doesn't seem right to me.
Question:
How do you solve this problem?
Okay, so I messed it up: the second condition should be divided by $\alpha$ first (assuming $\alpha \neq 0$) before adding. The equation really is:
$$ c + d = (c+d) \cdot \cos(\alpha L) + (d-c) \cdot \sin(\alpha L) $$
By the same token then, if we divide by $c+d$, we get the condition that:
$$ \frac{d-c}{c+d} = 1 \; \; \rightarrow \; \; c = 0 $$
This reduces the boundary conditions to:
$$ 0 = d \cdot \sin(\alpha L) $$ $$ d = d \cdot \cos(\alpha L) $$
So it must be that $\alpha_{n} = \frac{2 \pi n}{L}$ for $n=0,1,2,...$
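As a numerical sanity check (my addition, with arbitrary illustrative values for $c$, $d$, and $L$), these $\alpha_n$ satisfy both periodic boundary conditions for any coefficients:

```python
import math

# Illustrative values (arbitrary choices, not from the original problem)
L, c, d = 3.0, 1.7, -0.4

def X(x, alpha):
    """X(x) = c*cos(alpha*x) + d*sin(alpha*x)"""
    return c * math.cos(alpha * x) + d * math.sin(alpha * x)

def dX(x, alpha):
    """X'(x) = -c*alpha*sin(alpha*x) + d*alpha*cos(alpha*x)"""
    return -c * alpha * math.sin(alpha * x) + d * alpha * math.cos(alpha * x)

# Check X(0) = X(L) and X'(0) = X'(L) at alpha_n = 2*pi*n/L
for n in range(4):
    alpha = 2 * math.pi * n / L
    assert abs(X(0, alpha) - X(L, alpha)) < 1e-12
    assert abs(dX(0, alpha) - dX(L, alpha)) < 1e-12
print("periodic boundary conditions satisfied at every alpha_n")
```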
What do you think?
Do I need to set $c = 0$ here? Also, it seems we still have the other $\alpha$ solutions, since we still get the $1 = \cos(\alpha L) + \sin(\alpha L)$ condition...
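The leftover-roots worry can be checked numerically (a quick check I'm adding, not part of the original post): within one period, $1 = \cos x + \sin x$ is solved by both $x = 0$ and $x = \pi/2$, but only the first also satisfies $\cos x = 1$ and $\sin x = 0$ separately, which is what the reduced boundary conditions demand:

```python
import math

# Both candidate roots of 1 = cos(x) + sin(x) on [0, 2*pi):
for x in (0.0, math.pi / 2):
    assert abs(math.cos(x) + math.sin(x) - 1) < 1e-12  # both solve the sum

# But the boundary conditions require cos(alpha*L) = 1 AND sin(alpha*L) = 0
# separately; x = 0 (i.e. x = 2*pi*n) passes, x = pi/2 does not:
assert abs(math.cos(0.0) - 1) < 1e-12 and abs(math.sin(0.0)) < 1e-12
x = math.pi / 2
assert not (abs(math.cos(x) - 1) < 1e-12 and abs(math.sin(x)) < 1e-12)
print("only the x = 2*pi*n family satisfies both conditions")
```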
Edit: Well, if $c \neq 0$, the solutions $\alpha_{n} = \frac{2 \pi n}{L}$ still hold, despite $\frac{d-c}{c+d} \neq 1$, because then $\sin(\alpha_n L) = 0$ and $\cos(\alpha_n L) = 1$, so both boundary conditions are satisfied for any $c$ and $d$.
Edit: According to https://www.wolframalpha.com/input?i=find+roots+of+1+%3D+cos%28x%29+%2B+sin%28x%29, the roots of $1 = \cos(x) + \sin(x)$ are $x = 2 \pi n$ together with $x = \frac{\pi}{2} + 2 \pi n$. Only the first family also satisfies $\cos(x) = 1$ and $\sin(x) = 0$ separately, which is what the boundary conditions actually require, so $\alpha_n = \frac{2 \pi n}{L}$ holds in all cases.
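For completeness, the case analysis on $c$ can be avoided entirely with a standard eigenvalue argument (my addition, not part of the original derivation): divide the second boundary condition by $\alpha$ (assuming $\alpha \neq 0$) and treat the pair as a homogeneous linear system in $(c, d)$. A nontrivial solution exists only if the determinant vanishes:

$$ \begin{pmatrix} 1 - \cos(\alpha L) & -\sin(\alpha L) \\ \sin(\alpha L) & 1 - \cos(\alpha L) \end{pmatrix} \begin{pmatrix} c \\ d \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} $$

$$ \big(1 - \cos(\alpha L)\big)^2 + \sin^2(\alpha L) = 2\big(1 - \cos(\alpha L)\big) = 0 \;\; \Rightarrow \;\; \cos(\alpha L) = 1 \;\; \Rightarrow \;\; \alpha_n = \frac{2 \pi n}{L} $$

Note that $\cos(\alpha_n L) = 1$ automatically forces $\sin(\alpha_n L) = 0$, so at each $\alpha_n$ the entire matrix vanishes and both $c$ and $d$ remain free.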
