I'm struggling with the following question.
Construct leading-order inner and outer solutions to: $$ \epsilon u''(x) + u'(x) = \frac{u(x) + u(x)^3}{1 + 3u(x)^2}, $$ $$ 0<x<1, $$ $$ u(0)=0, $$ $$ u(1)=1, $$ where $ 0< \epsilon \ll 1 $. [You will only be able to determine the outer solution implicitly.]
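As a sanity reference for the asymptotics below, the full BVP can be solved numerically at a small-but-finite $\epsilon$. This is just a sketch (assuming SciPy is available; $\epsilon = 0.1$ is an illustrative choice, not from the problem):

```python
# Numerical reference solution of  eps*u'' + u' = (u + u^3)/(1 + 3u^2),
# u(0)=0, u(1)=1, via SciPy's collocation BVP solver.
import numpy as np
from scipy.integrate import solve_bvp

EPS = 0.1  # illustrative small parameter

def rhs(x, y):
    # First-order system: y[0] = u, y[1] = u';  eps*u'' = f(u) - u'.
    f = (y[0] + y[0]**3) / (1 + 3 * y[0]**2)
    return np.vstack([y[1], (f - y[1]) / EPS])

def bc(ya, yb):
    # Residuals of u(0) = 0 and u(1) = 1.
    return np.array([ya[0], yb[0] - 1.0])

x = np.linspace(0, 1, 200)
y_guess = np.vstack([x, np.ones_like(x)])  # crude linear initial guess
sol = solve_bvp(rhs, bc, x, y_guess, max_nodes=100000)
print(sol.status, sol.sol(0.0)[0], sol.sol(1.0)[0])
```

Plotting `sol.sol(x)[0]` shows a thin layer near $x=0$ where $u$ jumps from $0$ up to an $O(1)$ value, which is what the inner/outer construction should reproduce.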
Here's what I've done so far, though I'm not fully confident it's all correct.
Since the coefficient of $u'(x)$ is $+1 > 0$, the boundary layer is at $x=0$.
Outer expansion $u_0(x)$ satisfies: $$ u_0'(x) = \frac{u_0(x) + u_0(x)^3}{1 + 3u_0(x)^2}. $$ Separating variables (note that $\frac{d}{du}\ln\left(u+u^3\right) = \frac{1+3u^2}{u+u^3}$) gives $$ \ln\left(u_0(x)\left(1+u_0(x)^2\right)\right)=x+C, $$ where $C$ is a constant of integration. Plugging in the boundary condition $u_0(1)=1$ gives $\ln 2 = 1 + C$, so $C=\ln 2 - 1$, and hence the outer expansion implicitly satisfies $$ u_0(x)\left(1+u_0(x)^2\right) = 2e^{x-1}. $$
In the stretched inner variable $X = x/\epsilon$, the leading-order inner expansion $U_0(X)$ satisfies $$ U_0''(X) + U_0'(X) = 0. $$ This gives solution $$ U_0(X) = A + Be^{-X}, $$ where $A$ and $B$ are constants of integration. Applying the other boundary condition $U_0(0)=0$ gives $A+B=0$, so $B=-A$, and hence we have inner expansion $$ U_0(X) = A(1 - e^{-X}). $$
This is where I don't know how to continue. To find $A$, I need to use matching, i.e. set the outer limit of the inner solution equal to the inner limit of the outer solution. The outer limit of the inner solution is: $$ \lim_{X \to \infty}U_0(X) = \lim_{X \to \infty}A(1-e^{-X}) = A. $$
The inner limit of the outer solution, $\lim_{x \to 0^+}u_0(x)$, satisfies: $$ \lim_{x \to 0^+}u_0(x)\left(1 + u_0(x)^2\right) = 2e^{-1}. $$
Do I then just set $A + A^3 = 2e^{-1}$? I assume not, as this doesn't have a neat closed-form solution. Perhaps I went wrong further back?
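One remark on the matching condition: with $u_0(1)=1$ the integration constant is $C = \ln 2 - 1$, so matching requires $A + A^3 = 2e^{-1}$. Since the cubic $A^3 + A - c$ has negative discriminant for $c > 0$, there is exactly one real root, so $A$ is determined uniquely even if it has no neat closed form. A quick numerical check (assuming NumPy is available):

```python
# Find the unique real root of the matching cubic  A^3 + A - 2/e = 0.
import numpy as np

coeffs = [1.0, 0.0, 1.0, -2.0 / np.e]  # coefficients of A^3 + 0*A^2 + A - 2/e
roots = np.roots(coeffs)
real_roots = roots[np.abs(roots.imag) < 1e-9].real  # keep the real root only
A = real_roots[0]
print(A)
```

The printed value is the leading-order height of the boundary layer, i.e. the value $u$ approaches just outside the layer at $x=0$.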
Thank you so much in advance for any help!