Engineering a soliton in an advection-diffusion equation


I have a standard advection-diffusion equation $$\dot\psi = \nabla \cdot (\psi u) + \epsilon \Delta \psi$$ where $u$ is a time-independent vector field, $\psi$ is the advected density, and $\epsilon$ is the rate of diffusion. In this form, if I initialize $\psi$ to a low-variance Gaussian, it diffuses away into nothingness. Is there a modification of the equation under which the low-variance Gaussian becomes a soliton of the modified PDE? $$\dot\psi = \nabla \cdot (\psi u) + \epsilon \Delta \psi + R(\psi,u)$$ That is, how do I choose $R(\psi,u)$ so that an initial delta function diffuses only a little at first, but afterwards more or less just advects?

$u$ isn't necessarily divergence-free, but if that assumption makes things easier, I'm happy to make it.
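To make the baseline behavior concrete: for constant $u$ in one spatial dimension, the unmodified equation reduces to $\psi_t = u\psi_x + \epsilon\psi_{xx}$, whose unit-mass Gaussian solution keeps its mass while its variance grows as $\sigma_0^2 + 2\epsilon t$, so the peak decays like $t^{-1/2}$. A minimal sketch of that decay (the constants below are illustrative, not from the question):

```python
import math

eps = 0.05        # diffusion rate (illustrative)
sigma2_0 = 0.01   # initial variance of the low-variance Gaussian (illustrative)

def peak(t):
    # Exact unit-mass solution: the Gaussian stays Gaussian, its variance
    # grows as sigma2_0 + 2*eps*t, so the peak amplitude is
    # 1/sqrt(2*pi*(sigma2_0 + 2*eps*t)).
    return 1.0 / math.sqrt(2.0 * math.pi * (sigma2_0 + 2.0 * eps * t))

for t in (0.0, 1.0, 10.0, 100.0):
    print(f"t={t:6.1f}  peak={peak(t):.4f}")
```

The advection merely translates the profile; only the diffusion term controls this decay, which is what the question wants $R$ to counteract.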

Accepted answer:

The original PDE lacks the nonlinearity needed to support solitary-wave solutions, so we expect $R$ to be nonlinear in $\psi$. For simplicity, assume $u$ is constant and work in one spatial dimension. Making the traveling-wave ansatz $\psi(x,t)=\psi(\xi)$ with $\xi = x-ct$ and substituting into the modified PDE gives $$ \epsilon \psi'' + (c+u)\psi' + R(\psi,u) = 0 . $$ With $c=-u$ the advective term drops out, and suitable quadratic choices of $R$ yield solitary-wave solutions (see this link, Sec. Soliton Solutions). For instance, $R(\psi, u) = \epsilon\, (u\psi - 3\psi^2)$ is appropriate: for $u<0$ it admits the soliton profile $\psi(\xi) = \tfrac{u}{2}\operatorname{sech}^2\!\big(\tfrac{\sqrt{-u}}{2}\,\xi\big)$.
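As a sanity check, the claimed profile can be verified numerically: with $u<0$, $\psi(\xi) = \tfrac{u}{2}\operatorname{sech}^2(\tfrac{\sqrt{-u}}{2}\xi)$ should make the residual of $\epsilon\psi'' + \epsilon(u\psi - 3\psi^2)$ vanish. A sketch using a finite-difference second derivative (the values of $u$ and $\epsilon$ are arbitrary test values):

```python
import math

u = -1.0    # advection speed; this branch of the solution needs u < 0
eps = 0.05  # diffusion rate (scales the residual but cancels from the ODE)

k = math.sqrt(-u) / 2.0   # inverse width of the sech^2 profile
A = u / 2.0               # amplitude (negative for u < 0)

def psi(xi):
    return A / math.cosh(k * xi) ** 2

# Residual of eps*psi'' + R(psi, u) with R = eps*(u*psi - 3*psi^2),
# using a centered finite difference for psi''.
h = 1e-3
max_res = 0.0
for i in range(-200, 201):
    xi = 0.05 * i
    d2 = (psi(xi + h) - 2.0 * psi(xi) + psi(xi - h)) / h**2
    res = eps * (d2 + u * psi(xi) - 3.0 * psi(xi) ** 2)
    max_res = max(max_res, abs(res))

print(max_res)  # limited only by finite-difference error
```

The residual is at the level of the $O(h^2)$ discretization error, confirming that this quadratic $R$ turns the sech$^2$ pulse into an exact traveling-wave solution with speed $c=-u$.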