How to remove divergence from any vector field?


I am trying to simulate fluids in 2D. I have a vector field that represents the flow: a continuous function mapping 2D positions to 2D velocities. After each simulation step the field changes, and the new field may have non-zero divergence. This is undesirable because I want to simulate an incompressible fluid, so I want to make the divergence zero at all points while changing the field as little as possible. How can I compute this efficiently and accurately?
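For concreteness, here is how I measure the divergence on a uniform grid. This is just a central-difference sketch of my own (assuming periodic boundaries and NumPy; the function name is mine):

```python
import numpy as np

def divergence(vx, vy, h=1.0):
    """Central-difference divergence of a 2D velocity field sampled on a
    uniform grid with spacing h, with periodic boundaries via np.roll.
    vx, vy: 2D arrays of the x- and y-components (axis 0 = y, axis 1 = x).
    """
    dvx_dx = (np.roll(vx, -1, axis=1) - np.roll(vx, 1, axis=1)) / (2 * h)
    dvy_dy = (np.roll(vy, -1, axis=0) - np.roll(vy, 1, axis=0)) / (2 * h)
    return dvx_dx + dvy_dy
```

After my simulation step, `divergence(vx, vy)` is no longer zero everywhere, and I want to project the field back so that it is.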

My failed attempts:

  • I found the Helmholtz decomposition, which could be useful, but I don't think it tries to stay as close as possible to the original vector field with non-zero divergence.

  • I could not get this to work. The formula there does not even make sense to me.


There is 1 answer below.

BEST ANSWER

The solution that worked for me was the Hodge decomposition. As it turns out, the divergence-free part of the decomposition is the divergence-free field closest to the original in the L2 sense, which is exactly the "change as little as possible" property I wanted.

I found exactly what I was looking for here; it even includes an example implementation.

In hindsight, I should have asked this question on the gamedev site. What I was trying to achieve is similar to Plasma Pong.