The question is simple: could analog computers (for example, electrical analog computers) in effect avoid the numerical instability problems that arise when solving a set of PDEs over a discretized domain? Progress on significant physical problems (the computational aspects, at least) is often substantially delayed by time spent figuring out how to avoid numerical instabilities, and the solution ultimately adopted is often just to hit the problem with lots of artificial diffusion, which seems unrigorous at best.
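To make the kind of instability I mean concrete, here is a minimal sketch (my own illustration, not from any particular solver) of the textbook case: the linear advection equation discretized with forward-time centered-space (FTCS) differencing, which is unconditionally unstable, versus first-order upwind differencing, which is stable precisely because it implicitly adds artificial diffusion.

```python
import numpy as np

# Advect u_t + c u_x = 0 on a periodic grid with a Gaussian pulse.
nx = 100
c = 1.0
dx = 1.0 / nx
dt = 0.5 * dx / c  # CFL number 0.5, well within the usual stability limit
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u0 = np.exp(-200.0 * (x - 0.5) ** 2)

def step_ftcs(u):
    # Forward Euler in time, centered difference in space:
    # unconditionally unstable for pure advection.
    return u - c * dt / (2 * dx) * (np.roll(u, -1) - np.roll(u, 1))

def step_upwind(u):
    # One-sided (upwind) difference: equivalent to FTCS plus a
    # numerical-diffusion term, which is what stabilizes it.
    return u - c * dt / dx * (u - np.roll(u, 1))

u_ftcs = u0.copy()
u_up = u0.copy()
for _ in range(1000):
    u_ftcs = step_ftcs(u_ftcs)
    u_up = step_upwind(u_up)

print("FTCS   max |u|:", np.max(np.abs(u_ftcs)))  # grows without bound
print("upwind max |u|:", np.max(np.abs(u_up)))    # bounded, but smeared out
```

Both schemes are consistent with the same PDE; the only thing separating the stable one from the unstable one is the extra diffusion, which is exactly the trade-off the question is about.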
Also, since many physical theories are themselves significant approximations, perhaps some accuracy could be sacrificed (by moving to analog computers) in exchange for the greater benefit of avoiding numerical instabilities.