Help on implementing Optimal Interpolation algorithm


I am trying to implement an algorithm known as Optimal Interpolation in Python. It is used to form a gridded dataset by assimilating point observations onto a background field. Here are a few resources:

I believe I have computed the necessary ingredients, but I cannot figure out the maths behind fitting them all together. I am trying to assimilate point station data onto a background field derived from satellite estimates. This is what I have calculated:

  • Background error variance field (the background field minus a 'truth' field, squared, then averaged over time)
  • Observation error variance field (obtained by multiplying the background error variance field by an empirical constant, Rz, following page 113 of Daley, 1991)
  • Vector of correlations between the analysis point and the stations (computed with the Gaussian correlation model, equation 3.47 in the first link, using the distances from the analysis point to each station)
  • Increment/innovation vector (the vector of station values minus the background field values at each station)
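For reference, here is a minimal sketch of how I compute these ingredients with NumPy. The station distances, length scale L, and Rz value are placeholders, and the background/truth arrays are random stand-ins for my real data:

```python
import numpy as np

# Toy dimensions: nt time steps, ns stations (placeholder values)
nt, ns = 100, 3
rng = np.random.default_rng(0)

# Background and "truth" values at the station locations over time
background = rng.normal(size=(nt, ns))
truth = background + rng.normal(scale=0.5, size=(nt, ns))

# Background error variance at each station:
# (background - truth)^2 averaged over time
sigma_b2 = np.mean((background - truth) ** 2, axis=0)

# Observation error variance via an empirical ratio Rz (Daley 1991, p. 113)
Rz = 0.25  # placeholder empirical constant
sigma_o2 = Rz * sigma_b2

# Gaussian correlation of the analysis point with each station,
# rho = exp(-d^2 / (2 L^2)), from analysis-point-to-station distances
d = np.array([10.0, 25.0, 40.0])  # km, placeholder distances
L = 30.0                          # km, placeholder length scale
rho = np.exp(-d ** 2 / (2 * L ** 2))

# Innovation vector: observations minus background values at the stations
obs = rng.normal(size=ns)
innovation = obs - background[-1]
```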

What I am confused about is how to get the background error covariance matrix and the observation error covariance matrix from these ingredients. I suspect it has something to do with the vector of correlations between the analysis point and the stations, but I don't know what to do with it.

If I have these two matrices, what is the next step? I gather it involves the inverse of the sum of the background error covariance matrix and the observation error covariance matrix. In link #3, I need the estimated data covariance vector and the data covariance matrix.
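To make my current (possibly wrong) understanding concrete, here is how I imagine the pieces would combine, as a sketch with made-up numbers for three stations. The matrix construction is my guess: B built from error standard deviations and station-to-station Gaussian correlations, R diagonal, and then the increment from the inverse of their sum:

```python
import numpy as np

# Placeholder ingredients for 3 stations (my guessed construction)
sigma_b = np.array([1.0, 0.8, 1.2])   # background error std dev at stations
sigma_o2 = 0.25 * sigma_b ** 2        # obs error variance (Rz = 0.25)
L = 30.0                              # km, placeholder length scale

# Station-to-station and analysis-point-to-station distances (km, made up)
d_ss = np.array([[ 0.0, 20.0, 35.0],
                 [20.0,  0.0, 15.0],
                 [35.0, 15.0,  0.0]])
d_as = np.array([10.0, 25.0, 40.0])

def corr(d):
    # Gaussian correlation model, rho = exp(-d^2 / (2 L^2))
    return np.exp(-d ** 2 / (2 * L ** 2))

# Background error covariance between stations: B_ij = sigma_i sigma_j rho(d_ij)
B = np.outer(sigma_b, sigma_b) * corr(d_ss)
# Observation errors assumed mutually uncorrelated -> diagonal R
R = np.diag(sigma_o2)
# Covariance of the analysis point with each station (what I think link #3
# calls the estimated data covariance vector)
sigma_a = 1.0  # background error std dev at the analysis point, placeholder
b = sigma_a * sigma_b * corr(d_as)

innovation = np.array([0.5, -0.2, 0.3])  # obs minus background at stations

# Analysis increment at the analysis point: b^T (B + R)^{-1} * innovation
weights = np.linalg.solve(B + R, innovation)
increment = b @ weights
```

Is this the right way to assemble B, R, and the covariance vector, or am I misusing the correlation model here?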

(image: analysis point equation)

Even a simple worked 3 × 3 example would be valuable. Thank you.