I need to build a model for camera lens deformation for a project I'm working on right now.
To do this, I'm trying to generate a Gaussian distribution of multiplicative coefficients over a finite array. What I would like is to fix the max (1 in my case) and min of these coefficients, and then "Gaussianly" sample the points between these two for the other cells of the array.
So far I've managed to distribute the coefficients by hand-tuning the "right" mean and deviation values, but this is definitely not the way I'd like this to work.
Also, since it's a camera lens, I'll then need to translate this to 2D. My understanding is that if I do this vertically and horizontally and multiply the two values, I should get the right 2D Gaussian value, right? I'm assuming there should be no correlation between the two axes.
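For what it's worth, that assumption holds: an uncorrelated (isotropic) 2D Gaussian is separable, so the product of the two 1D values equals the 2D value directly. A minimal sketch to convince yourself (names and numbers are just illustrative):

```csharp
using System;

class SeparableGaussianDemo
{
    // 1D Gaussian bump with peak value 1 at x = 0.
    static double Gauss1D(double x, double sigma)
        => Math.Exp(-(x * x) / (2.0 * sigma * sigma));

    static void Main()
    {
        double sigma = 3.0;
        double x = 1.5, y = -2.0;

        // Product of the two 1D values...
        double product = Gauss1D(x, sigma) * Gauss1D(y, sigma);

        // ...equals the isotropic 2D Gaussian evaluated at (x, y),
        // since exp(-x^2/2s^2) * exp(-y^2/2s^2) = exp(-(x^2+y^2)/2s^2).
        double direct = Math.Exp(-(x * x + y * y) / (2.0 * sigma * sigma));

        Console.WriteLine(product);
        Console.WriteLine(direct); // prints the same value as product
    }
}
```

Note this only works because the axes are assumed uncorrelated; with correlation you'd need the full covariance matrix.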
For reference, I'm leaving here the dummy code (C#) I wrote just so I could continue with the rest of the project while having something remotely resembling this kind of distribution (it produces a truncated cone).
private float DeformCameraLens(int index1, int index2, Vector3[,] projectedRaysArray)
{
    int max1 = projectedRaysArray.GetLength(0);
    int max2 = projectedRaysArray.GetLength(1);

    // Linear falloff per index step, so the coefficient goes from
    // 1 at the centre down to maxQOVPercentDeformation at the edges.
    float percentPerIndexPoint1 = (1 - maxQOVPercentDeformation) / (max1 / 2f);
    float percentPerIndexPoint2 = (1 - maxQOVPercentDeformation) / (max2 / 2f);

    // Coefficient along each axis, based on distance from the centre.
    float percent1 = 1 - percentPerIndexPoint1 * Mathf.Abs(index1 - max1 / 2f);
    float percent2 = 1 - percentPerIndexPoint2 * Mathf.Abs(index2 - max2 / 2f);

    // Averaging the two axes is what gives the truncated-cone shape.
    return (percent1 + percent2) / 2f;
}
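In case it helps frame the question: rather than guessing mean and deviation, you can solve for sigma so the curve hits exactly your chosen minimum at the array edge. From min = exp(-d²/(2σ²)) at edge distance d, you get σ = d / sqrt(2·ln(1/min)). A hedged sketch of that idea (plain C#, method and parameter names are my own, not from the code above):

```csharp
using System;

class LensDeformationSketch
{
    // Returns a [size] array of coefficients: exactly 1.0 at the centre,
    // exactly minValue at both edges, Gaussian-shaped in between.
    static double[] GaussianCoefficients(int size, double minValue)
    {
        double centre = (size - 1) / 2.0;
        double edgeDist = centre; // distance from the centre to index 0

        // Solve minValue = exp(-edgeDist^2 / (2 sigma^2)) for sigma.
        double sigma = edgeDist / Math.Sqrt(2.0 * Math.Log(1.0 / minValue));

        var coeffs = new double[size];
        for (int i = 0; i < size; i++)
        {
            double d = i - centre;
            coeffs[i] = Math.Exp(-(d * d) / (2.0 * sigma * sigma));
        }
        return coeffs;
    }

    static void Main()
    {
        var c = GaussianCoefficients(9, 0.5);
        Console.WriteLine(c[4]); // 1.0 at the centre
        Console.WriteLine(c[0]); // 0.5 at the edge, by construction

        // For the 2D lens, the coefficient at (i, j) would then be
        // rowCoeffs[i] * colCoeffs[j], given no correlation between axes.
    }
}
```

This replaces the cone's linear falloff with a true Gaussian one while keeping the max/min pinned, which sounds like what you're after.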