I have a game with a fixed camera pointing at a 2D plane.
It was designed to fit a 1920 x 1080 resolution perfectly.
If the user's chosen display ratio differs, I want the 2D plane to keep its aspect ratio, with black bars wherever the field of view captures more than just the plane.
My approach was to use the base aspect ratio as a benchmark and apply the aspect-ratio difference as a multiplier:
// Reference values for the 1920 x 1080 design resolution.
double _rootAspectRatio = 1920.0 / 1080.0; // ≈ 1.7778
double _rootFieldOfView = 55;
// Current display aspect ratio.
double aspectRatio = (double)Screen.width / (double)Screen.height;
// Linear scaling: degrees of FOV per unit of aspect ratio.
double fieldOfViewPerAspectRatio = _rootFieldOfView / _rootAspectRatio;
double aspectRatioDifference = _rootAspectRatio - aspectRatio;
_mainCamera.fieldOfView += (float)(aspectRatioDifference * fieldOfViewPerAspectRatio);
This worked for a good chunk of resolutions, but it failed on some others, e.g. 2436 x 1125 (aspect ratio ≈ 2.17), where the camera ends up way too close.
Can you advise a better way to calculate what the field of view should be when the resolution changes, given that I want to preserve the 1920 x 1080 aspect ratio?
Thank you.

I found a solution:
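For reference, here is a sketch of the standard trigonometric approach (not necessarily the exact solution referenced above; the class and field names are illustrative). The key facts are that Unity's `Camera.fieldOfView` is the *vertical* FOV, and that the horizontal and vertical FOVs are related by tan(hFov/2) = tan(vFov/2) × aspect. FOV therefore does not scale linearly with aspect ratio, which is why a linear multiplier drifts at extreme aspects. To letterbox a 16:9 design: when the display is wider than 16:9, the base vertical FOV already shows the full design width plus extra at the sides, so it is left unchanged; when it is narrower, the vertical FOV is widened so the horizontal FOV stays constant.

```csharp
using UnityEngine;

public class LetterboxFov : MonoBehaviour
{
    // Reference values for the 1920 x 1080 design resolution.
    const float BaseAspect = 1920f / 1080f; // ≈ 1.7778
    const float BaseVerticalFov = 55f;      // degrees; Unity's fieldOfView is vertical

    [SerializeField] Camera _mainCamera;

    void Start()
    {
        float aspect = (float)Screen.width / Screen.height;

        if (aspect >= BaseAspect)
        {
            // Wider than 16:9: the same vertical FOV already covers the full
            // design width, with extra visible at the sides. Keep it as-is.
            _mainCamera.fieldOfView = BaseVerticalFov;
        }
        else
        {
            // Narrower than 16:9: widen the vertical FOV so the horizontal
            // FOV stays constant. From tan(hFov/2) = tan(vFov/2) * aspect:
            //   tan(vNew/2) = tan(vBase/2) * BaseAspect / aspect
            float halfBaseRad = BaseVerticalFov * 0.5f * Mathf.Deg2Rad;
            float halfNewRad = Mathf.Atan(Mathf.Tan(halfBaseRad) * BaseAspect / aspect);
            _mainCamera.fieldOfView = halfNewRad * 2f * Mathf.Rad2Deg;
        }
    }
}
```

With this, a 2436 x 1125 display (aspect ≈ 2.17 > 1.7778) simply keeps the base FOV of 55° instead of having it reduced, so the camera no longer zooms in. An alternative that avoids touching the FOV at all is to letterbox via `Camera.rect`, shrinking the viewport to a 16:9 sub-rectangle of the screen.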