Originally posted on GIS Stack Exchange, but I'm not sure it belongs there.
Given a starting latitude, longitude, and altitude, and a line of sight defined by azimuth and elevation, I want to find the latitude and longitude where the line of sight pierces a given altitude (assuming a spherical Earth; WGS84 or another Earth model would also be fine).
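For what it's worth, under a spherical-Earth assumption I believe the problem reduces to a plane triangle with vertices at the Earth's centre, the camera, and the pierce point. Writing $r_c = R + \text{alt}_{\text{cam}}$ for the camera's distance from the centre, $r_t = R + h$ for the radius of the target shell, and $e$ for the elevation angle, the interior angle at the camera is $90^\circ + e$, so the law of sines gives

$$
\sin\beta = \frac{r_c}{r_t}\cos e,
\qquad
\psi = 90^\circ - e - \beta,
$$

where $\beta$ is the angle at the pierce point and $\psi$ is the Earth-central angle between the camera and the pierce point. The destination latitude/longitude then follows from the standard great-circle "destination point" formulas, using the azimuth as the bearing and $\psi$ as the angular distance.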
Background:
I have a set of data points from an all-sky camera (i.e. a wide-angle lens pointing at the sky). The data show auroral emissions. I have elevation and azimuth for each pixel (each array element), and I know the coordinates (latitude, longitude, altitude) of the camera. Based on the assumption that the auroral emissions are coming from a given altitude above ground (e.g. 150 km), I want to show the data on a map.
Some notes regarding implementation: I aim to implement the calculation in Python (using NumPy), and if possible would like an efficient way of doing this (i.e. preferably not element by element), since the data arrays contain upwards of 200k pixels. The distance between pixels is on the order of a few km, which should give an idea of the precision needed.
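In case it helps frame an answer, here is a minimal sketch of the kind of vectorized approach I have in mind, assuming a spherical Earth (mean radius 6371 km), ignoring refraction, and using the law-of-sines geometry plus the standard great-circle destination formulas. The function name `los_to_latlon` and all parameter names are just placeholders for illustration; `az` and `el` can be full 2-D pixel arrays, so there is no per-pixel loop.

```python
import numpy as np

R_EARTH = 6371.0  # km; assumed spherical mean Earth radius

def los_to_latlon(lat_cam, lon_cam, alt_cam, az, el, h):
    """Map lines of sight from a camera at (lat_cam, lon_cam, alt_cam)
    to the lat/lon where they pierce the shell at altitude h.

    lat_cam, lon_cam : camera position in degrees
    alt_cam, h       : altitudes in km (alt_cam must be below h)
    az, el           : azimuth/elevation in degrees, scalars or numpy arrays
    Returns (lat, lon) in degrees, same shape as az/el.
    """
    lat1 = np.radians(lat_cam)
    lon1 = np.radians(lon_cam)
    az = np.radians(np.asarray(az, dtype=float))
    el = np.radians(np.asarray(el, dtype=float))

    r_cam = R_EARTH + alt_cam  # camera's distance from Earth's centre
    r_tgt = R_EARTH + h        # radius of the emission shell

    # Triangle (Earth centre, camera, pierce point): the interior angle
    # at the camera is 90 deg + elevation, so the law of sines gives the
    # angle at the pierce point ...
    beta = np.arcsin(r_cam * np.cos(el) / r_tgt)
    # ... and the Earth-central angle between camera and pierce point.
    psi = np.pi / 2 - el - beta

    # Standard great-circle destination formulas, with the azimuth as
    # bearing and psi as angular distance.
    lat2 = np.arcsin(np.sin(lat1) * np.cos(psi)
                     + np.cos(lat1) * np.sin(psi) * np.cos(az))
    lon2 = lon1 + np.arctan2(np.sin(az) * np.sin(psi) * np.cos(lat1),
                             np.cos(psi) - np.sin(lat1) * np.sin(lat2))
    return np.degrees(lat2), np.degrees(lon2)
```

As a sanity check, a pixel looking straight up (elevation 90°) should map back to the camera's own latitude/longitude, e.g. `los_to_latlon(69.6, 19.2, 0.1, 0.0, 90.0, 150.0)` (a hypothetical camera near Tromsø). Whether this is accurate enough compared to an ellipsoidal (WGS84) treatment at a few-km pixel scale is part of what I'm asking.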