I have the following equation: $\int_{\alpha_0}^{\alpha_1} g(\alpha)f( g(\alpha) \cdot x) d\alpha = h(x)$
where $g(\alpha)$ and $h(x)$ are both given functions. Specifically, $g(\alpha)=\cos(\theta - \alpha)$ where $\theta$ is a constant, and $h(x)$ is only known numerically.
How can I find $f(x)$ analytically or numerically?
UPDATE 2 (analytical solution): The analytical solution can be obtained by transforming the equation into the generalized Abel's integral equation (see Wazwaz AM. (2011) Abel’s Integral Equation and Singular Integral Equations. In: Linear and Nonlinear Integral Equations. Springer, Berlin, Heidelberg) [1].
For the sake of clarity, let's define $\Delta \equiv \theta-\alpha$:
$$\int_{\Delta_0}^{\Delta_1} \cos(\Delta)f( x \cdot \cos(\Delta) ) \, d\Delta = h(x)$$
It is worth pointing out now that in my problem, $\Delta \in (-\pi/2, \pi/2]$. In that case, since the integrand is even in $\Delta$, we can write
$$ 2 \int_{0}^{\pi/2} \cos(\Delta)f( x \cdot \cos(\Delta) ) \, d\Delta = h(x)$$
Let's now make the change of variable $u \equiv x \cos(\Delta)$, in which case $du = -x\sin(\Delta)\,d\Delta$, so $d\Delta = \frac{-1}{x \sqrt{1-(u/x)^2}}\,du = \frac{-du}{\sqrt{x^2-u^2}}$ and
$$ \int_{u=0}^{u=x} \frac{u}{\sqrt{x^2-u^2}}f(u) \, du = \frac{x\,h(x)}{2}$$
If we call $\tilde{f}(u) = u f(u)$ and $\tilde{h}(x) = \frac{x\,h(x)}{2}$ the equation becomes
$$ \int_{u=0}^{u=x} \frac{ \tilde{f}(u)}{\sqrt{x^2-u^2}} du = \tilde{h}(x)$$
which is the generalized Abel integral equation with kernel $K(x,u)=(g(x)-g(u))^{-\nu}$, here with $g(x)=x^2$ and $\nu=1/2$ (note that this $g$ is unrelated to the $g(\alpha)$ of the original equation). The solution is obtained in [1] and is given by
$$ \tilde{f}(u) = \frac{\sin(\nu \pi)}{\pi} \frac{d}{du} \int_{0}^{u} \frac{g^{\prime}(t)\, \tilde{h}(t)}{ (g(u)-g(t))^{1-\nu} } \, dt$$
The conditions are that $0 < \nu < 1$ and that $g(x)$ is strictly monotonically increasing and differentiable within some interval $0 < x < c$, with $g^{\prime}(x) \neq 0$ for every $x$ in the interval. This is the case since $g(x)=x^2$ and $\nu = 1/2$.
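For $g(t)=t^2$, $\nu=1/2$ this inversion formula can also be evaluated numerically: the substitution $t = u\sin\varphi$ removes the square-root singularity from the inner integral, and the outer $d/du$ can be taken by finite differences. A minimal NumPy sketch (the grid, sample counts, and the test pair $f(u)=u \Rightarrow h(x) = \frac{\pi}{2}x$, i.e. $\tilde h(x) = \frac{x\,h(x)}{2} = \frac{\pi}{4}x^2$, are illustrative assumptions, not part of the original problem):

```python
import numpy as np

trapz = getattr(np, "trapezoid", np.trapz)      # NumPy 2.x renamed np.trapz

def abel_invert(h_tilde, u, n_phi=2000):
    """Evaluate f~(u) = (1/pi) d/du  int_0^u 2 t h~(t) / sqrt(u^2 - t^2) dt
    (the inversion formula with g(t) = t^2, nu = 1/2) on the grid u > 0."""
    phi = np.linspace(0.0, np.pi / 2, n_phi)
    # t = u sin(phi) removes the inverse-square-root singularity:
    # int_0^u 2 t h~(t)/sqrt(u^2 - t^2) dt = int_0^{pi/2} 2 u sin(phi) h~(u sin(phi)) dphi
    inner = np.array([trapz(2.0 * ui * np.sin(phi) * h_tilde(ui * np.sin(phi)), phi)
                      for ui in u])
    return np.gradient(inner, u, edge_order=2) / np.pi

# sanity check with f(u) = u: then h(x) = 2 int_0^{pi/2} x cos^2 = (pi/2) x,
# so h~(x) = x h(x)/2 = (pi/4) x^2, and we should recover f~(u) = u f(u) = u^2
u = np.linspace(0.1, 1.0, 201)
f_tilde = abel_invert(lambda t: np.pi * t**2 / 4.0, u)
f = f_tilde / u                                  # f(u) = f~(u)/u
```

The recovered `f` matches the test function $f(u)=u$ to the accuracy of the quadrature and the finite-difference derivative; note that the derivative step amplifies noise, so a noisy $h$ would need smoothing first.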
UPDATE (numerical solution): Following @Botond's suggestion, we can do the following:
$$f(x) = \sum_n a_n x^n$$
and substituting into the integral equation
$$\sum_n a_n x^n \int_{\alpha_0}^{\alpha_1} g(\alpha)^{n+1} d\alpha = h(x)$$
Now we consider a set of points $x_j$ at which $h(x_j)$ is known; then we can write
$$\sum_n a_n x_j^n \int_{\alpha_0}^{\alpha_1} g(\alpha)^{n+1} d\alpha = h(x_j), \quad j=1,...,N$$
or, with implicit summation over the repeated index $n$,
$$ x_j^n b_n a_n = h_j$$
The components of the vector $b$ are computed from $b_n = \int_{\alpha_0}^{\alpha_1} g(\alpha)^{n+1} \, d\alpha$ (to solve the problem as a square system of equations, we need to compute this integral for $n=0,\dots,N-1$, where $N$ is the number of interpolation points $x_j$).
If we define $C_{jn} = x_j^n b_n$, then the vector $A$ with $A_n = a_n$ can be obtained as
$$ A = C^{-1}H $$
where $H$ is the vector with the values $h_j$. (If the integral defining some $b_n$ vanishes, the corresponding column of $C$ is zero and $C$ becomes singular. In this case, since $g(\alpha)=\cos(\theta-\alpha)$, we need to choose the integration limits so that none of these integrals vanish.) Knowing the vector $A$ with the coefficients $a_n$, we can calculate the value of $f$ at any point $x$.
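As a concrete sketch of this linear system (the values $\theta = 0$, the limits $\pm\pi/4$, the points $x_j$, and the test polynomial are illustrative assumptions, not values from the problem), $b_n$ and $C_{jn}$ can be assembled and solved with NumPy:

```python
import numpy as np

trapz = getattr(np, "trapezoid", np.trapz)       # NumPy 2.x renamed np.trapz

# illustrative choices: theta = 0 and limits +-pi/4 keep
# g(alpha) = cos(theta - alpha) positive, so no b_n vanishes
theta, alpha0, alpha1 = 0.0, -np.pi / 4, np.pi / 4
alpha = np.linspace(alpha0, alpha1, 4001)
g = np.cos(theta - alpha)

N = 4                                            # number of coefficients = number of points x_j
x = np.linspace(0.5, 2.0, N)

b = np.array([trapz(g ** (n + 1), alpha) for n in range(N)])   # b_n = int g^(n+1) d alpha
C = x[:, None] ** np.arange(N) * b                             # C_jn = x_j^n b_n

# synthetic h_j from a known f(x) = 1 + 2x - x^2, to check the recovery
a_true = np.array([1.0, 2.0, -1.0, 0.0])
h = np.array([trapz(g * np.polyval(a_true[::-1], g * xi), alpha) for xi in x])

a = np.linalg.solve(C, h)                        # recovers a_true
```

Since the synthetic $h_j$ are built with the same quadrature rule as $b_n$, the solve recovers the test coefficients essentially to machine precision; with real (noisy) data the conditioning of the Vandermonde-like matrix $C$ limits the usable polynomial degree.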
Suppose $x\in\mathbb{R}$, $x \neq 0$, and assume that $g$ is invertible on $[\alpha_0, \alpha_1]$ (for $g(\alpha)=\cos(\theta-\alpha)$ this means restricting $\alpha$ to an interval on which it is monotone). Let us first do a change of variable $u = g(\alpha)x$, so that $du = g'(\alpha)\,x\,d\alpha$ and $\alpha = g^{-1}\left(\frac{u}{x}\right)$:
\begin{align*} \int_{\alpha_0}^{\alpha_1}g(\alpha)f(g(\alpha)x)\,d\alpha &\overset{u = g(\alpha)x}{=} \int_{g(\alpha_0)x}^{g(\alpha_1)x}g\left(g^{-1}\left(\frac{u}{x}\right)\right)\frac{f(u)}{x\,g'\left(g^{-1}\left(\frac{u}{x}\right)\right)}\,du \\ &= \int_{g(\alpha_0)x}^{g(\alpha_1)x}\frac{u}{x}\cdot\frac{f(u)}{x\,g'\left(g^{-1}\left(\frac{u}{x}\right)\right)}\,du \\ &= \int_{g(\alpha_0)x}^{g(\alpha_1)x}\frac{u}{x^2\,g'\left(g^{-1}\left(\frac{u}{x}\right)\right)}\,f(u)\,du \end{align*}
Let's name some variables to ease the notation: $A(x) = g(\alpha_0)x$, $B(x) = g(\alpha_1)x$ and $m(x,u) = \frac{u}{x^2\,g'\left(g^{-1}\left(\frac{u}{x}\right)\right)}$. At this stage, we can use different numerical methods to find $f$.
I propose the following: $f$ can be approximated by a polynomial, i.e. $f(u) = \sum_{i=0}^n a_i u^i$. Therefore, we have:
\begin{align*} \sum_{i=0}^n a_i \underbrace{\int_{A(x)}^{B(x)}m(x,u)\,u^i\,du}_{I(x,i)} = h(x) \end{align*}
As per the post above, we only know $h$ for a given set of points $\{x_k\}_{k\in\mathbb{N}}$. Hence, the above equation can be written in matrix-vector form:
\begin{align*} \begin{pmatrix} I(x_0,0) & \dots & I(x_0,n) \\ I(x_1,0) & \dots & I(x_1,n) \\ \vdots & & \vdots \\ I(x_k,0) & \dots & I(x_k,n) \end{pmatrix}\begin{pmatrix} a_0\\ a_1\\ \vdots\\ a_n \end{pmatrix} =\begin{pmatrix} h(x_0)\\ h(x_1)\\ \vdots\\ h(x_k) \end{pmatrix} \end{align*}
Then, by inverting the matrix (or solving in the least-squares sense when the number of points differs from the number of coefficients), we can find the polynomial weights. Note that you have the choice of numerical integration method to compute $I$.
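The change of variable can be checked numerically. Below is a sketch under the illustrative assumptions $\theta = 0$ and $\alpha\in[\pi/6,\pi/3]$ (not values from the question), where $g(\alpha)=\cos\alpha$ is strictly monotone, so $g^{-1} = \arccos$ and $g'(\alpha) = -\sin\alpha$, giving $m(x,u) = -\frac{u}{x^2\sqrt{1-(u/x)^2}}$:

```python
import numpy as np

trapz = getattr(np, "trapezoid", np.trapz)   # NumPy 2.x renamed np.trapz

# illustrative assumption: theta = 0 and alpha in [pi/6, pi/3], where
# g(alpha) = cos(alpha) is strictly monotone, g^{-1} = arccos, g'(alpha) = -sin(alpha)
a0, a1 = np.pi / 6, np.pi / 3

def m(x, u):
    # m(x, u) = u / (x^2 g'(g^{-1}(u/x))) = -u / (x^2 sqrt(1 - (u/x)^2))
    return -u / (x**2 * np.sqrt(1.0 - (u / x) ** 2))

def lhs_u(x, f, n=4001):
    # int_{A(x)}^{B(x)} m(x,u) f(u) du, i.e. the integral after the change of variable
    u = np.linspace(np.cos(a0) * x, np.cos(a1) * x, n)   # grid from A(x) to B(x)
    return trapz(m(x, u) * f(u), u)

def lhs_alpha(x, f, n=4001):
    # the original integral int g(alpha) f(g(alpha) x) d alpha, for comparison
    alpha = np.linspace(a0, a1, n)
    return trapz(np.cos(alpha) * f(np.cos(alpha) * x), alpha)

# the two forms agree, e.g. for f(u) = u^2:
val_u = lhs_u(1.3, lambda u: u**2)
val_alpha = lhs_alpha(1.3, lambda u: u**2)
```

With `m` in hand, the entries $I(x_k, i)$ can be computed the same way (replace `f(u)` by `u**i`) and the matrix system solved, e.g. with `np.linalg.lstsq` when it is not square.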