How can one show that, for small enough $x>0$, the following function is decreasing in $x$: \begin{align} g(x)=\frac{1}{x}f \left( \frac{(1-x)A+xB}{ (1-x)C+xD} \right) \end{align} where $f>0$ is strictly convex and $A,B,C,D$ are real non-zero constants?
When I do a Laurent series expansion of the derivative of $g(x)$ around $x=0$, I get \begin{align} g^\prime(x)= -\frac{f(A/C)}{x^2} + c+O(x) \end{align} where $c$ is some constant (see here).
Clearly, for small enough $x$ this is negative, since $f(A/C)>0$ by the positivity of $f$.
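As a sanity check (not part of a proof), the Laurent expansion can be verified symbolically with SymPy for a sample choice of $f$ and of the constants; here $f(t)=e^t$ and $A,B,C,D = 2,3,1,5$ are illustrative assumptions, not values from the question:

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Illustrative constants and a strictly convex, positive sample f
A, B, C, D = 2, 3, 1, 5
f = sp.exp

h = ((1 - x)*A + x*B) / ((1 - x)*C + x*D)
g = f(h) / x
gp = sp.diff(g, x)

# Laurent expansion of g'(x) around x = 0
ser = sp.series(gp, x, 0, 1)
print(ser)

# The leading term should be -f(A/C)/x^2
lead = ser.removeO().as_leading_term(x)
print(sp.simplify(lead + f(sp.Rational(A, C)) / x**2))  # expect 0
```

The same check works for any other smooth positive $f$ by swapping out `sp.exp`.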
Can someone show me a simpler or different proof?
Well, if $f>0$ then $\lim_{x\to 0^+}g(x) = +\infty$, so the only way your proposal would fail is if $g$ wiggled near $0^+$, which would mean $g'=0$ infinitely often near $0^+$. Letting
$$h(x) = \frac{(1-x)A+xB}{(1-x)C+xD}$$
we get that
$$h'(x) = \frac{BC-AD}{\big((1-x)C+xD\big)^2},$$
so $h'$ has fixed sign, but more importantly is continuous and bounded near $x=0$. Then
$$g'(x) = -\frac1{x^2}f(h(x)) + \frac1x f'(h(x))\cdot h'(x),$$
so that $g'(x)=0$ for $x>0$ if and only if
$$f'(h(x))\cdot h'(x) = \frac{f(h(x))}{x}.$$
As $x\to 0^+$, the RHS tends to $+\infty$ while the LHS remains bounded, so the equality cannot hold for $x$ close enough to $0$. Therefore $g$ does not wiggle near $0^+$, and hence must be decreasing near $0^+$.
Notice that we did not use $f$'s strict convexity; we used only that $f$ is continuously differentiable and positive near $h(0)$. Of course, we need $C\neq 0$ in order for $h(0)$ and $h'(0)$ to be well defined, but otherwise no conditions on the constants are needed.
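A quick numeric illustration of the mechanism above: for a sample $f(t)=e^t$ and illustrative constants $A,B,C,D = 2,3,1,5$ (assumptions, not given in the question), the LHS $f'(h(x))\,h'(x)$ stays bounded as $x\to 0^+$ while the RHS $f(h(x))/x$ blows up, so $g'(x)<0$ for small $x$:

```python
import math

# Illustrative constants with C != 0, and f(t) = exp(t), so f' = f
A, B, C, D = 2.0, 3.0, 1.0, 5.0
f = math.exp
fp = math.exp

def h(x):
    return ((1 - x)*A + x*B) / ((1 - x)*C + x*D)

def hp(x):
    # h'(x) = (BC - AD) / ((1-x)C + xD)^2, as derived above
    return (B*C - A*D) / ((1 - x)*C + x*D)**2

for x in (0.1, 0.01, 0.001):
    lhs = fp(h(x)) * hp(x)   # stays bounded as x -> 0+
    rhs = f(h(x)) / x        # grows like f(A/C)/x
    print(f"x={x}: LHS={lhs:.3f}, RHS={rhs:.1f}")
```

Shrinking $x$ further only widens the gap, matching the argument that $g'(x)=0$ cannot occur near $0^+$.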