I am attempting to numerically simulate a physical model for which I have derived the equations of motion using the Lagrangian method. I'm not sure whether this question is better suited for Physics Stack Exchange, but since it concerns systems of equations and numerical methods, I preferred to start here. The system of equations looks something like this:
$$ \sin(\beta_2(t))\ddot{\gamma}(t)+\sin(\beta_1(t))\dot{\beta}_1(t)\dot{r}(t)-l_y\ddot{r}(t) = 0$$
$$ \cos(\beta_1(t))\ddot{\beta}_1(t)-\dot{\beta}_1(t)\ddot{r}(t)-l_y\ddot{\theta}(t) = 0$$
I've shown a simplified version because the original equations are lengthy. The variables involved are $\theta, \beta_1, \beta_2, \gamma$ and $r$. The system contains first and second time derivatives, coupling between the variables, and products involving sines and cosines of those variables.
In class, I have covered Runge-Kutta methods for solving differential equations of the form:
$$ \dot{\mathbf{y}} = \mathbf{f}(t,\mathbf{y})$$
$$ \mathbf{y}(t_0) = \mathbf{y}_0 $$
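To be concrete about what I mean by a Runge-Kutta method for this standard form, here is a minimal classical RK4 step (this is just the textbook scheme applied to a toy equation $\dot{y} = -y$, not my actual system):

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Toy example: y' = -y, y(0) = 1, whose exact solution is exp(-t).
f = lambda t, y: -y
t, y, h = 0.0, np.array([1.0]), 0.01
for _ in range(100):
    y = rk4_step(f, t, y, h)
    t += h
print(y[0])  # should be close to exp(-1) ≈ 0.367879
```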
In control-system problems, I have indeed managed to reduce second-order differential equations to first order by introducing additional state variables. In those cases, however, there were no products between the variables themselves. If possible, I would also like to end up with the system in the standard first-order form.
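For reference, the kind of reduction I used in those cases looks like this, sketched on a toy pendulum equation $\ddot{\theta} = -\sin(\theta)$ (a hypothetical stand-in, not the actual system above): introduce $\omega = \dot{\theta}$ so the state vector is $\mathbf{y} = [\theta, \omega]$.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    """First-order form of theta'' = -sin(theta):
    state y = [theta, omega], with omega = theta'."""
    theta, omega = y
    return [omega, -np.sin(theta)]

# Integrate from theta(0) = 0.5, omega(0) = 0 over t in [0, 10].
sol = solve_ivp(rhs, (0.0, 10.0), [0.5, 0.0], rtol=1e-8, atol=1e-10)
print(sol.y[:, -1])  # [theta(10), omega(10)]
```

This works because the equation can be solved explicitly for the highest derivative; my difficulty is that in my system the second derivatives appear multiplied by functions of the variables and coupled across equations.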
For now, all I want is to simulate the nonlinear system. I plan to linearize it later, but at the moment I am unsure how to translate these equations into the standard ODE form.