I am trying to solve the following numerical analysis problem and I need some experts' feedback on my assumptions.
I have the following statement:
Consider $F:R^n \times [0,T] \rightarrow R^n$, a system of differential-algebraic equations: $F(x(t),t)=I(x)+\dot{Q}(x)+S(t)$,
where $x(t)$ is the solution of $F(x(t),t)=0$ for $t\in [0,T]$. $I$ and $Q$ are nonlinear algebraic functions of $x$, and $S$ is an explicit function of $t$ (the stimuli of the system).
Additional information:
$\dot{Q}$ is the time derivative of $Q$.
$F$, $x$, $I$, $Q$, and $S$ all take values in $R^n$.
$n$, the size of the system (the number of equations in $F$, or equivalently the number of unknowns in $x$), can be very large (up to several million).
$I$ and $Q$ are $C^1$ functions (differentiable with continuous derivatives).
We assume the system has a unique solution.
$F$ is a stiff system (a large ratio between its time constants).
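For concreteness, here is a toy $n=1$ instance of this form (the particular $I$, $Q$, and $S$ below are my own example, not part of the original statement):
$$I(x) = x^3, \quad Q(x) = x, \quad S(t) = -\sin(t),$$
so that $F(x(t),t) = x^3 + \dot{x} - \sin(t) = 0$, i.e. the scalar equation $\dot{x} = \sin(t) - x^3$.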
Propose a method to compute a numerical approximation of the solution $x(t)$ of this system $F(x(t), t)=0$ for $t\in [0,T]$.
In order to solve the problem, I am working from the following assumptions:
This is a nonlinear, multivariable, first-order ODE.
Is that assumption correct? I am actually not sure whether it is an ODE or a PDE: there are two variables, $x$ and $t$, but only one derivative with respect to $t$, namely $\dot{Q}$.
If it is an ODE, I assume this problem should be solved using Newton's method or a fixed-point method, right?
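To make concrete what I have in mind with Newton's method, here is a minimal sketch: implicit (backward) Euler in time, with Newton's method solving the nonlinear equation at each time step. I use the toy scalar case $I(x)=x^3$, $Q(x)=x$, $S(t)=-\sin(t)$ (my own example choices, not from the statement), so $F = I(x) + \dot{Q}(x) + S(t) = 0$ becomes $\dot{x} = \sin(t) - x^3$:

```python
import numpy as np

def I_func(x):  return x**3          # algebraic part I(x)
def dI_dx(x):   return 3.0 * x**2
def Q_func(x):  return x             # differentiated part Q(x)
def dQ_dx(x):   return 1.0
def S_func(t):  return -np.sin(t)    # stimulus S(t)

def backward_euler(x0, T, num_steps, newton_tol=1e-10, newton_max=50):
    """Solve I(x) + d/dt Q(x) + S(t) = 0 with backward Euler + Newton (n = 1)."""
    dt = T / num_steps
    ts = np.linspace(0.0, T, num_steps + 1)
    xs = np.empty(num_steps + 1)
    xs[0] = x0
    for k in range(num_steps):
        t_new = ts[k + 1]
        x = xs[k]                    # initial Newton guess: previous value
        for _ in range(newton_max):
            # Residual of the time-discretized equation:
            #   G(x) = I(x) + (Q(x) - Q(x_k)) / dt + S(t_{k+1})
            G = I_func(x) + (Q_func(x) - Q_func(xs[k])) / dt + S_func(t_new)
            dG = dI_dx(x) + dQ_dx(x) / dt    # Jacobian of G (scalar here)
            step = G / dG
            x -= step
            if abs(step) < newton_tol:
                break
        xs[k + 1] = x
    return ts, xs

ts, xs = backward_euler(x0=0.0, T=5.0, num_steps=500)
```

For the real problem with $n$ in the millions, I imagine the scalar division `G / dG` would become a sparse linear solve with the Jacobian matrix, but the overall structure would stay the same. Is this the right direction?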
I would very much appreciate a comprehensive answer on how to solve such a problem.
Thanks a bunch