I have a differential equation for a function $f$ with respect to a variable $t$, given by $$\begin{align} \frac{\partial f}{\partial t} &= f^2 + g(t). \tag{1} \end{align}$$ The problem is that I do not know whether $f$ depends on another variable, let's call it $x$. I do know that $g$ does not depend on $x$. I also know that $f$ does not depend on $x$ at the initial time $t_0$: $$\begin{align} \frac{\partial}{\partial x} \ f(t_0) &= 0\tag{2},\\ \frac{\partial}{\partial x} \ g(t) &= 0.\tag{3}\\ \end{align}$$ Based on conditions $(2)$ and $(3)$, can I conclude that $$ \frac{\partial}{\partial x}f(t;t_0) = 0, $$ where $f(t;t_0)$ is a solution of $(1)$ with the initial condition $f(t_0)$?
Intuitively I would say so, but I am unable to show it. Or is this wrong, and can I make no statement about the dependence of $f$ on $x$ based on these definitions?
The form of equation $(1)$ is just an example and not of particular interest. I am mainly interested in the claims I make about the dependence of $f$ on the variable $x$.
I think it helps to write everything in terms of $x,t$: you have $\frac{\partial}{\partial t}f(x,t) = f(x,t)^2 + g(t)$ where $\frac{\partial}{\partial x}f(x,t_0) = 0$. Now fix a value of $x$ and consider $f_x := f(x,\cdot)$; then $f_x$ satisfies $f_x'(t) = f_x^2(t) + g(t)$ with $f_x(t_0) = y_0$, where $y_0 := f(x,t_0)$ is the same for all $x$: by $(2)$, $f(\cdot,t_0)$ has vanishing $x$-derivative and is therefore constant (on a connected domain). Therefore, if we can show that the initial value problem $y'(t) = y(t)^2 + g(t)$ with $y(t_0) = y_0$ has a unique solution on a domain $[t_0-\alpha,t_0+\alpha]$, then $f_{x_1}(t) = f_{x_2}(t)$ on that interval for any $x_1, x_2$, and so $f$ is in fact independent of $x$ for $|t-t_0| \leq \alpha$.
To make this more concrete, if we choose $g(t) = 0$ then we are left to analyze the initial value problem $y' = y^2$ with $y(t_0) = y_0$. This has a unique smooth solution $y = \frac{y_0}{1 + y_0(t_0-t)}$, defined at least on the interval $|t-t_0| < \frac{1}{|y_0|}$ when $y_0 \neq 0$ (and globally, as $y \equiv 0$, when $y_0 = 0$), so you can conclude that $\frac{\partial}{\partial x}f = 0$ for those values of $t$. Theorems like Picard–Lindelöf yield statements of this kind for general $g(t)$, as long as $g$ is continuous.
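As a sanity check, the closed-form solution above can be verified symbolically. This is a minimal sketch using SymPy; the symbol names `t0`, `y0` are just illustrative:

```python
# Verify symbolically that y(t) = y0 / (1 + y0*(t0 - t))
# solves y' = y^2 with y(t0) = y0.
import sympy as sp

t, t0, y0 = sp.symbols("t t0 y0")
y = y0 / (1 + y0 * (t0 - t))

residual = sp.simplify(sp.diff(y, t) - y**2)  # ODE residual, should be 0
initial = sp.simplify(y.subs(t, t0) - y0)     # initial-condition check, should be 0
print(residual, initial)                      # prints: 0 0
```

The same pattern (build the candidate solution, simplify the residual) works for any explicit solution you want to double-check.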
To see why this might break down, consider the modified problem $\frac{\partial}{\partial t}f(x,t) = 3f(x,t)^{2/3} + g(t)$ where for clarity we specify $g(t) = 0$, $t_0 = y_0 = 0$. It is well known that the IVP $y' = 3y^{2/3}$, $y(0) = 0$ has infinitely many solutions given by $$ y(t) = \begin{cases} 0 & \text{ for } t \leq C, \\ (t - C)^3 & \text{ for } t > C \end{cases} $$ where $C \geq 0$ is any nonnegative constant (together with the trivial solution $y \equiv 0$). From this we can see that the function $$ f(x,t) = \begin{cases} 0 & \text{ for } t \leq x^2, \\ (t - x^2)^3 & \text{ for } t > x^2 \end{cases} $$ satisfies both the equation $\frac{\partial}{\partial t}f(x,t) = 3f(x,t)^{2/3}$ everywhere in $\mathbb{R}^2$ and $f(x,0) = 0$ for all $x$, but it is not independent of $x$ everywhere on its domain.
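A quick numerical sketch confirms that every member of this family solves the ODE while sharing the same initial value at $t = 0$; the sample values of $C$ and $t$ are illustrative, and the derivative is approximated by a central finite difference:

```python
# Check that y_C(t) = 0 for t <= C, (t - C)^3 for t > C  (C >= 0)
# satisfies y' = 3*y^(2/3) at sample points, with y_C(0) = 0 for all C.

def y(t, C):
    return 0.0 if t <= C else (t - C) ** 3

def dy(t, C, h=1e-6):
    # central finite-difference approximation of y'
    return (y(t + h, C) - y(t - h, C)) / (2 * h)

for C in (0.0, 0.5, 2.0):
    assert y(0.0, C) == 0.0                 # all members share the initial value
    for t in (-1.0, C + 1.0, C + 3.0):
        rhs = 3 * y(t, C) ** (2 / 3)
        assert abs(dy(t, C) - rhs) < 1e-4   # ODE holds at each sample point
print("non-uniqueness family verified")
```

This is exactly the failure of the Lipschitz condition at $y = 0$ that Picard–Lindelöf guards against: $y \mapsto 3y^{2/3}$ has unbounded slope there.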
Thus, a conclusion like the one you propose relies heavily on the existence and uniqueness theory for the corresponding ODE. Hopefully that clarifies things.