I have to prove the following two statements, but I don't really know where to start:
(I.D.) admits a unique solution, and if $f,k$ are of class $C^p$ then $y$ is of class $C^{p+1}$, where (I.D.) is the following problem:
Find $y \in C^1(I_0)$ s.t. $\forall t \in I_0, y'(t)=f(t,y(t)) + \int_{t_0}^t k(t,s)y(s)ds$ and $y(t_0)=y_0$ with $y_0$ given in $\mathbb{R}$, $f$ Lipschitz-continuous and $k \in C^0(I_0 \times I_0;\mathbb{R})$. In this case $I_0=[t_0, t_0 + T] \subset \mathbb{R}, T>0.$
I know that for ODEs, if $f$ is Lipschitz, then the problem has a unique global solution, but I'm not sure whether I'm allowed to use that result for this particular integro-differential problem. And isn't it obvious that $f,k \in C^p \Rightarrow y \in C^{p+1}$? How can I write it down in a mathematically correct way?
If anyone could give me a hint, I would really appreciate it.
Thanks in advance!
First of all, notice that the sum of two Lipschitz functions is Lipschitz: in general, if $f$ and $g$ are Lipschitz with constants $L_1$ and $L_2$, then \begin{equation} \|f(x) + g(x) - f(y) - g(y)\| \leq \|f(x) - f(y)\| + \|g(x) - g(y)\| \leq L_1\|x - y\| + L_2\|x - y\| = (L_1 + L_2)\|x - y\|, \end{equation} so $f + g$ is Lipschitz with constant $L_1 + L_2$.
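A quick numerical sanity check of this estimate (the particular $f$ and $g$ below are arbitrary illustrative Lipschitz functions, not the ones from the problem): sampled difference quotients of $f + g$ stay below $L_1 + L_2$.

```python
import numpy as np

# Sample difference quotients of f + g and compare with L1 + L2.
# f = sin is Lipschitz with constant 1, g(x) = 2|x| with constant 2.
rng = np.random.default_rng(0)
f, L1 = np.sin, 1.0
g, L2 = lambda x: 2.0 * np.abs(x), 2.0

x = rng.uniform(-10, 10, 10_000)
y = rng.uniform(-10, 10, 10_000)
mask = x != y  # avoid dividing by zero at coincident samples

quot = np.abs(f(x[mask]) + g(x[mask]) - f(y[mask]) - g(y[mask])) \
       / np.abs(x[mask] - y[mask])
worst = quot.max()  # never exceeds L1 + L2 = 3
```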
I will use this to prove the uniqueness of the solution first. $f$ is Lipschitz in $y$ by hypothesis. For the integral term, note that $k$ is continuous on the compact square $I_0 \times I_0$, hence bounded there, say $|k| \leq M$; so for any two continuous functions $y_1, y_2$ on $I_0$, \begin{equation} \left|\int_{t_0}^t k(t,s)y_1(s)\,ds - \int_{t_0}^t k(t,s)y_2(s)\,ds\right| \leq \int_{t_0}^t |k(t,s)|\,|y_1(s) - y_2(s)|\,ds \leq MT\,\|y_1 - y_2\|_{\infty}, \end{equation} i.e. the integral term is Lipschitz in $y$ as well. (Be careful: differentiable does not imply Lipschitz in general, so the Fundamental Theorem of Calculus alone is not enough here; the bound above is what you actually need.)
Then, as $f(t,y(t)) + \int_{t_0}^t k(t,s)y(s)\,ds$ is the sum of two maps that are Lipschitz in $y$, it is Lipschitz in $y$ by the first result I showed you, and uniqueness follows from the standard Grönwall/Picard argument applied to the integral form of the equation.
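To see the contraction behind uniqueness at work, here is a small Picard-iteration sketch for the integral form $y(t) = y_0 + \int_{t_0}^t \big[f(\tau,y(\tau)) + \int_{t_0}^\tau k(\tau,s)y(s)\,ds\big]d\tau$. The concrete $f$ and $k$ are my own hypothetical choices (smooth, $f$ Lipschitz in $y$), not part of the original problem.

```python
import numpy as np

# Picard iteration for y'(t) = f(t, y(t)) + int_{t0}^t k(t,s) y(s) ds, y(t0) = y0,
# on a uniform grid over I0 = [t0, t0 + T].
t0, T, y0 = 0.0, 1.0, 1.0
n = 201
t = np.linspace(t0, t0 + T, n)
h = t[1] - t[0]

f = lambda s, y: np.sin(s) - y       # Lipschitz in y with constant 1 (illustrative)
k = lambda tt, s: np.cos(tt - s)     # continuous kernel on I0 x I0 (illustrative)

def trap(v):
    """Trapezoid rule on the uniform grid (0 for fewer than 2 points)."""
    return 0.0 if v.size < 2 else h * (v.sum() - 0.5 * (v[0] + v[-1]))

def picard_step(y):
    """One Picard iterate: y_new(t) = y0 + int_{t0}^t [f(tau,y) + inner integral] dtau."""
    inner = np.array([trap(k(t[i], t[:i + 1]) * y[:i + 1]) for i in range(n)])
    rhs = f(t, y) + inner
    return y0 + np.array([trap(rhs[:i + 1]) for i in range(n)])

y = np.full(n, y0)          # start from the constant function y0
gaps = []
for _ in range(8):
    y_new = picard_step(y)
    gaps.append(np.max(np.abs(y_new - y)))  # sup-norm gap between iterates
    y = y_new
# gaps decay roughly like (LT)^n / n!, so the iterates converge to the unique fixed point
```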
Now, $f,k\in\mathcal{C}^{p}\implies y\in\mathcal{C}^{p+1}$ follows directly by bootstrapping (via the Fundamental Theorem of Calculus if you want, writing $y(t) = y_0 + \int_{t_0}^t g(\tau,y(\tau))\,d\tau$ with $g(t,y(t)) = f(t,y(t)) + \int_{t_0}^t k(t,s)y(s)\,ds$): whenever $y \in \mathcal{C}^j$ for some $j \leq p$, the right-hand side of the equation is $\mathcal{C}^j$, so $y' \in \mathcal{C}^j$, i.e. $y \in \mathcal{C}^{j+1}$; starting from $j = 0$ and iterating up to $j = p$ gives $y \in \mathcal{C}^{p+1}$. Moreover, there is an additional important result in ODE theory, the analogue for the dependence on the initial condition $(t_0,x_0)$ of the Cauchy problem. I'll give you the idea, but you will need to look at a book on ODEs or variational equations.
Property
Consider the Cauchy Problem \begin{equation} \begin{cases} \dot{x} = f(t,x)\\ x(t_0) = x_0 \end{cases}, \end{equation} where $f\in\mathcal{C}^r(\Omega)$, $r\geq 1$ and $\Omega$ is the open domain of $f$.
Then the solution $\phi(t;t_0,x_0)$ admits derivatives up to order $r$ with respect to $(t_0,x_0)$, and these derivatives are continuous with respect to $(t;t_0,x_0)$ in $\Omega$.
Proof
Induction on $r$:
$r = 1$: This is the theorem on "differentiability of solutions with respect to initial conditions and parameters"; you can find it in any ODE textbook.
$r>1$
Assume the claim holds from $1$ to $r-1$. Then, as we have \begin{equation} \begin{cases} \dot{x} = f(t,x)\in\mathcal{C}^r(\Omega)\\ x(t_0) = x_0\\ \frac{d}{dt}\frac{\partial \phi}{\partial x_0}(t;t_0,x_0) = D_xf(t,\phi(t;t_0,x_0))\frac{\partial\phi}{\partial x_0}(t;t_0,x_0)\\ \frac{\partial \phi}{\partial x_0}(t_0;t_0,x_0) = \mathrm{Id} \end{cases}, \end{equation} and since $f\in\mathcal{C}^r$ gives $D_xf\in\mathcal{C}^{r-1}$ while the induction hypothesis gives $\phi\in\mathcal{C}^{r-1}$ in $(t_0,x_0)$, the composition $D_xf(t,\phi(t;t_0,x_0))$ is $\mathcal{C}^{r-1}$; hence $\partial \phi / \partial x_0$ is differentiable $r-1$ times with respect to $(t_0,x_0)$, so $\phi$ is $r$ times differentiable with respect to $(t_0,x_0)$. This works because the last two equations form a new Cauchy problem, to which the induction hypothesis applies. The same is done for $t_0$. The last two equations are called the first variational equation with respect to $x_0$; you should look at a reference to see the details (maybe you already know what I am talking about).
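A concrete numerical illustration of the variational equation (a sketch: the right-hand side $f(t,x) = x$ is a hypothetical choice with a known solution $\phi = x_0 e^{t-t_0}$, so here $\partial\phi/\partial x_0 = e^{t-t_0}$): integrating $\dot v = D_xf(t,\phi)\,v$, $v(t_0) = 1$, alongside the ODE recovers $\partial\phi/\partial x_0$, which we check against a central finite difference.

```python
import numpy as np

# Solve the Cauchy problem together with its first variational equation
# w.r.t. x0, using the augmented system z = (x, v):
#   x' = f(t, x),            x(t0) = x0
#   v' = D_x f(t, x) * v,    v(t0) = 1   (so v(t) = d phi / d x0)
f  = lambda t, x: x          # hypothetical right-hand side
Df = lambda t, x: 1.0        # its x-derivative

def rk4(F, z0, t0, t1, steps=1000):
    """Classical fourth-order Runge-Kutta for z' = F(t, z) on [t0, t1]."""
    z, t = np.array(z0, float), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        k1 = F(t, z)
        k2 = F(t + h / 2, z + h / 2 * k1)
        k3 = F(t + h / 2, z + h / 2 * k2)
        k4 = F(t + h,     z + h * k3)
        z = z + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return z

G = lambda t, z: np.array([f(t, z[0]), Df(t, z[0]) * z[1]])

x0, t0, t1 = 2.0, 0.0, 1.0
x_end, v_end = rk4(G, [x0, 1.0], t0, t1)   # v_end ~ d phi / d x0 at t1

# cross-check with a central finite difference in x0
eps = 1e-6
fd = (rk4(f, [x0 + eps], t0, t1)[0]
      - rk4(f, [x0 - eps], t0, t1)[0]) / (2 * eps)
# both agree with the exact value e^{t1 - t0}
```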
Well... that's all! I hope this helps; excuse me for the length of the answer.