I must reconstruct the input signal to a system, knowing the output signal and the system transfer function. At the end, I found that the Laplace-Transform of the input signal is something like: $$ s^2\sum_{k=0}^\infty c_k e^{-skT} $$ The coefficients $c_k$ are known (from the output signal) and T is a known constant.
The sum $ \sum_{k=0}^\infty c_k e^{-skT} $ corresponds to a train of Dirac deltas, and that part is fine. The factor $s^2$ means those deltas must be differentiated twice. The Dirac delta is a distribution and, given a test function $f(t)$, we have $\langle \delta'', f \rangle = f''(0)$. But how do I interpret this for my input signal?
The only idea I have is to consider the delta as the limit of a Gaussian function, and the delta's second derivative as the limit of the Gaussian's second derivative. But I've never seen this done anywhere, so I'm skeptical.
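For what it's worth, the Gaussian-limit idea can be checked numerically: convolving a smooth signal with the second derivative of a narrow Gaussian should approximate the signal's second derivative. This is only an illustrative sketch; the signal, the width `sigma`, and all other values below are arbitrary choices, not taken from the question.

```python
import numpy as np

dt = 1e-3
t = np.arange(-5, 5, dt)

def gauss_dd(t, sigma):
    # Second derivative of a unit-area Gaussian of width sigma
    g = np.exp(-t**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return g * (t**2 - sigma**2) / sigma**4

x = np.sin(t)                            # smooth test signal, x'' = -sin(t)
h = gauss_dd(t, sigma=0.05)              # narrow approximation of delta''
y = np.convolve(x, h, mode="same") * dt  # approximate x''(t)

# Compare with the exact second derivative away from the edges
mid = slice(1000, len(t) - 1000)
err = np.max(np.abs(y[mid] - (-np.sin(t[mid]))))
print(err)
```

As `sigma` shrinks, `err` shrinks too (roughly like $\sigma^2$), which is the limit statement made concrete.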
You are right that the inverse Laplace transform of $F(s)=a\, s^2 e^{-s b}$ is $f(t)=a\, \delta''(t-b)$, the second derivative of the Dirac delta, shifted to $b$ and weighted by $a$ (related). I'm not sure what you expect when you ask to "interpret this for my input signal". Even plain Dirac deltas are difficult to interpret as input signals; how to make sense of them (probably as limits) depends on your scenario.
It's a little easier (and, I think, more common) to interpret these kinds of objects as transformations of signals rather than as signals themselves (say, as impulse responses of LTI filters). Then, if we have $y(t) = x(t) \star h(t)$ (convolution) and the filter impulse response is a Dirac delta, $h(t)=a\, \delta(t-b)$, that is easy to interpret: "what this filter does is delay the input by $b$ and weight it by $a$".
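A discrete-time sketch of that reading (all values below are illustrative): convolving with a single weighted, shifted sample — the discrete analogue of $a\,\delta(t-b)$ — does nothing but delay and scale the input.

```python
import numpy as np

a, b = 2.0, 3                 # weight and delay in samples (illustrative)
x = np.array([1.0, 4.0, 9.0, 16.0, 25.0])

h = np.zeros(b + 1)
h[b] = a                      # Kronecker delta at index b, weighted by a

y = np.convolve(x, h)         # output: x delayed by b samples, times a
print(y[b:b + len(x)])        # → a * x
```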
In the same vein, if we have $h(t)=a\, \delta''(t-b)$, we don't try to interpret this strange "function" as a graph (good luck with that), or even as a limit of nicer functions, but simply in operational terms: $h(t)\equiv$ "differentiate the input twice, delay it by $b$, and weight it by $a$".
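The operational reading can also be sketched in discrete time, where a natural stand-in for $\delta''$ is the second-difference kernel $[1,-2,1]/\Delta t^2$. Again, the input and the values of `a`, `b`, and `dt` below are illustrative assumptions, not part of the answer's setup.

```python
import numpy as np

dt = 1e-3
n = np.arange(0, 1000)
t = n * dt
x = t**2                      # input with known second derivative x'' = 2

a, b = 0.5, 5                 # weight and delay in samples (illustrative)
h = np.zeros(b + 3)
h[b:b + 3] = a * np.array([1.0, -2.0, 1.0]) / dt**2   # a * delta'' at lag b

y = np.convolve(x, h)
# Away from the edges, y[k] ≈ a * x'' delayed by b samples, i.e. ≈ 2a
print(y[b + 10])
```

So the filter output is just the (twice-differentiated, delayed, weighted) input, exactly the operational description above.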