Let's say that I have a discrete state space model with delay:
$$\begin{aligned} x(k+1) &= A_d x(k) + B_d u(k-L)\\ y(k) &= C_d x(k) + D_d u(k-L) \end{aligned}$$
Can I instead just remove the delay $L$ from $B_d u(k-L)$ and $D_d u(k-L)$ and simulate without delay?
Then, when I need to plot $y(k)$ against discrete time $t_d(k)$, can I instead just add the delay $L$ to $t_d(k)$, so that it becomes $t_d(k) + L$?
Or is this not correct?
You can only do this if the input $u(k)$ is not a function of the state $x(k)$ or the output $y(k)$. In other words, the input must be a function of time only (open loop); any feedback would break the equivalence, because the shifted simulation would feed back values the delayed system has not yet produced.
Also, in order to get the same transient at the start when using a given initial state $x(0)$, you would have to prepend $L$ zeros to the desired $u(k)$. If your system is stable, its dynamics are fast relative to the simulated time span, and you are not interested in this start-up transient, you can omit the zeros.
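To illustrate, here is a minimal sketch of the equivalence for an open-loop input. The scalar system matrices, delay, and input signal below are made-up values for demonstration, not taken from the question; with zero initial state, the output of the delayed model equals the output of the delay-free model shifted by $L$ samples:

```python
import numpy as np

# Hypothetical scalar system (assumed values for illustration only)
Ad, Bd, Cd, Dd = 0.9, 1.0, 1.0, 0.5
L = 3          # input delay in samples
N = 30         # simulation length

# Open-loop input: a function of time only, no state/output feedback
u = np.sin(0.3 * np.arange(N))

# Reference: simulate the delayed model x(k+1) = Ad x(k) + Bd u(k-L)
x = 0.0
y_delayed = np.zeros(N)
for k in range(N):
    uk = u[k - L] if k >= L else 0.0   # u(k-L); zero before the input arrives
    y_delayed[k] = Cd * x + Dd * uk
    x = Ad * x + Bd * uk

# Shortcut: simulate without delay, then shift the time axis by L
x = 0.0
y_nodelay = np.zeros(N)
for k in range(N):
    y_nodelay[k] = Cd * x + Dd * u[k]
    x = Ad * x + Bd * u[k]

# y_delayed(k) matches y_nodelay(k - L) for k >= L
assert np.allclose(y_delayed[L:], y_nodelay[:-L])
```

The leading-zero handling in the first loop is exactly the "prepend $L$ zeros" step: with $x(0)=0$ the delayed system simply outputs zero for the first $L$ samples, after which the two trajectories coincide sample by sample.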