How do I know what the error is in measurements I took using an oscilloscope? In the image below, you can see a pattern on channel #1 of the oscilloscope (yellow). I manually measured the time ($\Delta t$) it took to fall from the maximum to the minimum point. I used a selection tool that comes with the oscilloscope, but its accuracy depends on where the user places the selection points.
How do I estimate my error?

The time duration you're looking for appears to be the same as the duration that the 4th signal is LOW. You can use the cursor to see how much time it takes to go from HIGH to LOW on the 4th channel to find your error bounds. For example, if your cursor moves in 0.02 ms increments, and you find that the signal is still HIGH at time $t$ and already LOW at time $t + 0.04$ ms, then the drop started somewhere in that 0.04 ms window. Thus, your error for the initial time is at most 0.04 ms. The same reasoning applies to the end point of the interval, so in the worst case the uncertainty in $\Delta t$ is the sum of the two endpoint uncertainties.
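The bracketing argument above can be sketched in a few lines of Python. The numbers here are hypothetical stand-ins (not read from the scope in the question), just to show how the endpoint windows combine into a worst-case bound on $\Delta t$:

```python
def edge_window(t_high, t_low):
    """Width of the window in which a transition must have occurred.

    t_high: last cursor position where the signal still read HIGH
    t_low:  first cursor position where the signal read LOW
    """
    return t_low - t_high

# Hypothetical cursor readings, stepping in 0.02 ms increments:
start_err = edge_window(t_high=1.00, t_low=1.04)  # falling edge bracketed to 0.04 ms
end_err = edge_window(t_high=3.50, t_low=3.54)    # rising edge bracketed to 0.04 ms

# Worst case, the two endpoint errors add, so delta-t is known
# only to within their sum:
total_err = start_err + end_err
print(f"delta-t uncertainty window: {total_err:.2f} ms")
```

If you prefer a statistical (rather than worst-case) estimate, the two endpoint errors are independent, so you could also combine them in quadrature; for a quick bound, the simple sum is the safer choice.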