I have a video in grayscale, and I would like to find out whether the intensity of a pixel at a given location is more or less constant over time. I have been told here that I can first use linear regression to fit the model $y = at + b$, and then use a statistical hypothesis test to decide whether the null hypothesis $a = 0$ should be rejected. If it is rejected, then the pixel intensity is not constant over time.
I believe that during the test I can safely assume a normal distribution for the estimate of $a$. However, while its mean is $0$ (under the null hypothesis), what should its standard deviation be? Is there a general method to determine it, or do I need additional knowledge?
See the numerical example of simple linear regression at this website (https://en.wikipedia.org/wiki/Simple_linear_regression) and follow the procedure exactly, where your $t$'s are their $x$'s and your $a$ in the model $y = at + b$ is their $\beta$. The standard deviation you are asking about is the standard error of the slope estimate, computed from the data as
$$\operatorname{se}(\hat\beta) = \sqrt{\frac{\frac{1}{n-2}\sum_{i=1}^{n}\hat\varepsilon_i^2}{\sum_{i=1}^{n}(x_i-\bar x)^2}},$$
where $\hat\varepsilon_i$ are the residuals of the fit. Form the confidence interval $\hat\beta \pm t^*_{n-2}\,\operatorname{se}(\hat\beta)$ and check whether it contains $0$. If it contains $0$, do not reject the hypothesis that the pixel intensity is constant over time; if it does not contain $0$, conclude that the pixel intensity is not constant over time.
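This procedure can be sketched in a few lines of Python: `scipy.stats.linregress` fits the line and directly reports the two-sided p-value for the null hypothesis that the slope is zero, which is equivalent to checking whether the confidence interval contains $0$. The function name `pixel_trend_test` and the simulated pixel series are my own illustrative choices, not part of any established API for this problem.

```python
import numpy as np
from scipy import stats

def pixel_trend_test(intensities, alpha=0.05):
    """Test H0: a = 0 in the model intensity = a*t + b over frame index t.

    Returns (slope, p_value, constant), where `constant` is True when
    H0 is not rejected at significance level `alpha`.
    """
    t = np.arange(len(intensities))
    res = stats.linregress(t, intensities)
    # linregress computes the t-statistic slope/stderr with n - 2 degrees
    # of freedom and reports the two-sided p-value for H0: slope == 0.
    return res.slope, res.pvalue, res.pvalue >= alpha

# Simulated pixel time series (hypothetical data for illustration):
rng = np.random.default_rng(0)
flat = 128 + rng.normal(0, 2, size=200)                       # roughly constant
drift = 128 + 0.5 * np.arange(200) + rng.normal(0, 2, size=200)  # clear trend

print(pixel_trend_test(flat))   # typically a large p-value -> constant
print(pixel_trend_test(drift))  # a tiny p-value -> not constant
```

Note that with many pixels you are running many tests, so you may want to control the false-positive rate (e.g. with a Bonferroni or Benjamini-Hochberg correction) rather than using `alpha = 0.05` per pixel.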