I have a question about the result of my numerical differentiation of $f(x) = \sin(x)$ with Gaussian noise. The question goes as follows, and the program used is MATLAB:
The area causing my confusion is that I am not sure whether to define the function as (1) y = 0.01*randn + sin(x) or as (2) y = (1+0.01*randn).*sin(x). I apologize for such a trivial question, but I am slightly confused by my results, and the textbook I am using runs a similar problem with the noise defined as in the latter case.
So it would be nice if someone could confirm this for me. The plots for the two cases, using both (1) and (2), are shown below. For (1), the plots are as follows:
For the latter case, y = (1+0.01*randn).*sin(x):
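For concreteness, here is a minimal sketch of the experiment in Python/NumPy (the original is in MATLAB; the step size h = 0.01, the interval, and the random seed are my own choices, not part of the exercise):

```python
import numpy as np

h = 0.01                      # step size (my choice; not from the exercise)
x = np.arange(0.0, 2 * np.pi, h)
rng = np.random.default_rng(0)

y0 = np.sin(x)                                             # noise-free signal
y1 = 0.01 * rng.standard_normal(x.size) + np.sin(x)        # (1) additive noise
y2 = (1 + 0.01 * rng.standard_normal(x.size)) * np.sin(x)  # (2) multiplicative noise

def centered_diff(y, h):
    """Centered difference (y[i+1] - y[i-1]) / (2h) at the interior points."""
    return (y[2:] - y[:-2]) / (2 * h)

exact = np.cos(x[1:-1])  # exact derivative at the interior points
err0 = np.max(np.abs(centered_diff(y0, h) - exact))
err1 = np.max(np.abs(centered_diff(y1, h) - exact))
err2 = np.max(np.abs(centered_diff(y2, h) - exact))
print(err0, err1, err2)
```

Without noise the error is tiny (about $h^2/6 \approx 2\cdot 10^{-5}$); with either kind of noise the derivative estimate is off by orders of magnitude more, which is the behaviour the plots show.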

So, in terms of adding some "noise" to the function, would the typical convention be to multiply the noise by the function itself, or should one just follow the exact definition (which is addition in this case)? Secondly, do such results imply that numerical differentiation with any "noise" (which I believe is typical of most real-life data) is quite useless?
Thank you.
As you describe it, the exercise wants you to do (1), although I think it is not important which kind of noise you use: the goal of the exercise is probably to show you that the centered difference formula fails in the presence of noise.
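To make that quantitative (a standard error estimate, not part of the exercise): if the noisy samples are $\tilde f(x \pm h) = f(x \pm h) + e_\pm$ with $|e_\pm| \le \delta$, the centered difference error splits into a truncation part and a noise part,

$$\left|\frac{\tilde f(x+h)-\tilde f(x-h)}{2h} - f'(x)\right| \le \frac{h^2}{6}\max|f'''| + \frac{\delta}{h}.$$

The first term shrinks as $h \to 0$ but the second blows up, so the total error is minimized around $h \sim \delta^{1/3}$, giving a best achievable error of order $\delta^{2/3}$. With $\delta = 0.01$ that is a few percent: differentiating noisy data is ill-conditioned, but not useless, as long as $h$ is not taken too small.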
In general, "adding noise" means any way of creating noisy data from noise-free data. This does not have to be addition; it could be multiplication or something else, too. Usually one adds noise to model some real problem: if the error in the measurement is proportional to the measurement, one multiplies; if it is independent of it, one adds. Of course, one can imagine many other, more complicated, ways to "add noise", and some are indeed used, but these two are the most important.
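A small illustration of that distinction (Python/NumPy; the measurement values are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
signal = np.array([0.001, 1.0, 1000.0])  # measurements of very different sizes
sigma = 0.01                             # noise level, as in the exercise

# 10000 noisy realizations of each measurement under the two models
additive = signal + sigma * rng.standard_normal((10000, 3))
multiplicative = signal * (1 + sigma * rng.standard_normal((10000, 3)))

# additive noise: the absolute error is the same for every measurement
print(np.std(additive - signal, axis=0))        # ~ [0.01, 0.01, 0.01]
# multiplicative noise: the absolute error is proportional to the measurement
print(np.std(multiplicative - signal, axis=0))  # ~ [0.00001, 0.01, 10]
```

So model (1) says the measurement device has a fixed absolute accuracy, while model (2) says it has a fixed relative accuracy of 1%.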