I am looking for ways to calculate signal-to-noise ratio (SNR). As I understand it, this measure is typically used when you have the clean signal and the noise as separate channels, so you can measure the power of each. The formula is $$ \text{SNR} = \frac{P_{\text{signal}}}{P_{\text{noise}}}. $$ This is straightforward when the clean signal and the noise are available separately.
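For concreteness, here is a minimal sketch of that definition in Python/NumPy (in dB, as is common), using a synthetic sine plus Gaussian noise; the function name and test signal are just illustrative:

```python
import numpy as np

def snr_db(signal, noise):
    """SNR in dB from a clean signal and its separately known noise."""
    p_signal = np.mean(np.asarray(signal, dtype=float) ** 2)
    p_noise = np.mean(np.asarray(noise, dtype=float) ** 2)
    return 10.0 * np.log10(p_signal / p_noise)

# Example: unit-amplitude 5 Hz sine (power 0.5) plus noise of power ~0.01,
# so the SNR should come out near 10*log10(0.5/0.01) ~ 17 dB.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noise = 0.1 * rng.standard_normal(t.size)
print(snr_db(clean, noise))
```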
What I am after is the case where I do not have this separation, only a noisy signal. What I do have is a recording of the actual noise. So basically two signals:
- The main signal, i.e. a noisy signal which I want to denoise
- A signal made up of only noise (coming from lab equipment etc.)
Is there a good way to calculate a noise measure similar to the SNR? I am denoising with wavelets, and with this data I am having a hard time evaluating whether the denoising was successful. I am also selecting the best denoising result based on SNR, which at the moment is pretty unsuccessful.
For simulated data this works great using the SNR definition in the formula above, because there I do have the two separate channels.
Hopefully I have explained it well enough to be understandable. :)
Any help is appreciated.
(extended comment)
You are effectively dealing with a parameter estimation problem: estimate the parameter "SNR" given the observed (noisy) signal. Beyond that, nothing more can be said from your description.

A decent estimator would take into account aspects such as the (main) signal's properties and the noise distribution, if available. It seems that you have this information to some extent (you claim you can obtain a denoised version of the signal), which suggests an SNR estimator of the form $$ \widehat{\text{SNR}} = \frac{P_{\text{denoised}}}{P_{\text{noisy}\,-\,\text{denoised}}}, $$ i.e. treat the denoised output as the signal estimate and the residual (noisy minus denoised) as the noise estimate.

How good this estimate is can only be described statistically, and you would need some signal and noise model for simulation and/or analytical work.
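A minimal sketch of that estimator, with the caveat that it is only as good as the denoiser: here a simple moving-average filter stands in for the wavelet denoiser in the question, and the sine-plus-noise test signal is an assumption for illustration.

```python
import numpy as np

def snr_est_db(noisy, denoised):
    """Estimated SNR in dB: treat the denoised output as the signal
    and the residual (noisy - denoised) as the noise."""
    noisy = np.asarray(noisy, dtype=float)
    denoised = np.asarray(denoised, dtype=float)
    residual = noisy - denoised
    p_signal = np.mean(denoised ** 2)
    p_noise = np.mean(residual ** 2)
    return 10.0 * np.log10(p_signal / p_noise)

# Stand-in denoiser: an 11-tap moving average (a wavelet denoiser
# would be used in its place in the question's setup).
def moving_average(x, n=11):
    return np.convolve(x, np.ones(n) / n, mode="same")

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.1 * rng.standard_normal(t.size)
print(snr_est_db(noisy, moving_average(noisy)))
```

Note that the estimate is biased: any signal the denoiser removes inflates the residual, and any noise it passes through inflates the "signal" power, which is exactly why the answer says its quality can only be judged against a signal/noise model.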