I'm working on a project in which I need to calculate some noise parameters of a signal, in order to track how the system noise changes over time: in short, the user should measure these parameters every week to see whether the noise has changed.
The signal is in volts. During the noise measurement the output should be a constant value (i.e. for the whole measurement interval I read a constant level, say 100 mV, plus noise on the order of 0.1 mV). This signal is sampled by a 24-bit A/D converter, and then I receive the samples in my code.
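As a sanity check on the setup, it may be worth confirming that the ADC resolution is fine enough to resolve noise at the 0.1 mV level. The full-scale input range below is an assumption (it isn't stated in the question), so substitute your converter's actual range:

```python
# Quick check: can a 24-bit ADC resolve ~0.1 mV of noise?
# The full-scale range below is a hypothetical value, not from the question.
full_scale_V = 5.0            # assumed ADC input range
lsb_V = full_scale_V / 2**24  # voltage represented by one code step
print(f"1 LSB = {lsb_V * 1e9:.1f} nV")  # prints "1 LSB = 298.0 nV"
```

With an LSB of roughly 0.3 µV, quantization is well below the 0.1 mV noise floor, so the measured statistics should reflect the system noise rather than the converter.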
My problem is that I do not know exactly which parameters I should calculate in order to give the user an idea of the increase or decrease of noise in the system.
My first thought was a simple peak-to-peak difference over the measurement interval, together with the average value of the signal. But which other parameters could I calculate? S/N would make no sense, because there is no signal as such, just a constant value.
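For a nominally constant level, the usual figures are the mean (the DC level), the peak-to-peak excursion, and the RMS noise, which for a zero-signal measurement is simply the standard deviation of the samples. A minimal sketch using NumPy (the function name and the synthetic test signal are mine, just for illustration):

```python
import numpy as np

def noise_metrics(samples):
    """Basic noise figures for a nominally constant signal, in volts.

    `samples` is the 1-D array of readings from one measurement interval.
    """
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean()                  # DC level, e.g. ~100 mV
    pk_pk = samples.max() - samples.min()  # peak-to-peak excursion
    rms = samples.std(ddof=1)              # RMS noise = sample standard deviation
    return {"mean_V": mean, "pk_pk_V": pk_pk, "rms_V": rms}

# Illustrative input: 100 mV level with ~0.1 mV of Gaussian noise
rng = np.random.default_rng(0)
sig = 0.100 + rng.normal(0.0, 0.0001, 10000)
print(noise_metrics(sig))
```

Logging the RMS value week by week is often more robust than peak-to-peak, since peak-to-peak grows with the number of samples for Gaussian noise, while the standard deviation converges to a stable figure.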
Any suggestions?
Thanks in advance for your replies.