Back in the day, you'd measure noise with a true-RMS meter and a specified bandpass, and that was that.
Now, with DSOs, it's nominally the same. RMS is computed by sampling and integration, straight from the definition. And the bandwidth is Fs/2 ... or is it? After all, sampling aliases everything above Fs/2 back down, mixing all frequencies together; that's true whether you're looking at an oversampled or equivalent-time acquisition (but not with "high res" mode, which is digital antialiasing).
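Here's a rough numpy sketch of that folding argument (arbitrary numbers, not modeled on any particular scope): band-limit white noise to stand in for a 100 MHz front end, then subsample it at various rates, with and without a boxcar average standing in for "high res" mode.

import numpy as np
from scipy.signal import butter, lfilter

rng = np.random.default_rng(0)
fs_sim = 1e9                   # simulation rate, standing in for "continuous" time
f_analog = 100e6               # assumed analog bandwidth of the front end
n = 1_000_000

b, a = butter(4, f_analog / (fs_sim / 2))
analog = lfilter(b, a, rng.standard_normal(n))       # band-limited "analog" noise
print("analog RMS:", np.sqrt(np.mean(analog**2)))

for decim in (10, 50, 200):                          # sample rate = fs_sim / decim
    raw = analog[::decim]                            # plain sampling: aliases fold in
    hires = analog.reshape(-1, decim).mean(axis=1)   # boxcar average = "high res"
    print(decim,
          np.sqrt(np.mean(raw**2)),                  # about the same as the analog RMS
          np.sqrt(np.mean(hires**2)))                # drops as the boxcar BW falls below f_analog

The plain-sampled RMS stays put because aliasing only folds the noise power around, it doesn't add or remove any; the boxcar actually throws bandwidth away, which is what "high res" mode does.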
To make things worse, DSOs calculate RMS over a fixed interval (the buffer memory, what's on the display, or between cursors), and a finite record won't necessarily be long enough to capture the lowest frequencies in the (analog) bandwidth.
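And a crude illustration of the record-length end of it, with a 0.1 Hz sine standing in for slow noise (again, numbers arbitrary):

import numpy as np

fs = 1000.0
t = np.arange(0, 100, 1 / fs)
slow = np.sin(2 * np.pi * 0.1 * t)        # slow component, 10 s period

def rms_ac(x):                            # mean removed, like an AC-coupled RMS
    x = x - x.mean()
    return np.sqrt(np.mean(x**2))

print("100 s record:", rms_ac(slow))              # ~0.71, sees the full swing
print("  1 s record:", rms_ac(slow[:int(fs)]))    # much smaller; period >> record

Anything slower than roughly 1/T, for a record of length T, just shows up as a near-constant offset that the mean subtraction removes.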
What's the correct answer?
My experimental observation: a given amplifier measures 1.5 mV RMS at pretty much any sample rate. If this is total noise (in the analog bandwidth), then for a gain of 10 and a BW of about 100 MHz (confirmed through other measurements), this gives input-referred e_n = 5.8 nV/rtHz. This suggests sample BW = analog BW, regardless of sample rate. At least at the high end.
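(Back-of-envelope relation, for reference: e_n = Vrms,out / (G * sqrt(BW)), taking BW as the effective noise bandwidth of the whole chain.)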
Tim