Error of % + digits?

I just bought an amp clamp meter, and it states the error is "+/- 1.9% + 3 digits". What does the "3 digits" part mean?

Reply to
Commander Kinsey

Answering my own question, I found this page. It means that as well as the percentage error, the last digit (e.g. the 2 in 147.2V) can vary by 3:

formatting link

Reply to
Commander Kinsey

If your meter should read, say, 1.875 A, the displayed reading could be anywhere from 1.872 to 1.878. This is a possible error introduced in the analog-to-digital conversion that produces the display. The +/-1.9% possible error applies to the measurement itself, including - but not only - any error made by the sensor.

To put it another way: if the actual current is 1.875 A, inaccuracies in the sensor and associated circuits may process it as anywhere within 1.875 A +/-1.9%. The analog-to-digital process may introduce a further error of +/- 3 counts in the least significant display digit. Therefore a current of 1.875 A may be displayed as anywhere from about 1.836 to 1.914 A.
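To make the arithmetic concrete, here's a rough Python sketch of how a "% of reading + N digits" spec combines. The 0.001 A value of the last digit and the function name are just assumptions for illustration:

# A "% of reading + N digits" spec combines a relative term and an
# absolute term. `resolution` is what the last displayed digit is worth.
def reading_band(true_value, pct_error, digits, resolution):
    pct = true_value * pct_error / 100.0   # percentage-of-reading part
    cnt = digits * resolution              # "counts" part
    return (true_value - pct - cnt, true_value + pct + cnt)

# 1.875 A on a range whose last digit is worth 0.001 A,
# with the +/-1.9% + 3 digit spec from the question:
lo, hi = reading_band(1.875, 1.9, 3, 0.001)
print(round(lo, 3), round(hi, 3))   # roughly 1.836 to 1.914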

Reply to
Pimpom

Thanks, I wonder why all my other meters only list a % error. Is it included within it somehow, or are they just lying, or do some meters not have this error?

Reply to
Commander Kinsey

Most people who use a digital meter should know the last digit is not accurate because of rounding error. Say it shows 1.5 volts. It could be 1.45 to 1.55 or so and still show 1.5. Some meters, such as the one under discussion, are less accurate and can be 3 numbers high or low on the last digit. That is why on digital meters you should try to use a range that shows as many digits as you can.

My several hundred dollar Fluke meter specifies DC volts at .05% +- 1 digit.
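To illustrate why the range matters, here's a quick sketch using the .05% + 1 digit figure above; the 200 V and 2 V ranges and their resolutions are assumptions for the example:

# The same 1.5 V measured on two ranges of a hypothetical .05% + 1 digit meter.
# On a 200 V range the last digit is worth 0.1 V; on a 2 V range, 0.001 V.
for last_digit in (0.1, 0.001):
    err = 1.5 * 0.0005 + 1 * last_digit   # % of reading + 1 count
    print(f"last digit = {last_digit} V -> uncertainty = +/-{err:.4f} V")
# The 1-count term dominates on the coarse range, which is why you pick
# the range that shows the most digits.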

Reply to
Ralph Mowery

Yes, but I was surprised to see up to 7 digits out on this one, depending on the range. I think DC amps is 3 or 5 depending on the range, and AC amps is 5 or 7.

The meters I have are not several hundred dollars, so are you saying they're only +/- 1 digit? Is the error much higher on the one under discussion because it's a clamp meter?

Reply to
Commander Kinsey

The larger error is because of the price difference. It costs more to make a part that is .01 % than it does to make one that is 2 %. The .01% parts may just be the 2 % ones that are hand sorted to .01%.

I am sure that the clamp part does play a role in how accurate the meter is.

Reply to
Ralph Mowery

But what I'm surprised at is a £5 multimeter (not clamp) not giving a digits error. Maybe precision on a simple voltmeter is cheap as chips nowadays?

Reply to
Commander Kinsey

You have to be careful how you throw precision and accuracy around.

A meter that shows 4 digits is more precise than one that shows only 3 digits; however, the 4-digit one may only be 1% accurate while the 3-digit one may be .5% accurate.

It is easy to get precision, but difficult to be accurate. Think of it as shooting a gun. Precision is how close the bullets land to each other, wherever they land on the target, but to be accurate the bullets have to land on the center of the target. For example, all the bullets could land very close to each other but not even hit the target.

As I mentioned, a good meter will not have a digits error outside the +- one digit due to rounding.

Reply to
Ralph Mowery

That didn't help. I interchange the two. I just want to know how close to the correct reading the readout is. Adding another digit doesn't improve anything if it's incorrect. And shooting all the bullets in one place doesn't help if they all miss.

Reply to
Commander Kinsey

Take pi as an example. It can be said that 3.14 is accurate as a three-digit value, but 3.1416 is more precise because it has a higher resolution.

OTOH, deriving it from 22/7 or 3.1429 has the same 5-digit resolution and is just as precise as far as the number it represents is concerned but is less accurate.

In this particular case, 3.1416 is both more precise and more accurate than 3.14 but that's not always the case with measurements.

My mechanical slide caliper has a resolution of 0.001 inch. This means that it can display measurements with a precision of 1 mil, but that doesn't guarantee that a measurement taken with it will be accurate to 1 mil. I may not always press the jaws snugly enough and the scale may not be perfectly accurate.
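For what it's worth, the pi comparison above can be checked in a couple of lines of Python (just a sketch of the idea):

import math

for approx in (3.14, 3.1416, 22 / 7):
    print(approx, "error =", abs(approx - math.pi))
# 3.1416 and 22/7 (~3.142857) are written with the same 5-digit resolution
# (precision), but 22/7 is far less accurate than 3.1416.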

Reply to
Pimpom

I'd need to contract OCD to understand that. There's only one thing in question here, how close is the reading to the correct value. You can't split that into two. 3.1416 is better than 3.14, and that's it. All you can state with a reading is it's correct to within a certain percentage.

Reply to
Commander Kinsey

Try this.

A doctor does a very complicated operation on your left arm like a joint replacement. It all goes very well. Very precise.

However he should have done the operation on the right arm that was causing trouble. Not accurate.

That is why a voltmeter can show 3 digits and be accurate with only the last digit in question by one number either way, while a 5-digit voltmeter can show many more numbers, but if it is not calibrated correctly the 2nd through 5th digits could be way off and the meter not accurate at all.

Reply to
Ralph Mowery

Nope, because the first one is 100% useless. I wouldn't call that precise at all, as he was out by half a metre.

Showing those extra two numbers is pointless if they're wrong. All that matters is how many volts difference between the actual voltage and what is shown.

Reply to
Commander Kinsey

You keep saying that it's only the accuracy that matters. That's true to some - and only some - extent.

Now let's compare two different hypothetical meters, both 100% accurate. Let's say that meter A has 3.5 digits (max count 1999) and meter B is 4.5 digits (19999). Use them to measure a battery cell of exactly 1.612345V.

Meter A will display 1.612V whereas meter B will show 1.6123V. Meter B allows you to evaluate the result to a higher degree of precision.

Further suppose that both meters are not perfectly accurate and read 1% low. A will show 1.596V while B will read 1.5962V. B is still more precise in showing you what it thinks the voltage is. An order of magnitude more precise, in fact, even though both meters are off by -1%.
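A small sketch of that comparison, modelling each display simply as rounding to the meter's resolution (an assumption for illustration, not how any particular meter works internally):

# Model a meter display as rounding the input to the display resolution.
def display(volts, resolution):
    return round(volts / resolution) * resolution

true_v = 1.612345
for name, res in (("A (3.5 digit)", 0.001), ("B (4.5 digit)", 0.0001)):
    exact = display(true_v, res)         # perfectly accurate meter
    low = display(true_v * 0.99, res)    # same meter reading 1% low
    print(name, f"{exact:g}", f"{low:g}")
# A (3.5 digit) 1.612 1.596
# B (4.5 digit) 1.6123 1.5962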

That's how the term 'precision' is used in engineering. Perhaps what's confusing you is the fact that the term is more loosely applied in everyday language.

As to the +/- 3 count (or 1 or whatever) possible error, it's an *uncertainty*, not a fixed inaccuracy, in digitizing an analog quantity. It will take too long to explain in detail here. Let me put it this way: if you measure the example voltage above multiple times with a meter with +/-3 count uncertainty, you may get a reading that varies from measurement to measurement by up to 6 points in the last digit. That's not a percentage inaccuracy.
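One way to picture that uncertainty is to simulate repeated readings of the same input (purely a toy model; treating the error as a uniform random offset of up to 3 counts is an assumption, not how a real ADC necessarily behaves):

import random

def read_once(volts, resolution, count_uncertainty=3):
    # quantize to counts, then add a random error of up to +/-3 counts
    counts = round(volts / resolution)
    counts += random.randint(-count_uncertainty, count_uncertainty)
    return counts * resolution

random.seed(1)
readings = [read_once(1.6123, 0.0001) for _ in range(10)]
print(sorted(set(f"{r:.4f}" for r in readings)))
# Successive readings of the same input can differ by up to 6 counts
# in the last digit, independent of any percentage error.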
Reply to
Pimpom

Sometimes it is precision.

I worked at a company making polyester from raw materials. In a room was a panel with about 10 temperature gauges. At a certain time all the gauges were marked and a sample of the material was sent to the lab. If it came back good, then the object was to keep all the gauges on the mark. It did not matter how far off the gauges were from the actual temperature. No matter how well we calibrated the gauges, there were several other factors that we had no control over, such as the thermocouples they were connected to. The specification was +- 3 deg C on the thermocouples from the factory. If the temperature varied more than 1 deg C at 300 deg C it could mess up the material.

So the object was precision and not accuracy.

Reply to
Ralph Mowery

On 2020-06-20 17:58, Commander Kinsey wrote: [...]

Engineers distinguish between accuracy, a measure of how close an observed value is to the true value, and resolution, which is a measure of the device's ability to resolve small changes. Either specification is useful in its own right, and professional instrumentation will always have both specs. So even if the last digit or two of a measuring device are not accurate, they may still be useful.

You may want to check audio ADCs and DACs for example, which have atrocious accuracy, but excellent resolution. An example of the opposite might be a voltage reference, which has excellent accuracy, but no resolution at all.

Of course, in general there is a tendency for accurate instruments to have better resolution too.

Jeroen Belleman

Reply to
Jeroen Belleman

I once worked for a company that made an instrument that measured cable attenuation to 0.001dB +- 0.1dB. The customers didn't care about the 0.1dB, since all they were interested in was the /stability/ of the 0.001dB and the ability to measure small changes.

Why? Because the instrument measured the attenuation change as a function of temperature, and each temperature cycle test took 7 days. Yes, it was a /large/ drum of undersea cable.

Reply to
Tom Gardner

Hi,

I'll try to explain that with a simplified model of a digital meter (please everybody correct me if it's oversimplified and wrong):

The typical digital meter consists of some kind of processing of the signal to be measured and an A/D-converter that converts its analog input signal to a number that is displayed.

The input processing serves to transform the quantity to be measured into an analog signal that is properly adapted to cover the range of possible input signals of the A/D-converter.

Let's say we want to measure an AC current of 10A with a clamp meter like yours. Let the display of the meter have 3 1/2 digits, so the range of displayable numbers goes from 0000 to 1999 with an additional decimal point somewhere.

The A/D-converter will not be able to directly convert a 10A current, so we pick up the current to be measured with a transformer, the wire carrying your 10A current being the primary and a coil internal to the clamp assembly being the secondary winding.

An AC current flowing through the wire will induce an AC voltage in the secondary winding. Since the A/D-converter may not directly accept AC voltages, further processing may be required, such as amplification or voltage division and e.g. True-RMS detection of the AC voltage. All this processing will result in a voltage that is suitable for the A/D-converter - say, 1V DC for 10A of AC current.

All the (analog) signal processing described here will not be free of unwanted influences and processing errors. The transformer at the input could e.g. pick up unwanted magnetic fields, the amplifier could exhibit noise and nonlinearities, the TRMS detection could exhibit some errors.

All these error sources or influences may be described in the meter's specification as a percentage - e.g. the +/- 1.9% you mentioned.
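The analog part of that chain could be sketched roughly like this (a toy model only; the 0.1 V per A scale factor follows from the 10 A -> 1 V example, and the function name and gain-error figure are just for illustration):

# Toy model of the analog front end: clamp transformer, scaling and
# TRMS detection lumped into one scale factor with a gain error.
def front_end(current_a, gain_error_pct=0.0):
    scale = 0.1                          # 0.1 V per A -> 10 A gives 1 V DC
    return current_a * scale * (1 + gain_error_pct / 100.0)

print(front_end(10.0))        # ideal: 1.0 V
print(front_end(10.0, 1.9))   # with a +1.9% gain error: 1.019 V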

Now, the A/D converter converts the analog input voltage into a number. One method to do this, when speed is not a critical factor, is (dual) slope integration.

Let's assume for a moment that the input voltage is static, i.e. the 1V DC mentioned before.

Basically, the conversion works by comparing the input voltage (to be measured) to a linearly rising voltage (ramp). Similar to a stopwatch, a counter starts when the reference voltage begins to rise and a comparator stops it when the ramp voltage is equal to the input voltage.

In our example with 1V input, the counter may stop at a count of 1000. By means of the input processing and the calibration of the meter, we know that this corresponds to a 10A AC current, so the meter would probably display 10.00 (A).

But: at some point in the process, the counter will switch from 999 to 1000 in a very short (almost zero) time. That means that the input voltage may be just a tiny little bit less and the counter is stopped at 999, not at 1000.

That means that for any input signal you always have +/-1 digit of display uncertainty, because you cannot know whether the counter was just about to switch to the next count.

With a specification of +/-3 digits, the A/D converter has a greater uncertainty when counting. For example, even at a constant input of 1V, the internal counter may be less precise and stop at 997, 998, 999, 1000, 1001, 1002 or 1003, even if the input signal doesn't change. You can think of this as a stopwatch that may be off some counts each time you make a measurement.

This type of error is not related to the input signal processing, so it is not very meaningful to express the error as a percentage of the measured value. It is usually expressed as a number of digits, because the error is mainly caused by the process of converting input signals to numbers.

Of course, I know that this very simple single slope integration is not used in meters, dual slope is the least you can do. Also, the A/D conversion may contribute to the percentage error spec. The (over)simplification is just a means to explain why there are two numbers in the specification.
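Here is a rough Python sketch of that single-slope idea, simplified in the same spirit as the description above (the counter jitter is modelled as a random offset, which is an assumption for illustration):

import random

def single_slope_adc(v_in_mv, step_mv=1, count_jitter=0):
    # Count how many ramp steps it takes for the ramp to reach the input,
    # then add a +/-count_jitter digit uncertainty to the result.
    ramp, counts = 0, 0
    while ramp < v_in_mv:                # the comparator stops the counter here
        ramp += step_mv
        counts += 1
    return counts + random.randint(-count_jitter, count_jitter)

random.seed(2)
# 1000 mV in, 1 mV ramp steps -> nominally 1000 counts, displayed as 10.00 (A)
print([single_slope_adc(1000, count_jitter=3) for _ in range(5)])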

Just my two cents,

Dieter

Reply to
Dieter Michel

What if your caliper had a resolution of 1 mil +/- 3 counts on the last digit? That's the issue with multimeters that have completely bogus digits at the end. Those numbers are just noise and serve no purpose at all. They don't even compare to all the bullets missing the target but landing in the same wrong spot.

Reply to
Cydrome Leader
