ACDC Express

With Multimeters, Both Accuracy and Precision Matter

While we might think that accurate and precise mean the same thing, in science and engineering there is an important difference. Let’s look at why you want your multimeter to be both precise and accurate, and why one doesn’t guarantee the other.

Many multimeters advertise that they are accurate and precise, but what does that really mean? In everyday usage, precise and accurate are used interchangeably. But if you work with electronics as an engineer or electrician, the distinction between the two is very important. If your measurements are accurate but not precise, you could end up damaging equipment! Let’s look at these two measures and why you want a multimeter that offers you both precision and accuracy.


The accuracy of a measurement is how close it comes to the true value. For multimeters, accuracy is expressed as a percentage deviation from the true value. If a multimeter is accurate to within 2%, then you know that the measurement is within 2% above or below the real value. Analogue multimeters were often accurate to within 3%. For many applications, this was enough. But for specialist equipment, such as automotive or medical instruments, much higher accuracy is required.

The accuracy of a multimeter can be written in several ways:

  • Accuracy = ±(ppm of reading + ppm of range)
  • Accuracy = (% Reading) + (% Range)
  • Accuracy = (% Reading) + Offset

Each manufacturer makes use of a slightly different format for expressing accuracy, which can make it difficult to compare instruments between different manufacturers.

For a quick example, a digital multimeter with an accuracy of ±2% gives a reading of 100.0V. The actual value could be anywhere from 98.0V to 102.0V. However, if a multimeter lists its accuracy as ±(2%+2), then the same 100.0V reading lies somewhere between 97.8V and 102.2V. This comes from the 2% of the reading (2.0V) plus an offset of 2 counts on the least significant digit (here, 2 × 0.1V = 0.2V).

Fluke’s handheld digital multimeters offer DC accuracy between 0.5% and 0.025%. This means that a reading of 100.0V could be 99.5V to 100.5V for the first type, or 99.975V to 100.025V for the most accurate multimeter!
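To make these specifications concrete, here is a small Python sketch. The helper name `accuracy_bounds` is our own, and the percentage and count values are taken from the examples above, not from any particular meter’s datasheet.

```python
def accuracy_bounds(reading, pct, counts=0, resolution=0.1):
    """Return (low, high) bounds for a reading under a
    +/-(% of reading + counts) accuracy specification.
    'resolution' is the value of one count, i.e. the step
    of the least significant digit on the current range."""
    uncertainty = reading * pct / 100 + counts * resolution
    return reading - uncertainty, reading + uncertainty

# A 100.0 V reading on a meter rated +/-2%:
print(accuracy_bounds(100.0, 2))            # (98.0, 102.0)

# The same reading on a meter rated +/-(2% + 2 counts),
# where one count on this range is 0.1 V:
print(accuracy_bounds(100.0, 2, counts=2))  # bounds: 97.8 V to 102.2 V
```

The same helper applied with `pct=0.025` reproduces the tightest Fluke figure quoted above: a 100.0V reading bounded by 99.975V and 100.025V.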



Precision is a description of how close the measurements are to each other. A precise tool will give repeated or reproducible measurements if conditions remain unchanged. If you measure the same thing five times with the exact same result, then you know your system is very precise. (However, it could be inaccurate!) 

One of the simplest ways to show accuracy versus precision is a rifle at a shooting range. In this example, the rifle is aimed at the bullseye and shot from the same position every time. If the bullet holes are tightly grouped but away from the bullseye, then the rifle (or maybe the shooter) is precise but inaccurate. If all the bullets hit the bullseye, then the rifle is both accurate and precise. If the bullet holes are roughly an equal distance from the bullseye but spread around the whole target, then the rifle is accurate on average, but not precise. If the bullet holes are scattered all over the target, the shooter is neither accurate nor precise, as the results can’t be replicated.
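The same distinction can be put into numbers: for repeated readings of a known value, accuracy is how far the average lands from the true value, while precision is how tightly the readings cluster. A minimal sketch, with sample readings invented purely for illustration:

```python
from statistics import mean, stdev

true_value = 100.0  # volts, known reference

# Five readings tightly clustered but offset from the truth:
precise_but_inaccurate = [103.1, 103.0, 103.2, 103.1, 103.0]
# Five readings centred on the truth but widely scattered:
accurate_but_imprecise = [97.0, 103.5, 99.0, 101.5, 99.0]

for name, readings in [("precise, inaccurate", precise_but_inaccurate),
                       ("accurate, imprecise", accurate_but_imprecise)]:
    error = abs(mean(readings) - true_value)  # accuracy: offset from truth
    spread = stdev(readings)                  # precision: cluster tightness
    print(f"{name}: mean error {error:.2f} V, spread {spread:.2f} V")
```

The first set has a small spread but a large mean error (precise, inaccurate); the second averages out to the true value but scatters widely (accurate, imprecise).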

Multimeters make use of some other interesting technical terminology in their testing. For example, the resolution expresses the smallest possible increment they can detect. If a multimeter has a resolution of 1mV, then it will be able to detect a 1mV change during a test. That would be one-thousandth of a volt. 

Multimeters also make use of a range. Older models could blow a fuse, or be damaged outright, if you tested something with the range set too low. Digital multimeters tend to offer an autorange function that finds the appropriate magnitude for the test. By using the lowest possible range setting without overloading the multimeter, you get better resolution and a more accurate result.
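Autoranging essentially means choosing the lowest range that can still hold the reading. A simplified sketch of that selection logic, using range steps typical of a handheld meter (the function name and values are ours, not any specific model’s):

```python
def pick_range(reading, ranges=(0.6, 6.0, 60.0, 600.0, 1000.0)):
    """Return the lowest range (in volts) that holds the reading
    without overloading; None corresponds to 'OL' on the display."""
    for r in ranges:          # ranges are sorted low to high
        if abs(reading) <= r:
            return r
    return None

print(pick_range(4.7))     # 6.0  -> best available resolution
print(pick_range(230.0))   # 600.0
print(pick_range(1500.0))  # None ("OL")
```

Picking the lowest viable range is what gives the resolution benefit described above: the smaller the range, the smaller each display count.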

A multimeter’s display has a specification called counts. The higher the count, the better the resolution. A 6000-count multimeter cannot display a value larger than 5999, which is important if the value being tested is close to that limit.
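Counts and resolution are linked: on a given range, the resolution is roughly the range divided by the count. A short sketch (the helper name is ours; 6000 counts matches the example above):

```python
def resolution(meter_range, counts=6000):
    """Smallest displayable step on a given range
    for a 'counts'-count display."""
    return meter_range / counts

print(resolution(6.0))    # 0.001 V, i.e. 1 mV on a 6 V range
print(resolution(600.0))  # 0.1 V on a 600 V range
```

This is why the lowest usable range matters: the same 6000-count display resolves 1mV on a 6V range but only 0.1V on a 600V range.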

Some multimeters offer a high-resolution mode. This can increase the counts, but the trade-off is a longer testing period, as you have to wait for the digits at the far right of the display to “settle”. This increases the time between readings, which might not be ideal when running multiple tests.

When simply looking for faults, such a high level of accuracy and precision isn’t required, as the errors will be evident even in approximate values. But where finding the actual value is important (e.g. medical equipment), this level of accuracy and precision can be life-saving, or prevent damage to expensive, delicate machinery.

Keeping it accurate (and precise!)

To ensure your multimeter is working optimally, it should be calibrated regularly. It is also worth noting that the various sensors inside a multimeter can be affected by environmental conditions as well as mishandling. Dropping or otherwise damaging a multimeter can also affect the readings you are given.

Be careful of storing or operating a multimeter in extreme temperatures or very high humidity. As with other electronic devices, high altitude can also affect your readings, as it puts the electrical components under strain. If you often operate in harsh conditions, the quality of your multimeter will quickly become apparent. For example, a cheaper multimeter might only be rated as accurate between 0°C and 40°C!

For accurate, precise measurements in the harshest of conditions, we recommend Fluke’s range of digital multimeters.
