**Q:** What is the difference between counts and digits? And what do they represent?

**A:** Multimeters and component testers often specify display resolution in counts or in digits. It is typical to see values such as 20000 counts or 4 1/2 digits in specifications, and some manufacturers quote both.

The following explains what counts and digits mean and how to convert between the two.

__Counts__

Counts are most often quoted for multimeters and component testers. Both instrument types use several measurement ranges to cover different reading values, and the count tells you the highest value the display can show before the instrument switches to the next range.

For example, suppose a multimeter has 50000 counts. The range changes when the display reaches 50000: the meter can show 49.999 V, but when it tries to display 50 V it switches to the next range and reads 050.00 V, or simply 50.00 V, instead. Moving to the next range generally costs one digit of resolution.

As another example, suppose your LCR meter has 2000 counts. Readings from 0 to 19.99 appear with two decimal places of resolution, so a component measuring 19.99 units is shown in full. But a reading of 20 is displayed as 20.0, losing one digit of resolution as the meter changes range. This is because the count is 2000: the moment the display would reach 2000, it moves to the next range.
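The count-limited behavior in both examples can be sketched in Python. This is a simplified, hypothetical `display` helper (not real meter firmware): it assumes the meter simply keeps shifting the decimal point rightward until the reading fits under the count limit.

```python
def display(value, counts=2000):
    """Return the string a `counts`-count meter would show for `value`.

    The meter tries its most sensitive range first (most decimal places)
    and drops one decimal place per range step until the reading fits
    below the count limit.
    """
    # A 2000-count meter has 4 digit positions, so at most 3 decimals.
    max_decimals = len(str(counts - 1)) - 1
    for d in range(max_decimals, -1, -1):
        if value * 10**d < counts:      # reading fits on this range
            return f"{value:.{d}f}"
    return "OL"                         # over limit on every range


print(display(49.999, 50000))   # 49.999 -- still fits under 50000 counts
print(display(50, 50000))       # 50.00  -- next range, one digit lost
print(display(19.99))           # 19.99  -- fits under 2000 counts
print(display(20))              # 20.0   -- next range, one digit lost
```

Real meters also clamp each range to fixed endpoints and show an overload indicator past the top range; the sketch only models the count limit itself.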

__Digits__

DMM specifications often describe resolution in digits; 4.5 digits, 5.5 digits, 6.5 digits, and higher are common. To read such a figure, split the whole number from the fraction.

For example, let's look at 4.5 digits, or 4 and 1/2 digits.

The "4" represents full digits, meaning digits that can be represented or shown from 0-9 on the display.

The "1/2", or "0.5" represent the first digit of the reading display and the maximum it can read. In this case, "1/2" digit means the first digit that can represent the digit 0-1.

Putting it together, a 4.5-digit display can show 00000 to 19999: the first digit shows 0 or 1, and the four full digits after it each show 0-9. Converted to counts, this is a 20000-count meter, because the display changes range once it would reach 20000.
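The half-digit arithmetic above can be written out as a one-line sketch (the function name is my own, for illustration): the leading 1/2 digit contributes 2 states and each full digit contributes 10.

```python
def half_digit_counts(full_digits):
    """Counts for an N-and-1/2-digit meter.

    The leading "1/2" digit shows 0 or 1 (2 states); each of the
    `full_digits` positions after it shows 0-9 (10 states each).
    """
    return 2 * 10**full_digits


print(half_digit_counts(4))   # 4.5 digits -> 20000 counts
print(half_digit_counts(5))   # 5.5 digits -> 200000 counts
```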

As another example, let's look at 3.75 digits, or 3 and 3/4 digits.

A 3/4 digit typically represents a leading digit that can read from 0 to 3, and in some cases 0 to 5. Following the same idea as the previous example, the 3 whole digits each display 0-9, and the 3/4 is the leftmost digit. Putting this together, a 3.75-digit display can show 0000 to 3999 before the range changes, so an instrument with this resolution has 4000 counts.

Some manufacturers instead use 3/4 for a leading digit that can display 0 to 5. In that case, 3.75 digits corresponds to a display showing 0000 to 5999, which converts to 6000 counts.
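Both 3/4-digit conventions follow the same pattern, so a single sketch covers them (a hypothetical helper; the `leading_max` parameter encodes which convention a given manufacturer uses):

```python
def three_quarter_counts(full_digits, leading_max=3):
    """Counts for an N-and-3/4-digit meter.

    The leading "3/4" digit shows 0..leading_max (some manufacturers use
    3, others 5), followed by `full_digits` full 0-9 digits.
    """
    return (leading_max + 1) * 10**full_digits


print(three_quarter_counts(3))                 # 3.75 digits -> 4000 counts
print(three_quarter_counts(3, leading_max=5))  # alternate convention -> 6000 counts
```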

Because "3/4" is used in two different ways, it is important to read the specifications carefully, especially the instrument's ranges. Some manufacturers do not specify counts at all, so the ranges are the best way to determine how much display resolution you can obtain when making measurements. For example, if a multimeter's DC voltage ranges are 2 V, 20 V, 200 V, 1000 V, the meter most likely has a 1/2 digit; if they are 5 V, 50 V, 500 V, 1000 V, it most likely has a 3/4 digit.
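When a datasheet lists each range together with its resolution but omits the count, the count can be inferred by dividing a range's full scale by its smallest step. This is a sketch under that assumption, and the function name is hypothetical:

```python
def counts_from_range(range_top, resolution):
    """Infer display counts from a range's full scale and smallest step.

    Assumes the datasheet lists both values for the same range, e.g. a
    50 V range with 1 mV resolution implies a 50000-count display.
    Rounds to absorb floating-point error in the division.
    """
    return round(range_top / resolution)


print(counts_from_range(50, 0.001))   # 50 V range, 1 mV steps -> 50000 counts
print(counts_from_range(2, 0.001))    # 2 V range, 1 mV steps  -> 2000 counts
```

Note that some manufacturers label a range by a round value slightly below its true full scale (a "5 V" range on a 6000-count meter actually reads to 5.999 V), so treat the result as a hint rather than an exact figure.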

Below is a table of common display resolution conversions between digits and counts:

| Digits | Counts |
|---|---|
| 4 1/2 (4.5) | 20000 |
| 4 3/4 (4.75) | 40000 (or 50000) |
| 5 1/2 (5.5) | 200000 |
| 6 1/2 (6.5) | 2000000 |